User:Cmurphy

Revision as of 20:47, 7 November 2014


Chris Murphy

Chris is an Associate Professor of Practice at the University of Pennsylvania.

He is the director of the Master of Computer and Information Technology program, and teaches graduate and undergraduate software engineering courses. He also oversees Penn's participation in the Facebook Open Academy Program, an academic initiative sponsored by Facebook in which students contribute to open-source projects under the advisement of a professional mentor.

Chris earned a PhD from Columbia University in 2010, and his research focuses on software testing and computer science education.

Stage 1 Activities

Part A: Intro to IRC, Part 1

  • How do people interact? briefly, but politely, and usually offering to help out or at least make helpful suggestions
  • What is the pattern of communication? generally focused on a particular issue: someone raises it and the others try to help out
  • Are there any terms that seem to have special meaning? technical terms, of course, but also the commands to MeetBot
  • Can you make any other observations? amber does not seem to be a big fan of capitalization and punctuation :-p

Part A: Intro to IRC, Part 3

  • I observed the #openMRS channel on Tues Oct 14.
  • There was a "global notice" that went to all freenode users; I didn't realize it at first, as I thought it was just for the channel users, but I see the difference now.
  • There was a conversation between two users in which one was attempting to help the other get the code downloaded and installed.
  • The OpenMRSBot occasionally sent messages based on activities in Jira
  • At 10am EDT, a user started the daily SCRUM meeting. Users were asked to give updates in order; however, two of the three were absent
  • I noticed that most of the conversations were between two people, in which they included the other person's nickname at the start of their entry

Part A: Project Anatomy

Sugar Labs

Community
  • Activity Team: develops and maintains many of the activities; there are 2 coordinators and 13 contributors
  • Development Team: builds and maintains the core Sugar environment; there is no coordinator and 4 "people" listed; there is no overlap with the Activity Team
  • Documentation Team: provides the Sugar community with high-quality documentation; no coordinator or contributors are listed
Tracker
  • types: defects and enhancements
  • info available for each ticket: ID#, reported by, owned by, priority, milestone, component, version, severity, keywords, CC, distribution/OS, status, description, attachments, change history
Repository

it seems to be a local repository

Release Cycle

The roadmap is updated at the beginning of each release cycle.

Sahana Eden

Community
  • Developers: people who develop the software; names and roles do not seem to be defined, unlike Sugar Labs; rather, there seems to be more "how to get started" info here
  • Testers: non-technical users who do QA through manual testing; there are links for documenting test cases, as well as links for developers
  • Designers: people who work on the user interface

Tracker

  • types: defect/bug, documentation, enhancement, task
  • info available for each ticket: ID#, reported by, owned by, priority, milestone, component, version, keywords, CC, due date, launchpad bug, description, attachments, change history
  • this is different from Sugar Labs because the tickets are organized into reports, rather than just presenting one large list

Repository

Since this is hosted on GitHub, it is a shared/web repository.

Release Cycle

The roadmap/milestones seem to be based on the completion of features rather than a specific date-driven release cycle.

Part B: Project Evaluation Activity

File:Mifos Evaluation Template.xlsx

Part B: FOSS in Courses Planning 1

Step 3.

For my undergraduate software engineering course, the motivation is to give students experience working with a large code base, and to get them thinking about the design of software. So I am interested in having the students add functionality to the project (along with corresponding test cases) but also to think about how they’re identifying components, the relationship between those components, etc.

For my graduate software engineering course, the emphasis is on “what is good code?” so we spend a lot of time reviewing code and figuring out ways to improve it. So the HFOSS-related activities would include conducting code inspections and then refactoring code to improve its design and internal quality, as well as bug fixing and regression testing.

Step 4.

For the undergraduate course, the activities would be based on the proposed CS2 assignments from the “50 ways to be a FOSSer” blog. I would have the students look at the existing code and document the design using UML, and also ask them to identify the usage (or attempted usage) of the design patterns we study in class. Then I would have them propose a feature, design the components using appropriate patterns, and then implement and test the feature.

For the graduate course, I would use some combination of the “Quality & Testing” and “Coding & Style” activities from the same blog. Students would start out by choosing one of the open defects, writing a test case that demonstrates that bug, and then fixing the bug. I would also ask them to create additional test cases for that component using test set adequacy metrics and then fix any other bugs they reveal. Then, once they’re comfortable with the expected functionality, they would conduct a code inspection, document “code smells”, and then refactor the code to improve its quality.
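A minimal sketch of that "write a failing test, then fix the bug" workflow; the function and the bug here are entirely hypothetical, not taken from any HFOSS project:

```python
# Hypothetical example of a regression test written to demonstrate a bug,
# followed by the fixed implementation. Nothing here comes from a real
# HFOSS code base.

def normalize_name(name):
    """Collapse runs of whitespace and strip the ends.

    (Hypothetical) original bug: internal spaces were deleted entirely,
    so "Ada   Lovelace" became "AdaLovelace". This is the fixed version.
    """
    return " ".join(name.split())

def test_internal_spaces_collapsed():
    # Written first against the buggy version, where it failed;
    # kept afterward to guard against regressions.
    assert normalize_name("  Ada   Lovelace ") == "Ada Lovelace"

test_internal_spaces_collapsed()
```

The test doubles as documentation of the expected behavior, which is exactly the "get comfortable with the expected functionality before refactoring" step above.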

Part C: Bug Tracker Activity

Part 1. Bug Reports

1. Define what each of the column names below indicate. Include the range of possible values for 2-7 below. Feel free to explore beyond the page to find more information.

1. ID: a unique ID for each bug

2. Sev: the severity of the bug; normal, minor, major, critical, enhancement, blocker, trivial

3. Pri: priority; low, normal, high, urgent

4. OS: operating system; all, Linux, open, Windows, Solaris, Mac, other

5. Product: which product the bug affects; lots of possible values

6. Status: current status of the bug; unconfirmed, new, assigned, reopened, need info

7. Resolution: no possible values listed; maybe a link to a description of how it was resolved?

8. Summary: summary of the bug
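The fields above can be modeled as a simple record. This is just an illustrative sketch using the value ranges listed above, not Bugzilla's actual schema, and the sample bug's product name is a guess:

```python
from dataclasses import dataclass

# Value ranges copied from the answers above (GNOME Bugzilla circa 2014).
SEVERITIES = {"blocker", "critical", "major", "normal", "minor", "trivial", "enhancement"}
PRIORITIES = {"low", "normal", "high", "urgent"}
STATUSES = {"unconfirmed", "new", "assigned", "reopened", "need info"}

@dataclass
class BugReport:
    id: int
    severity: str
    priority: str
    os: str
    product: str
    status: str
    resolution: str
    summary: str

    def __post_init__(self):
        # Reject values outside the ranges observed in the tracker.
        if self.severity not in SEVERITIES:
            raise ValueError(f"unknown severity: {self.severity}")
        if self.priority not in PRIORITIES:
            raise ValueError(f"unknown priority: {self.priority}")
        if self.status not in STATUSES:
            raise ValueError(f"unknown status: {self.status}")

# Example loosely modeled on bug 561837 (the product name is a guess):
bug = BugReport(561837, "normal", "normal", "Linux", "at-spi",
                "assigned", "", "sample summary")
```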

2. Describe how you discovered the definitions and how you found the information above (hint: the advanced search shows the options, or the Reports link has a link)? -- I sorted each category to see the possible options, and then opened up a bug report and read the longer description to see what was there

3. Identify the order in which the bugs are initially displayed. -- seems like they're sorted by status

4. What is the meaning of the shading of some bug reports? -- hmmm… I can't tell; seems like every other entry is shaded, unless it's an enhancement

5. What is the meaning of the colors used when describing a bug (red, gray, black)? -- red is for blocker or critical severity; gray is for enhancements

6. Select a bug that you think that you might be able to fix and look at it more closely (click on the bug number). -- I chose 561837

1. Identify when the bug was submitted. -- 2008-11-21

2. Identify if there has been recent discussion about the bug? -- not since 2013-08-14

3. Is the bug current? -- doesn't seem like it

4. Is the bug assigned? To whom? -- to At-spi maintainer(s)

5. Describe what you would need to do to fix the bug. -- it looks like someone created a patch but no one confirmed it, so I guess I'd need to know whether the patch is okay

7. Repeat the previous step with a different kind of bug. -- I chose 437375

1. submitted 2007-05-10

2. no recent discussion since 2011-06-23

3. doesn't seem current

4. assigned to ATK maintainer(s)

5. this bug describes an ambiguity in the doc; I'd need to make sure that the description is unambiguous using the terminology expected by the intended audience

Part 2. Collective Reports

2. How many bug reports were opened in the last week? How many were closed? -- 325 reports opened and 386 reports closed.

3. What was the general trend last week? Were more bugs opened than closed or vice versa? -- more closed than opened

4. Who were the top three bug closers? Why is this important to know? -- Jean-François Fortin Tam, Bastien Nocera, Matthias Clasen -- perhaps they are the ones who know the code best

5. Who were the top three bug reporters? Are these the same as the top three bug closers? What is the overlap in these two lists? -- Jo, Jean-François Fortin Tam, Michael Catanzaro -- not too much overlap except for Jean-François

6. Who are the top three contributors of patches? -- Ray Strode [halfline], Bastien Nocera, Marcos Chavarria Teijeiro

7. Who are the top three reviewers of patches? What is the overlap between these lists and the bug closers and bug reporters? What is the overlap between patch contributors and patch reviewers? -- Sebastian Dröge (slomo), Florian Müllner, Jonas Danielsson -- there is a little overlap between patch contributors and bug closers, which makes sense because once you patch it, you can mark it closed; not too much overlap between reviewers and contributors, which is to be expected, since the people who contribute a patch should not be reviewing their own work
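The overlap observations above can be double-checked mechanically with set intersections (names copied from the answers above, with the trailing IRC nicks dropped):

```python
closers = {"Jean-François Fortin Tam", "Bastien Nocera", "Matthias Clasen"}
reporters = {"Jo", "Jean-François Fortin Tam", "Michael Catanzaro"}
contributors = {"Ray Strode", "Bastien Nocera", "Marcos Chavarria Teijeiro"}
reviewers = {"Sebastian Dröge", "Florian Müllner", "Jonas Danielsson"}

# Closers vs. reporters: only Jean-François appears in both.
print(closers & reporters)       # {'Jean-François Fortin Tam'}

# Contributors vs. closers: patch authors often close their own bugs.
print(contributors & closers)    # {'Bastien Nocera'}

# Reviewers vs. contributors: empty, as expected -- you shouldn't
# be reviewing your own patch.
print(reviewers & contributors)  # set()
```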

10. What other reports can you generate? -- tons!
