User:Cmurphy

Chris Murphy

I am an Associate Professor of Practice in the Dept. of Computer & Information Science at the University of Pennsylvania: http://www.seas.upenn.edu/~cdmurphy

I first attended POSSE in November 2014 and have since attended three more, as well as two POSSE Roundup pre-symposium events at SIGCSE.

My current interests include online education, student contributions to open source software projects, and how these affect diversity and inclusion within CS.

Prior to joining Penn in 2010, I completed a PhD in Computer Science at Columbia University, where my research focused on software testing. Before that, I worked as a professional software developer in Boston, San Francisco, and London after earning a BS in Computer Engineering from Boston University.

Somewhere along the way, I also spent two years teaching English in Seoul, but that's not really part of the narrative hahaha...

I have occasionally taught a standalone course on open-source software development at Penn, though unfortunately I have not been able to teach it since Fall 2016.

I have a few publications and presentations regarding open source software, including:

  • "Bridging the Diversity Gap in Computer Science with a Course on Open Source Software", J. Weng and C. Murphy, In Proc of the 3rd Annual IEEE STCBP Conference on Research on Equity & Sustained Participation in Engineering, Computing, and Technology (RESPECT), Baltimore MD, Feb 2018.
  • "Addressing Diversity & Inclusion Issues in Computer Science through Contributions to Free and Open Source Software" (Birds of a Feather session with J. Weng, N. Veilleux, and J. Pearce), 2017 ACM Richard Tapia Celebration of Diversity in Computing, Atlanta GA, Sept 21, 2017.
  • "Community Engagement with Free and Open Source Software" (panel moderator), 48th ACM SIGCSE Technical Symposium on Computer Science Education, Seattle WA, Mar 9, 2017.

You can find out more in my CV and on my LinkedIn page!

POSSE 11-2014 Stage 1 Activities

Part A: Intro to IRC, Part 1

  • How do people interact? briefly, but politely, and usually offering to help out or at least make helpful suggestions
  • What is the pattern of communication? generally focused on a particular issue: someone raises it and the others try to help out
  • Are there any terms that seem to have special meaning? technical terms, of course, but also the commands to MeetBot
  • Can you make any other observations? amber does not seem to be a big fan of capitalization and punctuation :-p

Part A: Intro to IRC, Part 3

  • I observed the #openMRS channel on Tues Oct 14.
  • There was a "global notice" that went to all freenode users; I didn't realize it at first, as I thought it was just for the channel users, but I see the difference now.
  • There was a conversation between two users in which one was attempting to help the other get the code downloaded and installed.
  • The OpenMRSBot occasionally sent messages based on activities in Jira.
  • At 10am EDT, a user started the daily SCRUM meeting. Users were asked to give updates in order; however, two of the three were absent.
  • I noticed that most of the conversations were between two people, who included the other person's nickname at the start of each entry.

Part A: Project Anatomy
Sugar Labs
Community

  • Activity Team: develops and maintains many of the activities; there are 2 coordinators and 13 contributors
  • Development Team: builds and maintains the core Sugar environment; there is no coordinator and 4 "people" listed; there is no overlap with the Activity Team
  • Documentation Team: provides the Sugar community with high-quality documentation; no coordinator or contributors are listed

Tracker

  • types: defects and enhancements
  • info available for each ticket: ID#, reported by, owned by, priority, milestone, component, version, severity, keywords, CC, distribution/OS, status, description, attachments, change history

Repository
It seems to be a local repository.
Release Cycle
The roadmap is updated at the beginning of each release cycle.

Sahana Eden
Community

  • Developers: people who develop the software; names and roles do not seem to be defined, unlike Sugar Labs; rather, there seems to be more "how to get started" info here
  • Testers: non-technical users who do QA through manual testing; there are links for documenting test cases, as well as links for developers
  • Designers: people who work on the user interface

Tracker

  • types: defect/bug, documentation, enhancement, task
  • info available for each ticket: ID#, reported by, owned by, priority, milestone, component, version, keywords, CC, due date, launchpad bug, description, attachments, change history
  • this is different from Sugar Labs because the tickets are organized into reports, rather than just presenting one large list

Repository
Since this is hosted on GitHub, it is a shared/web repository.
Release Cycle
The roadmap/milestones seem to be based on the completion of features rather than a specific date-driven release cycle.

Part B: Project Evaluation Activity
File:Mifos Evaluation Template.xlsx

Part B: FOSS in Courses Planning 1
Step 3.
For my undergraduate software engineering course, the motivation is to give students experience working with a large code base and to get them thinking about the design of software. So I am interested in having the students add functionality to the project (along with corresponding test cases), but also in having them think about how they’re identifying components, the relationships between those components, etc.

For my graduate software engineering course, the emphasis is on “what is good code?” so we spend a lot of time reviewing code and figuring out ways to improve it. So the HFOSS-related activities would include conducting code inspections and then refactoring code to improve its design and internal quality, as well as bug fixing and regression testing.

Step 4.
For the undergraduate course, the activities would be based on the proposed CS2 assignments from the “50 ways to be a FOSSer” blog. I would have the students look at the existing code and document the design using UML, and also ask them to identify the usage (or attempted usage) of the design patterns we study in class. Then I would have them propose a feature, design the components using appropriate patterns, and then implement and test the feature.
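
As a rough illustration of the kind of artifact I would expect from the pattern-and-feature step, the sketch below shows a proposed feature designed around the Strategy pattern. It is not drawn from any particular HFOSS project; the Report, ExportFormat, and CsvExport names are invented for the example.

    // Hypothetical sketch only: a proposed "export report" feature designed
    // around the Strategy pattern. None of these types come from a real project.
    import java.util.List;

    interface ExportFormat {
        // Each concrete strategy renders the same report in a different format.
        String render(Report report);
    }

    class CsvExport implements ExportFormat {
        @Override
        public String render(Report report) {
            // A real implementation would also handle quoting/escaping.
            return String.join(",", report.getRows());
        }
    }

    class Report {
        private final List<String> rows;
        Report(List<String> rows) { this.rows = rows; }
        List<String> getRows() { return rows; }
    }

    class ReportExporter {
        private final ExportFormat format;   // the chosen strategy is injected

        ReportExporter(ExportFormat format) {
            this.format = format;
        }

        String export(Report report) {
            return format.render(report);    // delegate to the strategy
        }
    }

In the corresponding UML, students could show ReportExporter holding a reference to the ExportFormat interface, with CsvExport (and any later formats) as interchangeable implementations.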

For the graduate course, I would use some combination of the “Quality & Testing” and “Coding & Style” activities from the same blog. Students would start out by choosing one of the open defects, writing a test case that demonstrates that bug, and then fixing the bug. I would also ask them to create additional test cases for that component using test set adequacy metrics and then fix any other bugs they reveal. Then, once they’re comfortable with the expected functionality, they would conduct a code inspection, document “code smells”, and then refactor the code to improve its quality.
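
To make the "write a test that demonstrates the bug, then fix it" expectation concrete, a minimal sketch follows; the BalanceCalculator class, its defect, and the JUnit 5 test are all hypothetical and only illustrate the shape of the deliverable.

    // Hypothetical sketch only: the class under test and its defect are invented.
    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.api.Test;

    class BalanceCalculator {
        // Fixed version; the buggy version skipped negative amounts entirely.
        int applyTransactions(int startingBalance, int[] amounts) {
            int balance = startingBalance;
            for (int amount : amounts) {
                balance += amount;
            }
            return balance;
        }
    }

    class BalanceCalculatorRegressionTest {
        // Written first, against the unfixed code, so that it fails and
        // demonstrates the defect; it then passes once the fix is applied.
        @Test
        void negativeAmountsReduceTheBalance() {
            BalanceCalculator calc = new BalanceCalculator();
            assertEquals(50, calc.applyTransactions(100, new int[] {25, -75}));
        }
    }

A similar pairing (a characterization test plus a before/after comparison of the refactored code) could document the code-smell and refactoring half of the assignment.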

Part C: Bug Tracker Activity
Part 1. Bug Reports
1. Define what each of the column names below indicates. Include the range of possible values for 2-7 below. Feel free to explore beyond the page to find more information.

1. ID: a unique ID for each bug

2. Sev: the severity of the bug; normal, minor, major, critical, enhancement, blocker, trivial

3. Pri: priority; low, normal, high, urgent

4. OS: operating system; all, Linux, open, Windows, Solaris, Mac, other

5. Product: which product the bug affects; lots of possible values

6. Status: current status of the bug; unconfirmed, new, assigned, reopened, need info

7. Resolution: no possible values listed; maybe a link to a description of how it was resolved?

8. Summary: summary of the bug

2. Describe how you discovered the definitions and how you found the information above (hint: the advanced search shows the options or the Reports link has a link)? I sorted each category to see the possible options, and then opened up a bug report and read the longer description to see what was there.

3. Identify the order in which the bugs are initially displayed? seems like they're sorted by status

4. What is the meaning of the shading of some bug reports? hmmm… I can't tell; seems like every other entry is shaded, unless it's an enhancement

5. What is the meaning of the colors used when describing a bug (red, gray, black)? red is for blocker or critical status; gray is for enhancements

6. Select a bug that you think that you might be able to fix and look at it more closely (click on the bug number). -- I chose 561837

1. Identify when the bug was submitted. -- 2008-11-21

2. Identify if there has been recent discussion about the bug? -- not since 2013-08-14

3. Is the bug current? -- doesn't seem like it

4. Is the bug assigned? To whom? -- to At-spi maintainer(s)

5. Describe what you would need to do to fix the bug. -- it looks like someone created a patch but no one confirmed it, so I guess I'd need to know whether the patch is okay

7. Repeat the previous step with a different kind of bug. -- I chose 437375

1. submitted 2007-05-10

2. no recent discussion since 2011-06-23

3. doesn't seem current

4. assigned to ATK maintainer(s)

5. this bug describes an ambiguity in the doc; I'd need to make sure that the description is unambiguous using the terminology expected by the intended audience

Part 2. Collective Reports

2. How many bug reports were opened in the last week? How many were closed? -- 325 reports opened and 386 reports closed.

3. What was the general trend last week? Were more bugs opened than closed or vice versa? -- more closed than opened

4. Who were the top three bug closers? Why is this important to know? -- Jean-François Fortin Tam, Bastien Nocera, Matthias Clasen -- perhaps they are the ones who know the code best

5. Who were the top three bug reporters? Are these the same as the top three bug closers? What is the overlap in these two lists? -- Jo, Jean-François Fortin Tam, Michael Catanzaro -- not too much overlap except for Jean-François

6. Who are the top three contributors of patches? -- Ray Strode [halfline], Bastien Nocera, Marcos Chavarria Teijeiro

7. Who are the top three reviewers of patches? What is the overlap between these lists and the bug closers and bug reporters? What is the overlap between patch contributors and patch reviewers? -- Sebastian Dröge (slomo), Florian Müllner, Jonas Danielsson -- there is a little overlap between patch contributors and bug closers, which makes sense because once you patch it, you can mark it closed; not too much overlap between reviewers and contributors, which is to be expected, since the people who contribute a patch should not be reviewing their own work

10. What other reports can you generate? -- tons!


Part C: FOSS in Courses Planning 2
Part 1.
For both my undergraduate and graduate courses, I imagine that the FOSS activities would consist of two homework assignments: one in which the students evaluate the existing code, and then another in which they add to the code base or somehow improve the code.

I can imagine having a lecture around the benefits of FOSS for the undergrads, but I wouldn't think it's necessary for the homework assignments, which are simply focused on "existing code".

As for a project, we typically do customer-focused apps for our local community, and for undergrads the focus is on developing an app from scratch, but if a group of graduate students was particularly interested in contributing to a FOSS project, we could certainly explore that option.

Part 2.
Learning outcomes: For the undergrads, to be able to document the design (e.g. using UML) of existing code, to convert a set of requirements into a design, and to implement and test the code. For the graduate students, to be able to adequately test existing code, to debug code, and to refactor code.

Pre-requisite knowledge: Aside from the programming language of choice, I don't think there's much in the way of pre-reqs. Part of the challenge, especially for the graduates, would be to figure out the intent of the code as they are working with it.

Time estimates: It may take quite a while (~20 hours) to identify a project, figure out parts of the code to work with, identify the work that can realistically be done, and then write up the assignment and put together a grading rubric. For the undergrads, if we want to contribute new features to the project, we'd definitely need to coordinate that with the community, particularly as we have 130+ students in that class.

Input required from the community: If we want to add new features, we will likely need to have those approved/vetted. For things like bug fixing and refactoring, though, I suspect the primary input will be coding conventions.

Assessment: For the undergrads, the documentation of the design of the code can be an individual assignment. If they are to design and implement new features, though, that would probably be in groups, primarily to address issues of scale. I wouldn't want to associate getting the work committed with the grade, though, as that is somewhat out of our control. For the graduates, the assignments (fixing bugs, writing tests, refactoring) could be done in pairs, and I might be more inclined to require that bug fixes or refactoring be committed, since presumably someone in the community would be able to assess their work as "good enough".

Questions/concerns: The more I think about this, the more concerned I get about issues of scale. My undergrad class has over 130 students; the graduate class has around 80. Documenting the design of the existing code is something that scales, since I wouldn't expect that we'd actually contribute that back to the project. But for the undergrads, even if they work in groups of four, what sort of effort will it take on my part to coordinate 30-something new contributions to existing projects? Will the groups all work on distinct features? How would they be graded? It wouldn't really make sense for multiple groups to work on the exact same feature, since only one implementation would be committed. Likewise, for the graduates, do the 80 students work on fixing 80 different bugs? And then refactor 80 different pieces of code? Again, it wouldn't make sense to have them all refactoring the same piece of code.
