User:Tom.naps

Tom Naps

For over 20 years Tom Naps was the primary Computer Science instructor at a small liberal arts college (Lawrence University in Appleton WI) where the Computer Science concentration was integrated into a Mathematics major. Currently he is completing his 14th year in a seven-member Computer Science Department at the University of Wisconsin Oshkosh.

Tom is an active contributor to two large open-source projects in algorithm visualization. The first of these, JHAVE, delivers instructional visualizations on algorithms that span the entire curriculum. It relies on a client-server architecture, with both the client and server written in Java. Since 2011 he has been actively involved in the OpenDSA (Open Data Structures and Algorithms) project hosted at Virginia Tech. The client software for OpenDSA is developed in JavaScript, and that is where Tom has focused his efforts. The server back-end is currently written in Python using the Django framework.

At UW Oshkosh, Tom has taught most of the courses in the curriculum. In particular he often teaches a project-oriented Software Engineering II course, and it is in that course that he hopes to incorporate HFOSS.

Part A Activities

Intro IRC Activity, Answers to Part 1 questions

  • How do people interact? Conversationally, using short phrases, with the emphasis on communicating the essential meaning of what they intend to say rather than on grammatical correctness
  • What is the pattern of communication? At least in the sample meeting, I would characterize the pattern as an initial session in which the participants reported on progress made and problems encountered since the last meeting. Each reported item was followed by a brief discussion. This initial session was followed by a wrap-up session in which goals were established to hopefully be reached before the next meeting.
  • Are there any terms that seem to have special meaning? The meetbot tags (indicated by # and the green font) help to organize the way one reads the transcript of the meeting. In a sense they break down the meeting into a hierarchical structure -- with #topics broken down into #info and #action items.
  • Can you make any other observations? I would think that the transcript of the meeting is most useful if one of the participants uses it to write up a set of minutes for the meeting as soon as possible after the meeting concludes. That set of minutes then becomes the official record of the meeting, providing a much more readable form than the transcript itself.

Intro to IRC Activity Part 3 – Join and Observe Channel Discussion

I joined and observed the discussion at the #OpenMRS IRC chat. Perhaps not unexpectedly, this was much more freeform than the organized meeting, the logs of which we read and wrote about in the answers to the Part 1 questions. There was no organization of topics using the #topic tag that we had seen in the logs of the session from Part 1. Instead many of the posts to the OpenMRS chat were just (apparently automated) notices from "OpenMRSBot" that a member of the project had posted an issue/update at the project's repository. So in that sense, listening in on the #OpenMRS channel would help to keep me posted on the most recent activity at the repository. Other entries at the #OpenMRS IRC chat seemed to be questions from various participants of the form "How do I ...?" There were many more questions posed than answers given, although in some cases respondents to the question would provide a link the questioner could chase down to at least find resources that might help.

Project Anatomy Activity - Sugar Labs

  • Community - Follow the 'Contacts' link (found in the green option bar) for each of the following teams and summarize the information you find there. For example, are there any commonalities? Is there something distinct for each type of team?
  1. Activity Team - This team is responsible for developing the learning activities for students (that is, the things that students do to foster learning), which then run on the Sugar platform.
  2. Development Team - This team builds the software for the infrastructure on which the activities (see previous team) run.
  3. Documentation Team - This team is responsible for developing all the documentation for the Sugar project. This includes documentation for both using and developing activities as well as the API documentation for the infrastructure platform. Hence various members of the documentation team must understand what the Activity team and Development team do without necessarily being aware of the lower level details of how it is done.

In very broad terms, the Development team focuses on coding, the Activity team focuses on authoring educational materials that use the platform, and the Documentation team must provide manuals that help new contributors plug their efforts into the other teams.

  • Tracker - Indicate the types/categories of tickets listed on this page as well as the information available for each ticket.

Each ticket includes a Summary, the Status of the ticket (new, assigned, re-opened), the Owner of the ticket, the Type (defect or enhancement), the Priority (urgent, high, normal), and a Milestone (most of which show as "Unspecified").

  • Repository -- http://git.sugarlabs.org/sugar-base Can you determine from the information provided here whether the project uses a web-based common repository or a local repo?

There's an indication at the bottom of the repository page that it is powered by "Gitorious" at gitorious.org. However, following the link to http://gitorious.org merely leads to a page indicating that all Gitorious repositories are being migrated to http://archive.org. That Sugar is using git (a distributed version control system) is an indication that each user maintains a local copy of the repository, with changes ideally pushed to the master repository, which would live at archive.org (wherever that is). However, the status of the master repository seems somewhat dubious, since the phrasing at gitorious.org indicated only that "The repositories (being migrated) will soon be available for read-only access". The last push to this master repository seemed to take place in March 2014. Right now it seems a bit unclear how one would acquire the Sugar base to work from.

  • Release cycle -- Information about Sugar's release cycle and roadmap can be found here. Include an entry on your wiki page that describes how the release cycle and roadmap update are related.

Each release cycle apparently includes four sub-releases -- the development, beta, release candidate and final releases.

The Development Team's Roadmap is updated at the beginning of each release cycle by the release team. It includes

  1. Detailed schedule of release dates and freeze points.
  2. List of modules and external dependencies.
  3. Reference to all the tickets considered for the release.
  4. References to the new feature proposals.

However, following the link to the Roadmap at the Sugar Labs wiki presently leads to an empty page, apparently indicating that development on the project has gone into a bit of a dormant state.

Project Anatomy Activity - Sahana

  • Community -- Follow the links to each of the groups listed below and summarize the information you find there on your faculty wiki page. For example, are there any commonalities? Is there something distinct for each type of contributor? How is this structure different than the one you found on the Sugar Labs website?
  1. Developers -- This can be in the form of contributing code, but also in becoming involved in what the project calls "blueprints". These blueprints apparently comprise the software design, as opposed to what the "designers" do in the project (described below).
  2. Testers -- The project describes testers as "non-technical users" who want to contribute to the project by doing quality assurance.
  3. Designers -- This refers to "graphics designers", not design from the perspective of software engineering. That being said, there is a caveat that ultimately it would be best if the "designers" could contribute their work in the form of HTML and CSS.
  • Tracker -- How is the information here different than the information found on the Sugar Labs tracker page?

Indicate the types/categories of tickets listed on this page as well as the information available for each ticket.

Tickets are characterized by a Summary of the problem, the software Component to which the issue belongs, the Version (trunk or other branch), the Priority (major or minor), the Type (bug or enhancement), the Owner, the Status (new, accepted, assigned), and Created (the date reported). The information here is similar to that on the Sugar Labs tracker page. However, the Sahana project then goes on to offer a much deeper-level characterization of the issues (e.g., "Easy Bugs for Beginners").

  • Repository -- http://eden.sahanafoundation.org/wiki/InstallationGuidelines The installation guidelines begin here with the option to specify your operating system. For this exercise, choose Linux, then Developer, and finally Manually. At the bottom of the page click InstallationGuidelines/Developer/PostPython. Can you determine from the information provided here whether the project uses a web-based common repository or a local repo?

The main repository is on GitHub. Hence, like Sugar Labs, the use of git version control implies a distributed version control system in which each developer has their own local copy of the repo, with committed changes occurring in that local copy. Developers with write access would then push their locally committed changes to the main repository on GitHub.

  • Release cycle -- The Roadmap for each version consists of a collection of what Sahana calls "milestones". Versions listed there are 0.9, 1.0, 2.0.

Part B Activities

FOSS Field Trip Part 1 SourceForge

  • How many projects are there in this category? I chose the category "visualization" since algorithm visualization is an area of great interest to me. There are 618 projects in this area.
  • How many different programming languages are used to write software in this category? 15
  • List the top four programming languages used to write programs in this category. C++ (179), Java (156), C (103), Python (90)
  • Identify the meaning of each of the statuses below:
    • Inactive - obvious, so old as to no longer be of much practical use
    • Mature - on the edge of becoming obsolete
    • Production/Stable - highly usable and highly reliable
    • Beta - close to being ready for prime time
    • Alpha - works, but potentially quite a few bugs
    • Pre-Alpha - works (sort of), but too many bugs for any practical use
    • Planning - not far beyond the brainstorming stage
  • Compare two projects in this category that have two different statuses. Describe the differences between the statuses. Pyx is a project with alpha status. PyMOL Molecular Graphics System is a project with production/stable status. Because of its production/stable status, PyMOL is much more likely to be used.
  • Which projects are the most used? How do you know? Freeplot and Matplotlib based on sorting by most downloads
  • Pick a project in your category. TULIP
    • What does it do? Visualization of relational data
    • What programming language is the project written in? C++
    • Who is likely to use the project? How do you know this? Someone who has complex data in an RDB and wants to use visualization to enhance their understanding of the data
    • When was the most recent change made to the project? April 19, 2015
    • How active is the project? How can you tell? Quite active, with almost daily commits in 2015 from at least one of the developers
    • How many committers does the project have? The project lists six contributors
    • Would you use the project? Why or why not? Yes, if I had a large amount of data stashed in an RDB and was having trouble analyzing patterns in that data through standard SQL queries

FOSS Field Trip Part 2 OpenMRS at openhub.net

  • What is the main programming language used in OpenMRS? Java
  • How many lines of code does OpenMRS have? 3.87 million lines of code (or 6.1 million lines of code -- found conflicting information)
  • List some of the locations of the developers. United States, United Kingdom, France, South Africa, Sri Lanka
  • How many languages is OpenMRS written in? 15
  • What language has the second highest number of lines of code? JavaScript
  • Of the programming languages used in OpenMRS, which language has the highest comment ratio? Java (35.3%)
  • What is the average number of contributors in the last 12 months? 7 contributors
  • Scroll down to the Top Contributors section. How long have the top three contributors been involved in the project? 5 years (Jeremy Kelper), 3 years (Elliot Williams), 7 years (djazayeri)
  • Use the information on the project summary page to compute the 12-month average of commits. What is the average number of commits over the past 12 months? 64
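    • (For reference, this average is just the commit total for the trailing twelve months divided by 12; with purely hypothetical numbers, a project logging 768 commits in that window would average 768 / 12 = 64 commits per month.)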

Project evaluation activity

Time prohibited me from evaluating the (optional) secondary criteria for OpenMRS. However, my rating on the mission-critical criteria already resulted in an accumulation of 14 points, so it appears the cutoff of 20 would be reached upon a completed evaluation of all the secondary criteria.

FOSS in Courses Activity 1

The class in which I hope to use OpenMRS is Software Engineering II. Students in this class have typically had at least three previous Computer Science classes -- CS 1 and 2 (taught in Java) and Software Engineering I (a course in which they worked on teams developing a project of their own design in .NET). The SE II course is thus one where we can expect our students to be pretty good programmers, and, as such, there is a heavy emphasis on doing actual software development. Hence my goal isn't to introduce beginning students to the concept of FOSS but rather to provide a rigorous software development challenge to more advanced students. Consequently I need to lead my students through a progression of milestones where, by the end of the semester, they will have written a module that could actually become a real and useful component of the OpenMRS code base -- if it is done well enough. What should that progression of milestones be? This is where I still need to do a lot of exploring and will rely on advice from other POSSE participants who have had their students work in the OpenMRS code base. Right now I envision something like this:

  • Milestone 1. Students get the OpenMRS source code, install the development environment, read OpenMRS's 74-page "Developer's Guide", and create what the guide calls a HelloWorld MRS module. There will be a lot for them to digest here, so this is not something that will happen in the course of a couple of days, but rather more like a couple of weeks.
    • Question: Should they be installing the OpenMRS SDK or the Standalone version?
  • Milestone 2. OpenMRS evidently uses JUnit test cases. My students will not have used JUnit before this course. So have them explore the existing JUnit test cases for an area within OpenMRS and then write additional test cases in an area where the existing test cases do not seem sufficient. The goal here is to write test cases that make a component fail rather than test cases that all succeed. (A rough sketch of what such a test might look like appears after this list of milestones.)
  • Milestone 3. Examine the list of "introductory tickets" on the OpenMRS developer wiki. Have them claim one of these tickets and fix the problem.
  • Milestone 4. Select one of the "unassigned projects" on the OpenMRS wiki. Have them try to carry through work on one of these projects. I suspect it will be difficult for them to carry such a project all the way to completion in one semester, given the other milestones I have already listed. Another worry: the unassigned projects page seems to not have been updated since June 2012, so are any of these projects still realistically feasible?
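
To make Milestone 2 a bit more concrete, here is a minimal sketch of the kind of JUnit 4 test a team might write. PatientNameValidator is a hypothetical stand-in for whatever OpenMRS component a team actually chooses (in the real milestone the class under test would come from the OpenMRS code base), and the 50-character limit is an invented rule used only to illustrate probing a boundary condition.

  import static org.junit.Assert.assertFalse;
  import static org.junit.Assert.assertTrue;

  import org.junit.Test;

  // Hypothetical stand-in for an OpenMRS component; in the actual milestone the
  // class under test would come from the OpenMRS code base rather than be written here.
  class PatientNameValidator {
      boolean isValid(String name) {
          return name != null && !name.trim().isEmpty() && name.length() <= 50;
      }
  }

  public class PatientNameValidatorTest {
      private final PatientNameValidator validator = new PatientNameValidator();

      @Test
      public void rejectsEmptyName() {
          assertFalse("an empty name should not validate", validator.isValid(""));
      }

      @Test
      public void rejectsOverlongName() {
          // Deliberately probe a boundary condition (the invented 50-character limit);
          // a failing test of this kind is exactly the result this milestone is after.
          String tooLong = new String(new char[51]).replace('\0', 'a');
          assertFalse("a 51-character name should not validate", validator.isValid(tooLong));
      }

      @Test
      public void acceptsOrdinaryName() {
          assertTrue(validator.isValid("Amina Okafor"));
      }
  }

Whether OpenMRS itself is on JUnit 4 or a newer version is something I still need to confirm; the overall shape (small, focused test methods with descriptive assertion messages) carries over either way.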

Part C Activities

Bug Tracking Part 1 Bug Reports

  • Define what each of the column names below indicate. Include the range of possible values for 2-7 below. Feel free to explore beyond the page to find more information.
    • ID - the unique identifier of the bug
    • Sev - How severe the bug is, or whether it's an enhancement (critical, major, normal, minor, trivial, enhancement)
    • Pri - Engineers prioritize their bugs using this field (immediate, urgent, high, normal, low)
    • OS - operating system(s) on which the bug was observed (Advanced search would allow me to choose from 21 options)
    • Product - the particular software artifact in which the bug appears (types of artifacts fit into bindings, core, infrastructure, applications, platform, other)
    • Status - The Status field indicates the current state of a bug (unconfirmed, confirmed, in progress, resolved, verified)
    • Resolution - The Resolution field indicates what happened to this bug (open, fixed, invalid, wontfix, duplicate, worksforme)
    • Summary - The bug summary is a short sentence which succinctly describes what the bug is about
  • Describe how you discovered the definitions and how did you find the information from above - I used both the Reports link and Advanced search
  • Identify the order in which the bugs are initially displayed? By product and component within product
  • What is the meaning of the shading of some bug reports? The green or gold shading appears to be an indication of whether or not more bugs were opened than closed for a particular module
  • What is the meaning of the colors used when describing a bug (red, gray, black)? Wasn't able to track this down, but am assuming it is somehow tied to the urgency of the problem, with red being assigned to very few problems. However, those that are red appear to severely inhibit use of that particular artifact in the system.
  • Select a bug that you think that you might be able to fix and look at it more closely (click on the bug number). 517888 - Missing events of buttons 4 and 5
  • Identify when the bug was submitted. 2008-02-21 15:18 UTC by Flavio Percoco Premoli
  • Identify if there has been recent discussion about the bug? Most recent were in 2014
  • Is the bug current? Apparently not
  • Is the bug assigned? To whom? I do not think so. The most recent commenter on the bug, Andre Klapper, was asking in 2013 for someone to assign themselves to the bug.
  • Describe what you would need to do to fix the bug. Learn more about the event-handling system.
  • Repeat the previous step with a different kind of bug.
    • Select a bug that you think that you might be able to fix and look at it more closely (click on the bug number). 745393 - World clocks buttons seem not accessible in the calendar/notification area
    • Identify when the bug was submitted. 2015-03-01 23:27 UTC by Juanjo Marín
    • Identify if there has been recent discussion about the bug? The most recent comments were in the first four days of March 2015
    • Is the bug current? In the sense of not being resolved, apparently yes
    • Is the bug assigned? To whom? I do not think so. The most recent commenter on the bug was Florian Muellner
    • Describe what you would need to do to fix the bug. Better learn the accessible names of GUI components in the related application

Bug Tracking Part 2 Collective Reports

  • How many bug reports were opened in the last week? 329
  • How many were closed? 262
  • What was the general trend last week? Were more bugs opened than closed or vice versa? More opened than closed
  • Who were the top three bug closers? Why is this important to know? Matthias Clasen, Bastien Nocera, Carlos Soriano. They may be able to provide the most informed types of insight into various problems.
  • Who were the top three bug reporters? Are these the same as the top three bug closers? What is the overlap in these two lists? Bastien Nocera, Andreas Nilsson, m.rick.mac. One overlap
  • Who are the top three contributors of patches? Carlos Soriano, Sebastian Dröge, Bastien Nocera
  • Who are the top three reviewers of patches? Sebastian Dröge, Bastien Nocera, Matthias Clasen
  • What is the overlap between these lists and the bug closers and bug reporters? What is the overlap between patch contributors and patch reviewers? All three of the reviewers of patches overlap with the other lists
  • Plot a graph of the severity of bugs by component for Orca:
  • What class were the majority of the bugs for braille? Normal (126)
  • What other reports can you generate? Line graphs, pie charts, tables, csv

FOSS in Courses Activity 2

What I've done here is re-create the four project "milestones" I listed above in "FOSS in Courses Activity 1". Below each milestone, I've then created second-level bullets for the points we are asked to address in this activity. The third-level bullets are then my reactions/comments to those bullets.

Re-produced from "FOSS in Courses Activity 1": The class in which I hope to use OpenMRS is Software Engineering II. Students in this class have typically had at least three previous Computer Science classes -- CS 1 and 2 (taught in Java) and Software Engineering I (a course in which they worked on teams developing a project of their own design in .NET). The SE II course is thus one where we can expect our students to be pretty good programmers, and, as such, there is a heavy emphasis on doing actual software development. Hence my goal isn't to introduce beginning students to the concept of FOSS but rather to provide a rigorous software development challenge to more advanced students. Consequently I need to lead my students through a progression of milestones where, by the end of the semester, they will have written a module that could actually become a real and useful component of the OpenMRS code base -- if it is done well enough. What should that progression of milestones be? This is where I still need to do a lot of exploring and will rely on advice from other POSSE participants who have had their students work in the OpenMRS code base. Right now I envision something like this:

  • Milestone 1. Students get the OpenMRS source code, install the development environment, read OpenMRS's 74-page "Developer's Guide", and create what the guide calls a HelloWorld MRS module. There will be a lot for them to digest here, so this is not something that will happen in the course of a couple of days, but rather more like a couple of weeks. (A rough sketch of what such a module's activator class might look like appears after this list of milestones.)
    • Identify some possible learning outcomes that should be fulfilled with the activities/task.
      • Learning how to read through the documentation associated with a FOSS project
    • Describe any pre-requisite knowledge needed to complete the activity. This does not need to be a complete list.
      • Familiarity with version control and git in particular
    • Estimate the time required for instructor prep, for student completion and elapsed calendar time. Are you going to have to synchronize your activity with the community or can the activity/topic be covered independent of the HFOSS community schedule.
      • Two weeks. Could be completed independently of HFOSS community schedule
    • Think about possible input required from the HFOSS community. How much input is required and what kind?
      • Help on knowing the consequence of installing and working with the OpenMRS SDK versus the Standalone version.
    • If the result of the activity is contributed back to the HFOSS project, describe the contribution and its usefulness.
      • The hope would be to develop a condensed guide that others in CS education could use
    • Describe the assessment/grading approach - What will the basis for grading be? Will this be a team activity or individual? Is there a role for the HFOSS community in helping assess student work? For instance, must the work be committed or otherwise accepted by the community?
      • Individual activity. Success is the student's demonstrating a successful installation on their workstation
    • List any questions or concerns that you have about the activity/task.
      • See question above about whether to use OpenMRS SDK or Standalone version
        • From a more careful read of the documentation, which I did when I tried an install before traveling to Raleigh, I now see that, if students are actually to develop with the existing source code, then neither the SDK nor the Standalone version will suffice. They will have to do a complete manual install. However, I believe it is the case that, if a student were only asked to develop a brand new module to "plug into" the existing code without modifying that existing code, then they could get by with the SDK.
  • Milestone 2. OpenMRS evidently uses JUnit test cases. My students will not have used JUnit before this course. So have them explore the existing JUnit test cases for an area within OpenMRS and then write additional test cases in an area where the existing test cases do not seem sufficient. The goal here is to write test cases that make a component fail rather than test cases that all succeed.
    • Identify some possible learning outcomes that should be fulfilled with the activities/task.
      • Demonstrate enough understanding of the component to be able to formulate detailed test cases
    • Describe any pre-requisite knowledge needed to complete the activity. This does not need to be a complete list.
      • Ability to create test cases with JUNIT
    • Estimate the time required for instructor prep, for student completion and elapsed calendar time. Are you going to have to synchronize your activity with the community or can the activity/topic be covered independent of the HFOSS community schedule.
      • Three weeks. Could be completed independently of HFOSS community schedule
    • Think about possible input required from the HFOSS community. How much input is required and what kind?
      • Advice from those who have already gotten their head around some of the details of how OpenMRS employs JUNIT testing
    • If the result of the activity is contributed back to the HFOSS project, describe the contribution and its usefulness.
      • Here I would hope to have students work in teams of two. Each team will be responsible for developing test cases for a different component of OpenMRS. So in the end, my students' work would provide demonstrations of the benefits and pitfalls of working with JUNIT in a variety of OpenMRS components
    • Describe the assessment/grading approach - What will the basis for grading be? Will this be a team activity or individual? Is there a role for the HFOSS community in helping assess student work? For instance, must the work be committed or otherwise accepted by the community?
      • Thoroughness of test cases is essential. Somehow I would like to tie the students' grades to the number of failed test cases they are able to produce.
    • List any questions or concerns that you have about the activity/task.
      • Any precedent from previous HFOSS members who have attempted something similar?
  • Milestone 3. Examine the list of "introductory tickets" on the OpenMRS developer wiki. Have them claim one of these tickets and fix the problem.
    • Identify some possible learning outcomes that should be fulfilled with the activities/task.
      • Here the students would need to get much deeper into the internals of the particular OpenMRS software components associated with the ticket, so more detailed reading of documentation along with starting to tinker with the actual OpenMRS code.
    • Describe any pre-requisite knowledge needed to complete the activity. This does not need to be a complete list.
      • Hopefully nothing more than the previous two milestones
    • Estimate the time required for instructor prep, for student completion and elapsed calendar time. Are you going to have to synchronize your activity with the community or can the activity/topic be covered independent of the HFOSS community schedule.
      • Three weeks. I think the activity can be completed independently, although in a recent HFOSS IRC chat someone suggested the possibility of students at different schools working on cross-institutional teams. I like this idea, but clearly it would require careful coordination.
    • Think about possible input required from the HFOSS community. How much input is required and what kind?
      • Feedback on whether others who have gotten their heads around the OpenMRS project have had their students work on such an open ticket.
    • If the result of the activity is contributed back to the HFOSS project, describe the contribution and its usefulness.
      • Clearly fixing an open ticket would help improve the OpenMRS project's usability among its clientele
    • Describe the assessment/grading approach - What will the basis for grading be? Will this be a team activity or individual? Is there a role for the HFOSS community in helping assess student work? For instance, must the work be committed or otherwise accepted by the community?
      • One aspect is whether the students actually succeed in fixing the problem described in the ticket. If they do, then declare "complete success". If they don't, I need to develop criteria for partial success. Perhaps have them keep a log of their attempts and grade that log?
    • List any questions or concerns that you have about the activity/task.
      • See above -- how to grade students who don't fully succeed in carrying out the assigned task. Have others tried the suggestion I offer there regarding having students keep a detailed log of their work and grading that log as a measure of how well they plugged into the milestone and how hard they worked on it?
  • Milestone 4. Select one of the "unassigned projects" on the OpenMRS wiki. Have them try to carry through work on one of these projects. I suspect it will be difficult for them to carry such a project all the way to completion in one semester, given the other milestones I have already listed. Another worry: the unassigned projects page seems to not have been updated since June 2012, so are any of these projects still realistically feasible?
    • Identify some possible learning outcomes that should be fulfilled with the activities/task.
      • This milestone would clearly require the most knowledge of and familiarity with OpenMRS of all the milestones I have described.
    • Describe any pre-requisite knowledge needed to complete the activity. This does not need to be a complete list.
      • I think/hope nothing beyond completion of the previous three milestones?
    • Estimate the time required for instructor prep, for student completion and elapsed calendar time. Are you going to have to synchronize your activity with the community or can the activity/topic be covered independent of the HFOSS community schedule.
      • Six weeks. Again I don't think synchronization is necessary unless my students are working in conjunction with those at a different institution
    • Think about possible input required from the HFOSS community. How much input is required and what kind?
      • Feedback from others who may have had their students attempt something this extensive.
    • If the result of the activity is contributed back to the HFOSS project, describe the contribution and its usefulness.
      • Completing an unassigned project at the level where the OpenMRS community would accept it would clearly be beneficial to the project. But see next third-level bullet ...
    • Describe the assessment/grading approach - What will the basis for grading be? Will this be a team activity or individual? Is there a role for the HFOSS community in helping assess student work? For instance, must the work be committed or otherwise accepted by the community?
      • Completing an unassigned project to the point where it is accepted by the OpenMRS community may be a goal that none of my students can achieve in the course of a six-week milestone. So again I need to develop a grading rubric in which not reaching that goal does not necessarily result in a bad grade. Here again, as I discussed under milestone 3, my current thinking is to have each group of students keep a careful log of their work so I can grade the progress they self-document in that log.
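
As a rough illustration of the Milestone 1 deliverable, here is my current sketch of what the activator class of a HelloWorld-style module might look like. This is based on my reading of the documentation so far, and it assumes the usual OpenMRS convention of extending BaseModuleActivator; the exact class and package names, along with the config.xml and Maven packaging that the SDK generates around this class, should be taken from the Developer's Guide rather than from this snippet.

  package org.openmrs.module.helloworld;

  import org.openmrs.module.BaseModuleActivator;

  // Assumes the openmrs-api dependency is on the classpath (e.g. via the Maven
  // project the SDK generates); the package name follows the usual
  // org.openmrs.module.* convention but is otherwise a placeholder.
  public class HelloWorldActivator extends BaseModuleActivator {

      @Override
      public void started() {
          // Called once OpenMRS has started the module; a simple log line is enough
          // for a student to confirm that the module was packaged and loaded.
          System.out.println("HelloWorld module started");
      }

      @Override
      public void stopped() {
          System.out.println("HelloWorld module stopped");
      }
  }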

An extremely valuable outcome from the POSSE meeting for me would be to work with a group of other POSSE participants to develop a virtual machine, using VMware or VirtualBox, that was "ready" for OpenMRS development. We could give that VM to our students so they could hit the ground running in working on the OpenMRS project.
