User:BBurd

Barry Burd

Professor of Mathematics and Computer Science at Drew University in Madison, NJ.

Author of Java For Dummies and other books in the For Dummies series.

Leader of the 2017 ITiCSE working group on IoT in Computer Science education.

Dr. Burd is an avid indoor enthusiast. In his spare time, he enjoys sleeping, eating, and talking.

Notes from POSSE assignments

Sugar Labs notes

Roles for participation are Educator, Content Writer, People Person, Developer, Designer, and Translator. Ideally, I'd like my students to be developers. Some of them could be translators because my university has many international students. Some students would fit best in the role of Designer, but I'm not the designer type, so I wouldn't be very useful as a mentor for them in that role.

To submit a bug for Sugar Labs: (1) find the correct repository (with /sugarlabs/sugar being the default), (2) open the Issues tab and click the New Issue button, and (3) write a note describing the issue. The repositories for issues include the default (sugar), the toolkit (gtk3), docs, artwork, sugarlabs, and build. I'm not absolutely sure, but I think the difference between "sugar" and "sugarlabs" is as follows: sugar is about the Sugar shell itself; sugarlabs is about the project's web site. Am I correct?
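
The same three steps can also be scripted. Here's a minimal sketch (not part of any official Sugar Labs workflow) that files an issue through GitHub's REST API; the GITHUB_TOKEN environment variable, the example title, and the example body are placeholders of my own.

 import os
 import requests
 
 def open_issue(repo, title, body):
     """Call POST /repos/{owner}/{repo}/issues -- the API behind the New Issue button."""
     response = requests.post(
         "https://api.github.com/repos/" + repo + "/issues",
         headers={
             "Authorization": "token " + os.environ["GITHUB_TOKEN"],
             "Accept": "application/vnd.github+json",
         },
         json={"title": title, "body": body},
     )
     response.raise_for_status()
     return response.json()["html_url"]
 
 # Hypothetical example: the title and body below are placeholders, not a real bug.
 print(open_issue(
     "sugarlabs/sugar",
     "Example issue title",
     "Steps to reproduce, expected result, and actual result go here.",
 ))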

In the Sugar repository, the most recent commit (d3660ac) was 9 days ago (as of today, Oct 14, 2017).
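
The same fact can be checked programmatically. A quick, hedged sketch using GitHub's public commits endpoint (unauthenticated, so it's rate-limited):

 import requests
 
 # Ask GitHub for just the newest commit on sugarlabs/sugar.
 response = requests.get(
     "https://api.github.com/repos/sugarlabs/sugar/commits",
     params={"per_page": 1},
 )
 response.raise_for_status()
 latest = response.json()[0]
 # Prints the short hash and the commit date (d3660ac as of the note above).
 print(latest["sha"][:7], latest["commit"]["committer"]["date"])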

Roadmap and release cycle: The roadmap is the plan for development, which includes release dates, freeze points, lists of module dependencies, and other items. (The Sugar roadmap page is currently empty.) The release cycle is the timing of releases; each cycle includes development, beta, and release-candidate phases leading up to the final release.

Sahana Eden notes

Roles for participation are Developer, Tester, Bug Marshal, Newsletter Report Writer, Documentor, Translator, Designer, SysAdmin, and GIS Specialist. Developer, Designer, and Translator are also in the Sugar Labs project. Now I'm noticing that Sugar Labs doesn't have Tester, which would seem to be an important role. Sahana is heavy on testing because, in addition to Tester, it also has a Bug Marshal role. In Sahana, there's also a distinction between Documentor and Newsletter Report Writer. Interesting! Sugar Labs has an Educator role and Sahana doesn't, but Sahana has the Newsletter Report Writer role.

Superficial observation: Sugar Labs uses GitHub for listing its issues; Sahana seems to use an issue tracker of its own. Another observation (probably also superficial) is that the Sahana report page is tree-shaped, with summaries at the root branching out to individual issues. The Sugar Labs page is flat, so it shows only the issues themselves.

Sahana's Active Tickets page includes 141 tickets. They're grouped by priority into major, minor, and trivial, and classified by type as enhancement, defect/bug, documentation, and task. The information for each ticket also includes a summary, the component involved, the version (trunk or test), the ticket's owner (very important), its status, and the date it was created. Drilling down into a ticket, I see a description with actual result versus expected result, attachments, and a change history.
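
Sahana's tracker appears to be a Trac instance, and Trac report pages can usually be exported as CSV, so the same breakdown could be tallied with a few lines of Python. The report URL and the column names below ("priority", "type") are assumptions about the Sahana Eden tracker, not verified details:

 import csv
 import io
 from collections import Counter
 import requests
 
 # Assumed export link: Trac report pages typically offer a "format=csv" download.
 CSV_URL = "http://eden.sahanafoundation.org/report/1?format=csv"
 
 response = requests.get(CSV_URL)
 response.raise_for_status()
 tickets = list(csv.DictReader(io.StringIO(response.text)))
 
 print("Total tickets:", len(tickets))
 print("By priority:", Counter(t.get("priority", "?") for t in tickets))
 print("By type:", Counter(t.get("type", "?") for t in tickets))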

On the Sahana repository page, the most recent commit was today (October 15). Its hash is d9f2502.

Ouch! The roadmap page says Milestone 0.9.0 is 6 years late! Features are listed for Milestones 1.0 and 2.0 but no dates have been set for those milestones.

GitHub/OpenHub notes

On GitHub, searching for Education, I find 15,901 repository results. The first result (nodejs/education) has tabs for Code, Issues, Pull requests, Projects, Wiki, and Insights. The Code page has links to the .md files and a copy of the Readme.md. The project is about what it means to be learning Node.js. I don't see a Graphs/Commits link, but I do see Commits. The Commits list shows the activities performed on this project, including pull requests, branches, and (apparently) changes such as "Improved markdown rendering of lists." Humanitarian has 332 repository results. Under Humanitarian, HTBox/crisischeckin was last updated on April 22. Disaster Management has 174 results.
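
For what it's worth, these result counts can also be pulled from GitHub's search API, which makes it easy to re-run the searches later (the numbers above will drift over time). A small sketch:

 import requests
 
 def count_repositories(query):
     """Return the total number of repositories matching a search query."""
     response = requests.get(
         "https://api.github.com/search/repositories",
         params={"q": query, "per_page": 1},
     )
     response.raise_for_status()
     return response.json()["total_count"]
 
 for term in ["education", "humanitarian", "disaster management"]:
     print(term, "->", count_repositories(term), "repositories")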

On to OpenHub... For the Education search I see 225 pages of results, with (I'm assuming) 10 results per page except possibly the last page. The 225th page has 8 results, so the grand total is 2248. For the KDE project, there are 23 code locations; I don't see any of them on GitHub. Four projects are listed as being similar to the KDE project. OpenHub provides lots of information about the KDE Ed project, including a Project Summary, Quick Reference, License, and charts for Code, Activity, and Community. Humanitarian seems to have only 11 projects, and Disaster Management has 29. There's a lot of "Activity Not Available" for the Disaster Management projects; I'm not sure why. The Organizations page lists organizations that contribute to OpenHub. The most active are GNOME and Nuxeo; others include Debian, Gentoo, and KDE. The last commit for OpenMRS Core was on October 10. According to GitHub (as opposed to OpenHub), the last commit for OpenMRS Core was also on October 10.
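
Spelling out that page arithmetic (224 full pages of 10 results, plus a short last page):

 # 224 full pages at 10 results each, plus 8 results on the 225th page.
 full_pages, per_page, last_page_results = 224, 10, 8
 print(full_pages * per_page + last_page_results)  # 2248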

As for the benefits and drawbacks of using both GitHub and OpenHub: each site may have information that the other lacks.


Project Evaluation Rubric for OpenMRS

Evaluation Factor | Level (0-2) | Evaluation Data
Licensing | 2 | https://opensource.org/licenses/MPL-2.0
Language | 2 | Java 95.4%; SQLPL 3.0%; GAP 0.7% (I'm not sure what GAP is, but it's less than 1%.)
Level of Activity | 2 | Only four weeks with no activity in the past year.
Number of Contributors | 2 | 271 contributors
Product Size | 1 | 220.81 MB
Issue Tracker | 2 | Ready for Work: 1; Closed: 1; seems to be quite active. (My Ready for Work and Closed counts don't seem to be consistent with the questions in the POSSE exercise.)
New Contributor | 2 | https://wiki.openmrs.org/display/docs/Getting+Started+as+a+Developer and OpenMRS Talk (https://talk.openmrs.org/), where the most recent activity is only a few hours old
Community Norms | 2 | https://wiki.openmrs.org/display/docs/Code+of+Conduct (be considerate, respectful, and collaborative); I see no disrespectful discussion on the Talk page
User Base | 2 | Download instructions at https://github.com/openmrs/openmrs-core#build (see also http://openmrs.org/download/); demos at http://openmrs.org/demo/
Total Score | 17 |

Some of my students need to work on very small projects. Otherwise, this would be ideal.
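
A few of the rubric numbers above (the language mix, the contributor count, and the recency of activity) can be pulled straight from the GitHub API instead of being read off the web pages. A rough sketch; note that unauthenticated requests are rate-limited and that the contributors endpoint is paginated, so the count below only covers the first page:

 import requests
 
 REPO = "https://api.github.com/repos/openmrs/openmrs-core"
 
 # Language mix, reported by GitHub in bytes of code per language.
 languages = requests.get(REPO + "/languages").json()
 total = sum(languages.values())
 for language, size in sorted(languages.items(), key=lambda item: -item[1]):
     print("%s: %.1f%%" % (language, 100.0 * size / total))
 
 # Contributor count (first page only; follow the Link header for the rest).
 contributors = requests.get(REPO + "/contributors", params={"per_page": 100}).json()
 print("Contributors on the first page:", len(contributors))
 
 # Date of the most recent commit.
 latest = requests.get(REPO + "/commits", params={"per_page": 1}).json()[0]
 print("Last commit:", latest["commit"]["committer"]["date"])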

Notes on Licenses

The OpenMRS license is the Mozilla Public License 2.0 with Healthcare Disclaimer.

 Can: Commercial use, Modify, Distribute, Sublicense, Place warranty, Use patent claims
 Cannot: Hold liable
 Must: Include copyright, Include license, Disclose source, Include original

The apache/fineract license is the Apache License, Version 2.0.

 Can: Commercial use, Modify, Distribute, Sublicense, Private use, Use patent claims, Place warranty
 Cannot: Hold liable, Use trademark
 Must: Include copyright, Include license, State changes, Include notice

The regulately project's license is the MIT License.

 Can: Commercial use, Modify, Distribute, Sublicense, Private use
 Cannot: Hold liable
 Must: Include copyright and license

Any of these licenses seems reasonable for me to use, but honestly, I don't have enough experience to choose among them.

FOSS in My Courses

In Andy Lester's 14 Ways to Contribute to Open Source without Being a Programming Genius or a Rock Star, I found four activities that caught my attention as viable for beginning students (CS 1 students, for example). They are...

  • Test a beta or release candidate
  • Work with documentation; create an example
  • Suggest new features or options
  • Translate into another language

Every course that I teach has a different feel to it, so I don't like making detailed plans for an activity until I've gotten the feel for a class and its students. But these four activities seem accessible for students who don't have a lot of coding experience and don't feel comfortable looking at large quantities of code.

Beta testing is something that almost anyone can do, and testing by novices often yields bugs (or uncomfortable features) that professionals might not catch. Working with documentation is good because people can use the software, struggle through the documentation, and add examples/anecdotes/explanations of their own. As an author, I value the ability to put technical ideas into words. Suggesting new features and options is always valuable, and it probably comes as a natural side effect of Item 1 (testing a beta or release candidate). As for the fourth activity (translating into another language), we have many international students at Drew, so they'd be in a good position to do this. (They'd have to check each other's work because I'm not even a beginner in many of the languages that they speak.)

Notes on Bug Tracking

For the GNOME Accessibility Bugs, here are some notes on the bug list and its columns.

I got much of this information from https://bugzilla.gnome.org/page.cgi?id=reports.html.

The bugs are initially displayed in increasing order of ID, but you can change this by clicking column headings.

Each issue is color-coded by priority and severity: grey (a low-priority enhancement, for example), black (a normal-priority, normal-severity bug, for example), or red (a high-priority critical bug, for example).

For the "Overall On/Off status not indicated" issue, the bug was submitted in 2003. It was assigned to gnome-applets Maintainers. There's no recent discussion about the bug. To fix the bug, one needs to add a status indicator. This means figuring out how the status indicator should appear, finding out what API calls are needed to detect the status and to make the status appear that way, and then implementing the API calls in the code.

For the "make desktop icon text easier to read" issue, the bug was submitted in 2005 but the latest modification is August of 2017. Fixing this issue means either changing the specs on the font (if that can be done) or replacing the icon.

In the last week, 136 bug reports were opened and 145 were closed. (That's good!)

The top bug closers were Alexandre Franke, Florian Müllner and Philip Withnall. The top bug reporters were Ralf, Debarshi Ray and Christian Persch. There's some overlap in the lists of closers and reporters, but not a lot. The top patch reviewers were Philip Withnall, Sebastian Dröge and Rui Matos. Again, some overlap. Three or four names in common between patch reviewers and patch contributors.

Based on a graphically generated report, the majority of bugs for orca's braille component were of normal severity. The reporting system seems flexible enough to generate reports with any x-axis and any y-axis for any product.
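
If the graphical report tool ever feels limiting, the same tally can be computed by hand. A hedged sketch, assuming that GNOME's Bugzilla exposes the standard /rest/bug endpoint and that "orca" and "braille" are the right product and component names (both are guesses on my part):

 from collections import Counter
 import requests
 
 response = requests.get(
     "https://bugzilla.gnome.org/rest/bug",
     params={"product": "orca", "component": "braille", "limit": 0},
 )
 response.raise_for_status()
 bugs = response.json().get("bugs", [])
 
 # Tally bugs by severity, roughly what the report did with severity on one axis.
 print(Counter(bug["severity"] for bug in bugs))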

Structuring FOSS Activities in my CS1 Course

Each of the activities listed below is suitable for an in-class activity (spread over several class periods), for homework, or for a project. Students can work independently or in groups.

  • Test a beta or release candidate
   Learning outcome: To learn not to be satisfied with code that minimally works; to become more critical of code that they write 
   Prerequisite knowledge: Almost none
   Instructor prep time: (Unknown)
   Student completion time: 5 hours or more
   Need to sync with the HFOSS community schedule: No
   Input required from HFOSS community: None
   Usefulness of the activity's results being contributed back to the project: It's possible that a student will find a bug that hasn't been detected by anyone else
   Grading: The work need not be accepted by the community. Grading is based on the degree to which the student has done a critical assessment of the beta or release candidate (as opposed to a superficial assessment).
   My questions about this activity: With some HFOSS projects, will it be difficult or time-consuming for me to assess the value of the student's contribution?
   Possible stumbling blocks: What if the project is very mature so that tests by novices don't yield any bugs?
  • Work with documentation; create an example
   Learning outcome: To become a better technical writer
   Prerequisite knowledge: Writing skills
   Instructor prep time: (Unknown)
   Student completion time: 5 hours or more
   Need to sync with the HFOSS community schedule: No
   Input required from HFOSS community: None
   Usefulness of the activity's results being contributed back to the project: Examples in the documentation have obvious benefit, especially if they've been supplied by an inexperienced user
   Grading: The work need not be accepted by the community. Grading is based on the clarity of the documentation that the student provides.
   My questions about this activity: (None)
   Possible stumbling blocks: (None that I can think of)
  • Suggest new features or options
   Learning outcome: To better understand user experience issues
   Prerequisite knowledge: Almost none
   Instructor prep time: (Unknown)
   Student completion time: 3 hours or more
   Need to sync with the HFOSS community schedule: No 
   Input required from HFOSS community: None
   Usefulness of the activity's results being contributed back to the project: Great if the new feature or option is eventually adopted
   Grading: The work need not be accepted by the community. Grading is based on the appropriateness of the new features or options.
   My questions about this activity: With some HFOSS projects, will it be difficult or time-consuming for me to assess the value of the student's contribution?
   Possible stumbling blocks: This might be quite difficult for students, so students might be tempted to suggest superficial or less-than-useful features/options.
  • Translate into another language
   Learning outcome: To become a better technical writer
   Prerequisite knowledge: Near fluency in a language not currently supported by the project
   Instructor prep time: (Unknown)
   Student completion time: Varies widely with the amount of text being translated
   Need to sync with the HFOSS community schedule: No 
   Input required from HFOSS community: None
   Usefulness of the activity's results being contributed back to the project: May expand the user base for the project
   Grading: It's all the better if the work is accepted by the community, because the work will be in languages that I don't know and cannot evaluate myself.
   My questions about this activity: Is there a reliable way to assess the student's contribution without waiting for the community's response?
   Possible stumbling blocks: What if no one is available to verify that a student's translation is adequate?