User:Jpaul

Jody Paul

Jody Paul is on the faculty of the Mathematical and Computer Sciences department at Metropolitan State University of Denver (MSU Denver), an urban institution with a modified open-admission policy.

Dr. Paul's main areas of interest include Computer Science Education, Software Engineering, and Cognitive Science.

Observations made while exploring HFOSS projects

Sugar Labs

Contributions

  • The roles most applicable to my students: Content Writer, Developer, Designer, Translator.
  • Commonalities and differences across roles: all but Educator involve "communicate"; each represents a different skill set.

Tracker

  • The general process for submitting a bug uses Trac for issue tracking: a bug is reported by creating and filing a new ticket, which requires a registered login.
  • The types/categories of tickets listed include: defect, enhancement, and task. Each ticket carries many fields, such as: Reporter (author), Type, Component, Version, Keywords, Priority, Milestone, Owner, Cc list, Resolution, Status, Summary, and Description (a query sketch follows this list).
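
Because Sugar Labs uses a standard Trac instance, these ticket fields can also be inspected programmatically through Trac's query interface, which supports CSV export. The following is a minimal sketch, assuming the tracker remains reachable at bugs.sugarlabs.org and that the default query fields are exposed; both assumptions should be verified against the live site.

  import csv
  import io

  import requests  # third-party HTTP library: pip install requests

  # Assumed Trac host for Sugar Labs; verify before use.
  TRAC_QUERY_URL = "https://bugs.sugarlabs.org/query"

  def fetch_tickets(ticket_type="defect", status="new"):
      """Fetch matching tickets via Trac's built-in CSV export."""
      params = {
          "type": ticket_type,   # defect, enhancement, or task (see above)
          "status": status,
          "format": "csv",       # Trac renders query results as CSV
      }
      response = requests.get(TRAC_QUERY_URL, params=params, timeout=30)
      response.raise_for_status()
      return list(csv.DictReader(io.StringIO(response.text)))

  if __name__ == "__main__":
      for ticket in fetch_tickets()[:5]:
          # Column names mirror the ticket fields listed above.
          print(ticket.get("id"), ticket.get("summary"))

Note that filing a ticket (as opposed to reading one) still requires the registered login mentioned above.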

Repository

  • As of Mar 7, 2017, the repository's last commit was dated Feb 5, 2017, but it appears to correspond to a commit originally made on Oct 10, 2016.

Release cycle

  • Each release cycle includes "development, beta, release candidate and final releases."[1] The roadmap provides greater detail with scheduled release dates and freeze points. The most recent item on the roadmap schedule is dated 1 October 2016.

Sahana Eden

Community

  • The grouping of contributors is relatively similar to that found in Sugar Labs but appears to be more focused on development than communication. The types of contributions associated with three of the groups include:
    • Developers - Code level design and implementation contributions
    • Testers - Both manual and automated testing; allows for non-technical contributions
    • Designers - Primarily user interface design contributions

Tracker

  • The Sahana Eden project, like Sugar Labs, uses Trac for issue tracking. The information contained in a report of active tickets [2] is essentially the same as in the Sugar Labs report, with minor differences in which fields are shown (e.g., Sahana Eden shows "Component", "Version", and "Created" fields whereas Sugar Labs shows "Milestone"), and Sahana Eden includes a distinct "documentation" type. Sahana Eden also provides more pre-constructed reports to facilitate searching and viewing the Trac database (compare [3] with [4]).

Repository

  • As of Mar 7, 2017, the date of the most recent commit was Mar 7, 2017.

Release cycle

  • Sahana Eden appears to use the Milestone and Roadmap structure from Trac. The roadmap on the website [5] appears not to have been updated: Milestone 0.9 is shown as "5 years late" and Milestone 1.0 as "Planned for: May 2012 (Draft)".

FOSS Field Trip

GitHub

Search for term "education" produced: "We’ve found 11,995 repository results"

First project: "vhf/free-programming-books". Its Graphs > Commits page shows repository commit activity by week and by day.

Search for term "humanitarian" produced: "We’ve found 284 repository results"

The HTBox/crisischeckin project (as of 20 March 2017) shows: "Updated on Nov 4, 2016"

Search for phrase "disaster management" produced: "We’ve found 138 repository results"

OpenHub

Search for term "education" indicated 347 pages. Page 347 has one project. Assuming all other pages have 10, there are 3461 projects.

All KDE Education code locations appear to be at "anongit.kde.org", none indicate "github.com".

Ten projects are identified as similar to KDE Education.

OpenHub provides project information such as summaries and news, as well as data about the code base, repository activity, and contributors.

Search for term "humanitarian" returned 34 projects.

Search for phrase "disaster management" returned 54 projects.

"Activity Not Available" is described begin associated with "projects that do not have recent analysis because of problems with their code locations or other problems blocking Open Hub from collecting and analyzing code". (http://blog.openhub.net/about-project-activity-icons/ accessed 20 March 2017)

The "Orgs Explore - Open Hub" page shows activity statistics arranged by organization and type of organization.

The OpenMRS Core project on OpenHub (as of 20 March 2017) shows a last commit date of "18-August-2016". It also shows "Activity Not Available" (described previously).

The OpenMRS Core project on GitHub (as of 20 March 2017) shows multiple commits on 20 March 2017.

The "Activity Not Available" indicator on OpenHub points to a disconnect between the information available to and shown on OpenHub and the project repository hosted on GitHub.

GitHub and OpenHub provide different information with some overlap; understanding the limitations of each helps in interpreting search results at either site. SourceForge is yet another such site that may warrant searching. General-purpose search engines (e.g., Google, Bing, Yahoo, DuckDuckGo) may be more effective for locating projects than the search features of specific repository hosts.

Project Evaluation

Walk-through of an evaluation of the OpenMRS project

The table below contains entries for each of the evaluation criteria in the Project Evaluation Learning Activity. The evaluation for each criterion is recorded in the "Evaluation Data" column. A score is assigned in the "Level" column: 0 indicates that the criterion is not met at all, 2 indicates that it is fully met, and 1 indicates something in between. The project's overall total is the sum of the scores for all criteria.

Evaluation Key
  • Licensing - Score 2 if the product has a free software or open source software license. Score 0 for other licenses or if the license is missing.
  • Language - Score 2 if the language is your most preferred choice. Score 1 for less preferred languages or if your preferred language is only a small part of the product. Score 0 if the language is not suitable for your needs.
  • Level of Activity - Score 2 if you judge all the quarters in the last year as being active. Score 1 if some of the quarters in the last year have been active. Score 0 if there have been no active quarters in the last year.
  • Number of Contributors - Score 2 if there are 10 or more contributors. Score 1 if there are 3-9 contributors. Score 0 if there are only 1 or 2 contributors. Note that these numbers are based on the fact that most projects have only 1-2 contributors, and the score assumes you are interested in contributing to a larger, clearly established project. If you would prefer to work with a smaller, less well-established project, then adjust your scoring to reflect that.
  • Size - Scoring for size depends on your objectives in contributing to a project. A project with little or no code should probably be scored 0. For projects that have an established code base, you might think about whether there is a "sweet spot" for code base size that you think would be ideal for your needs. If you can define that, then score projects in that range as 2. Score projects that are neither 0 nor 2 as 1. If you don't know what size would be appropriate, then score anything over a reasonable minimum (suggestion: 10,000 lines) as 1.
  • Issue Tracker - Score 2 if issues are being actively added and resolved. Score 0 if there is no issue tracker or no sign of recent activity. Score 1 if there is activity but it is very low or sporadic.
  • New Contributor - Score 2 if there are clear instructions and welcome for new contributors (positive answers to at least 3 of the learning activity questions). Score 0 if there is little or no evidence of welcome or instructions for new contributors. Score 1 for anything in between.
  • Community Norms - Score 2 if there is a documented and easy to locate statement of community norms that is welcoming and inclusive. Score 0 if there is any evidence of rude, unprofessional, harassing or other undesirable behavior. Score 1 if there are no signs of poor behavior but there is no stated code of conduct.
  • User base - Score 2 if there clearly is an active and engaged user base. Score 0 if there is little or no evidence that the product is actually being used by anyone beyond the development team. Score 1 if there is some evidence of use but not much.
Evaluation Table (Evaluation Factor | Level (0-2) | Evaluation Data)

  • Licensing | 2 | Mozilla Public License 2.0 (MPL-2.0)
  • Language | 2 | Java (95.5%), SQLPL (2.9%), GAP (0.7%)
  • Level of Activity | 1 | Relatively low activity over the past 12 months
  • Number of Contributors | 1 | 253 contributors are listed. Judging by the graphs, fewer than a handful have had significant activity in the past 5 years. The pulse for the last month, however, shows "16 authors have pushed 84 commits to master and 116 commits to all branches."
  • Product Size | 0 | 218.35 MB (hundreds of thousands of lines of text if 1 character = 1 byte)
  • Issue Tracker | 1 | No issues logged via GitHub. In the OpenMRS issue tracker, Ready for Work: 0 + 10 + 87 + 351 + 251 + 83 + 471 = 1253; Closed: 1 + 323 + 1615 + 3058 + 1252 + 487 + 3112 = 9848. Low/sporadic activity.
  • New Contributor | 2 | The Developer Guide is a very useful starting point. Getting Started as a Developer is somewhat welcoming and accessible. The OpenMRS Developers Guide is rather daunting.
  • Community Norms | 2 | The OpenMRS Code of Conduct is based on the Ubuntu Code of Conduct, addresses norms of collaborative behavior, and specifies penalties for violations of the code. Arbitrary conversations read on OpenMRS Talk appear to adhere to the code, stay relevant, and often link to documented information.
  • User Base | 2 | The atlas shows numerous locations where OpenMRS is or has been used, although clicking on many clinical sites brings up the notation "Why is this site fading away?" The 2016 Annual Report states that "Based on documented reports, OpenMRS is currently in use in 1,845 locations around the world." Instructions are provided for downloading, setting up, and using the software.
  • Total Score | 13 | This provided a very useful starting point for assessing the appropriateness of an HFOSS project for use in a course. Having had this experience, I can see where and how I would modify the criteria and the interpretation of evaluation data to facilitate appropriate project choice. For example, using differently scaled levels and a weighted sum for the total score (sketched below) may better reflect the variance and relative importance of the evaluation factors.
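
The scoring in the table, including the weighted-sum variant suggested in the Total Score row, is easy to express as code. A minimal sketch; the weights below are illustrative placeholders, not values drawn from the evaluation activity:

  # Levels recorded in the evaluation table above.
  LEVELS = {
      "Licensing": 2,
      "Language": 2,
      "Level of Activity": 1,
      "Number of Contributors": 1,
      "Product Size": 0,
      "Issue Tracker": 1,
      "New Contributor": 2,
      "Community Norms": 2,
      "User Base": 2,
  }

  def total_score(levels, weights=None):
      """Plain sum of levels, or a weighted sum if weights are given."""
      if weights is None:
          return sum(levels.values())
      # Factors not listed in weights default to weight 1.
      return sum(weights.get(name, 1) * level for name, level in levels.items())

  if __name__ == "__main__":
      print(total_score(LEVELS))  # 13, matching the table's total
      # Hypothetical weights emphasizing activity and community factors:
      print(total_score(LEVELS, {"Level of Activity": 2, "Issue Tracker": 2, "Community Norms": 2}))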