Project Evaluation Activity V2



Title

Project Evaluation Activity

Overview

Learners will gain an understanding of the breadth of available FOSS projects, as well as of the identifying characteristics of FOSS projects, including patterns of contributions and commits, the programming languages used, and more.

Prerequisites

Completion of the Browsing a Forge activity or an understanding of SourceForge and OpenHub; an understanding of the course in which students will participate in an HFOSS project.

Learning
Objectives
After successfully completing this activity, the learner should be able to:

Utilize the rubric to identify likely HFOSS projects.

Process Skills
Practiced


Background

List of HFOSS projects

This activity is intended to give you an overview of what to consider when evaluating an HFOSS project for student participation and to give you experience using the rubric.

Directions

Part 1-Learn about the rubric

Watch these videos introducing the FOSS project evaluation criteria:

  1. Mission critical criteria
  2. Secondary criteria

Or read the SIGCSE paper on evaluating FOSS projects.

Part 2-Walk-through of an evaluation of the OpenMRS project - Use the blank evaluation template to record your results and the rationale for your scoring.

Mission Critical criteria-Viability
Recall that each component is given a score from 1 to 3, where 3 is the best.
  1. Size/Scale/Complexity - An ideal project should be neither overly simple nor overly complex. One heuristic to use is the number of contributors as an indicator of project complexity.
    1. Go to the OpenMRS web page (http://openmrs.org/), scroll to the bottom, and choose OpenMRS Wiki (under Other OpenMRS sites). From the menu on the left, expand the Developer Guide and Getting Started as a Developer options and then choose Technical Overview. Examining the technology stack, the architecture looks modular, and further searching shows it is documented elsewhere on the site. This provides a first look at the complexity of the application and the number and variety of technologies involved.
    2. Based upon the results from OpenHub (gathered in the FOSS Field Trip activity) and the information from the OpenMRS Technical Overview page, think about the size of the code base and how many different technologies and layers are involved in the application. What would you score this project for size/scale/complexity?
  2. Activity - To support student participation a project should be reasonably active. Number of commits can be used as an indicator of activity.
    1. Based upon the number of commits (gathered in the FOSS Field Trip activity), how would you rate the activity of the project? (A scripted way to gather contributor and commit counts is sketched after this list.)
  3. Community - A suitable project has an active user community. While it is difficult to quantitatively evaluate the activity of a user community, some indicators include a regular history of project downloads and documentation updates over time, current activity on user mailing lists, and testimonials on the project web site.
    1. Examine download activity
      1. Go to sourceforge.net and enter OpenMRS into the search box.
      2. Choose OpenMRS from the search results.
      3. Click on the number of downloads that is listed on the project page.
      4. Change the date range to give a graph of downloads over the last year.
    2. OpenMRS has begun migrating legacy mailing list activity to OpenMRS Talk. Examine the discussion activity there.
    3. Examine the IRC logs
    4. Based upon the download history, discussion activity, and IRC activity, what score would you give this project for community?
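
If you would rather gather the contributor and commit numbers programmatically instead of (or in addition to) reading them off OpenHub, the minimal sketch below pulls them from the GitHub API. It assumes the project's main code base is the openmrs/openmrs-core repository on GitHub and that the third-party requests package is installed; unauthenticated GitHub requests are rate limited, and the counts are rough indicators only, not a substitute for your judgment.

    # Minimal sketch: estimate contributor count and recent commit activity via the GitHub API.
    # Assumptions: openmrs/openmrs-core is the project's main repository and the third-party
    # 'requests' package is installed (pip install requests). Treat the numbers as rough
    # indicators; GitHub may truncate very long contributor lists.
    from datetime import datetime, timedelta, timezone
    from urllib.parse import parse_qs, urlparse

    import requests

    API = "https://api.github.com"
    OWNER, REPO = "openmrs", "openmrs-core"   # assumed repository

    def count(endpoint, **params):
        # Request one item per page; the page number in the Link header's rel="last"
        # entry then equals the total number of items.
        params["per_page"] = 1
        resp = requests.get(f"{API}/repos/{OWNER}/{REPO}/{endpoint}", params=params, timeout=30)
        resp.raise_for_status()
        last = resp.links.get("last")
        if last is None:                      # only one page of results
            return len(resp.json())
        return int(parse_qs(urlparse(last["url"]).query)["page"][0])

    one_year_ago = (datetime.now(timezone.utc) - timedelta(days=365)).isoformat()
    print("Contributors:", count("contributors", anonymous="true"))
    print("Commits in the last year:", count("commits", since=one_year_ago))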


Mission Critical criteria-Approachability
Here you are evaluating a project's on-ramp to contribution, scoring as follows:
1-Insufficient-Few or no pointers on how to become involved.
2-Sufficient-Suggestions about how to get involved (other than contributing money), with accompanying high-level instructions.
3-Ideal-Obvious link to get started, a list of suggested things to do, and detailed instructions.
  1. Examine project on-ramp.
    1. Link to getting started - The website has a Get Involved page with links to ways you can contribute and share your ideas.
    2. Each of the links (Develop, Test, Document, Translate) contains more detailed information about what you can contribute and how.
    3. The Getting Started as a Developer page contains a detailed list of how to get started including a list of introductory issues.
    4. Detailed instructions - The Developer Guide contains instructions and information in many areas including process, architecture, tools, and developer documentation.
    5. Based upon the resources you looked at, how would you rate the approachability of the OpenMRS project?


Mission Critical criteria-Suitability
  1. Appropriate Artifacts - Since evaluation is dependent on class objectives, in this example we'll assume the objective is to learn the process of working in an authentic development environment by contributing bug fixes to OpenMRS.
    1. Opportunities to contribute bug fixes - Examine the issues found at the bottom of the Getting Started as a Developer page. Note that there are two categories of introductory issues. How many are listed in each category? (A scripted way to count open introductory issues is sketched after this list.)
    2. Documentation on how to contribute bug fixes - On the Tickets page there is information on how to create and work on an issue, including links to coding standards and the code submission process. Review this information.
    3. Based upon the number of bugs suitable for students to tackle and information on the process of how to submit bug fixes, how would you rate OpenMRS?
  2. Contributor Support - Does the project provide a high level of guidance to help students as they learn?
    1. Communication Tools - Communication tools are directly available from any of the Wiki Spaces (Documentation, Projects, Resources). The Resources page contains links to OpenMRS Talk and IRC Chat, as well as links to group meetings (under Events), and training opportunities.
    2. Web Presence - Examine the IRC logs. Has there been activity during the last week?
    3. Operating Processes - Links to information about coding standards, the code submission process, and commit privileges can be found on the How-To Submit Code page. The process for making feature requests is available on the Tickets page. Are these processes well documented?
    4. Response to Questions - Review a few of the posts on the OpenMRS discussion platform. Do posts to this forum receive timely and supportive responses?
    5. How would you rate the support that newcomers to OpenMRS receive?
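
To quantify the pool of introductory issues mentioned above, the minimal sketch below asks the JIRA REST API for a count. It assumes the OpenMRS issue tracker is reachable at issues.openmrs.org and that introductory tickets carry an "intro" label; both are assumptions to verify against the Getting Started as a Developer page before trusting the numbers. The third-party requests package is required.

    # Minimal sketch: count unresolved introductory issues via the JIRA REST API.
    # Assumptions: the OpenMRS tracker is at https://issues.openmrs.org and introductory
    # tickets are labeled 'intro'; verify both on the project wiki before relying on this.
    import requests

    JIRA_BASE = "https://issues.openmrs.org"             # assumed tracker location
    JQL = "labels = intro AND resolution = Unresolved"   # assumed label for intro tickets

    def count_issues(jql):
        # maxResults=0 asks JIRA to return only the total, not the issue bodies.
        resp = requests.get(
            f"{JIRA_BASE}/rest/api/2/search",
            params={"jql": jql, "maxResults": 0},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["total"]

    print("Open introductory issues:", count_issues(JQL))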


The completed evaluation template should be uploaded to your blog. While working on your blog post, position the cursor where you would like the link to appear and click 'Add Media'.


Overall evaluation for Mission Critical criteria - If no mission-critical criterion was scored lower than a 2, the project should then be evaluated on the secondary criteria. Otherwise, the project is considered not suitable for student participation.
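
As a concrete illustration of this gating rule, the short sketch below checks a set of hypothetical mission-critical scores; the criterion names follow the sections above, but the values are placeholders, not a recorded OpenMRS evaluation.

    # Illustration of the mission-critical gate: every criterion must score at least a 2.
    # The scores below are hypothetical placeholders, not an actual OpenMRS evaluation.
    mission_critical = {
        "size/scale/complexity": 2,
        "activity": 3,
        "community": 3,
        "approachability": 3,
        "appropriate artifacts": 2,
        "contributor support": 3,
    }

    low = [name for name, score in mission_critical.items() if score < 2]
    if low:
        print("Not suitable for student participation; low scores:", ", ".join(low))
    else:
        print("All mission-critical criteria scored 2 or better; evaluate the secondary criteria.")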


Secondary criteria-Viability - Secondary criteria sections are OPTIONAL for the POSSE workshop assignment
  1. Domain
    1. Does this project require domain knowledge that may be difficult for students to learn? OpenMRS is a medical records system. Students should be able to grasp it well enough to contribute a bug fix, which is the learning objective assumed in this example.
    2. How would you rate the viability of OpenMRS?
  2. Maturity
    1. To have the organizational stability to support student learning, the project should have at least one stable production release. The Platform Release Notes page lists releases.
    2. Does OpenMRS have enough of a stable base to support student learning? How would you rate it?
  3. User Support
    1. The project should have clear instructions for downloading, installing, and using the software. As noted previously, the Getting Started as a Developer page provides detailed information about setting up and using the required tools. In addition, there are detailed instructions related to installation, configuration, system requirements, and troubleshooting, including videos.
    2. Rate the documentation for OpenMRS.
  4. Roadmap
    1. Student learning is best supported by projects that have a roadmap that includes new feature development, a method for users to submit new feature requests, and a process for identifying how new features are prioritized. Feature requests are made through JIRA, the OpenMRS issue tracker. Roadmap planning and the process for prioritizing feature requests are described on the Technical Roadmap Planning page. Here you will find information about the planning process and how to participate in it. The Technical Road Map page identifies features, their current status, and a point of contact, in addition to expected dates of completion.
    2. Based upon the roadmap provided, how would you rate OpenMRS?


Secondary criteria-Approachability - Secondary criteria sections are OPTIONAL for the POSSE workshop assignment
  1. Contribution Types
    1. Does the project offer opportunities for multiple types of contribution, including the type that fits the class? There are multiple projects for testers, tech writers, and developers. These can be seen on the Get Involved page.
    2. Result - May be a 1, 2, or 3 depending on whether the number of bugs suitable for students is sufficient given the class size.
  2. Openness to Contributions
    1. Acceptance of a student contribution to a project provides valuable affirmation to student learning. Determine whether the project accepts student patches. The process for contribution is documented on the Tickets page.
    2. Result - Score a 3 because the contribution process is documented.
  3. Student Friendliness
    1. Do community members moderate the tone of communication? Review of the discussion platform and IRC logs during the evaluation of contributor support showed a positive tone of communication.
    2. Result - Score a 3, no inappropriate or demeaning messages.


Secondary criteria-Suitability - Secondary criteria sections are OPTIONAL for the POSSE workshop assignment
  1. Project Description
    1. Students must be able to understand the purpose of the project. Does the project clearly describe the product? Can students understand the intended uses of the product? - The About page provides an overview of what OpenMRS is, who is behind it, and where it is used, including a downloadable PDF file and a video. These describe the purpose of the project and how it is used around the world to efficiently manage medical records.
    2. Result - Score a 3; how the product is used and the vision for it are well documented and should be understandable by students.
  2. Platform
    1. What software and hardware platform does the FOSS project run on? Development environment can be built on Windows, Linux or Mac OS X completely with FOSS software. (Project development information found here)
    2. Are there resources to support these platforms? - In this example, yes.
    3. Are students familiar with the platforms? - In this example, yes.
    4. Result - Score a 2. The assumption in this example is that students all have newer personal computers, and the ability to set up a development environment on different operating systems makes student resources more widely available. However, there is some risk because machine requirements for setting up the development environment are not provided and some documentation may be out of date.
  3. Development Features - Is the class dependent on specific development features? (Project development information found here)
    1. Programming language - Primarily Java.
    2. Development environment - Can be built on Windows, Linux or Mac OS X completely with FOSS software.
    3. Supporting technologies - The suggested IDE is Eclipse; MySQL is required; Maven and Jetty are optional.
    4. Result - Need to gauge this based on knowledge of the students and the requirements of the class. The assumption here is that students know Java and are familiar with MySQL. While students are not familiar with Maven and Jetty, this may not be necessary for an introductory bug fix, and the community is very supportive, so assistance can be found there. Given there is some risk, score a 2.


Overall evaluation for secondary criteria - Add up your scores to determine the overall score. If the total score is over 20, the project passes. However, criteria that scored a 1 and criteria for which some risk was noted should be reexamined to see whether steps can be taken to mitigate the risk.
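
The arithmetic for this step can be sketched in a few lines; the scores below are hypothetical placeholders, one per secondary criterion above, not a recorded OpenMRS evaluation.

    # Illustration of the secondary-criteria total: ten criteria scored 1-3; the project
    # passes if the sum exceeds 20. The scores are hypothetical placeholders.
    secondary = {
        "domain": 3, "maturity": 3, "user support": 3, "roadmap": 3,
        "contribution types": 2, "openness to contributions": 3, "student friendliness": 3,
        "project description": 3, "platform": 2, "development features": 2,
    }

    total = sum(secondary.values())
    print(f"Secondary total: {total} ->", "passes" if total > 20 else "does not pass")

    # Criteria at the bottom of the scale, or flagged as risky, deserve a second look.
    for name, score in secondary.items():
        if score <= 1:
            print(f"Reexamine '{name}' (scored {score}) for possible risk mitigation.")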


Deliverables

Wiki posting of an evaluation of a project from the list of HFOSS projects.


Assessment

  • How will the activity be graded?
  • How will learning be measured?
  • Include sample assessment questions/rubrics.
Criteria                       | Level 1 (fail) | Level 2 (pass) | Level 3 (good) | Level 4 (exceptional)
The purpose of the project     |                |                |                |
Why the project is open source |                |                |                |

Comments

  • What should the instructor know before using this activity?
  • What are some likely difficulties that an instructor may encounter using this activity?

Variants and Adaptations

POGIL-style combined FOSS Field Trip and Project Evaluation used by Chris Murphy in his FOSS course at UPenn.

Additional Information

ACM BoK
Area & Unit(s)
ACM BoK
Topic(s)
Difficulty
Estimated Time
to Complete

60-90 minutes. This activity can take a significant amount of time. We only expect you to spend 60-90 minutes exploring, so you may not complete the activity within this time. Of course, you are welcome to spend more time if you wish.

Environment /
Materials
Author(s)
Source
License

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License



Suggestions for Open Source Community

  • Suggestions for an open source community member who is working in conjunction with the instructor.