Project Evaluation Activity V1
Project Evaluation
Preparation:
Description: Learners will gain an understanding of the breadth of available FOSS projects. Learners will also gain an understanding of the identifying characteristics of FOSS projects, including patterns of contributions, patterns of commits, programming languages used, and more.
Prerequisite Knowledge: Completion of the Browsing a Forge Activity, or an understanding of SourceForge and Ohloh; understanding of the course in which students will be participating in an HFOSS project.
Estimated Time to Completion: 60-90 minutes
Learning Objectives: Ability to utilize the rubric to identify likely HFOSS projects.
Materials/Environment: SIGCSE paper; access to the Internet/Web and a web browser.
Additional Information: List of projects
Rights: Licensed CC BY-SA
Turn In: Wiki posting of an evaluation of a project from the list of projects
Background:
This activity is intended to give you an overview of what to consider when evaluating a FOSS project for student participation and for you to gain experience using the rubric.
Directions:
Part 1-Learn about the rubric
- Watch the video describing mission-critical criteria
- Watch the video describing secondary criteria
Part 2-Walk through an evaluation of the Mifos project
Evaluate Viability
- Size/Scale/Complexity - An ideal project should be neither overly simple nor overly complex. One heuristic to use is the number of contributors as an indicator of project complexity.
- Go to Ohloh.net, type Mifos into the Search Projects box.
- On the results page click on Mifos to see the Project Summary page.
- Scroll down to the Community area and calculate the average number of contributors over the last 12 months. The average was 9, so it passed the minimum average number of contributors metric of 6 (a worked example of this calculation appears after this list).
- Go to the Mifos web page and choose Tech Overview from the Contributors tab. From examination of the technology stack, the architecture looks modular and further search shows it is documented elsewhere on the site.
- Result-Based on the modular design and meeting the minimum average number of contributors metric, the project is scored a 2 for size/scale/complexity.
- Activity - To support student participation a project should be reasonably active. The number of commits can be used as an indicator of activity.
- Return to the Mifos project summary page in Ohloh. Scroll to the Activity area on the page.
- Compute the 12-month average of commits. The 12-month average was about 108, much higher than the minimum average level of commits recommended for activity.
- Result-Because the number of commits exceeds the favorable level of activity, this project may be a little large/complex. However, it still appears manageable, so the project is scored a 2 for activity.
- Community - A suitable project has an active user community. While it is difficult to quantitatively evaluate the activity of a user community, some indicators include a regular history of project downloads and documentation updates over time, current activity on user mailing lists, and testimonials on the project web site.
- Examine download activity
- Go to Sourceforge.net and enter Mifos into the search box.
- Choose Mifos-Microfinance Open Source from the search results.
- Click on the number of downloads that is listed on the project page.
- Change the date range to give a graph of downloads over the last year.
- Examine user mailing list activity
- Examine the IRC logs
- Result-Downloads appear steady, so the project has a community of users. Developers are responsive on the mailing lists and have a presence on IRC. The project is scored a 3.
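The two 12-month averages used above are simple arithmetic over the monthly figures shown in Ohloh's Community and Activity charts. Below is a minimal sketch of the calculation in Python, using hypothetical monthly counts; substitute the numbers you actually read off the charts for the project you are evaluating.

```python
# Hypothetical monthly figures read off the Ohloh Community and Activity charts;
# substitute the values you actually observe for the project being evaluated.
monthly_contributors = [8, 9, 10, 9, 8, 9, 10, 9, 9, 8, 10, 9]      # last 12 months
monthly_commits = [95, 110, 120, 100, 105, 115, 108, 112, 99, 107, 110, 115]

avg_contributors = sum(monthly_contributors) / len(monthly_contributors)
avg_commits = sum(monthly_commits) / len(monthly_commits)

MIN_AVG_CONTRIBUTORS = 6   # minimum average number of contributors used in the walkthrough

print(f"Average contributors: {avg_contributors:.1f} "
      f"({'passes' if avg_contributors >= MIN_AVG_CONTRIBUTORS else 'fails'} "
      f"the minimum of {MIN_AVG_CONTRIBUTORS})")
print(f"Average commits: {avg_commits:.1f}")
```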
Evaluate Approachability
Here you are evaluating a project's on-ramp to contribution, scoring as follows:
- 1-Insufficient-Few or no pointers on how to become involved.
- 2-Sufficient-Suggestions about how to get involved other than contributing money, with accompanying high-level instructions.
- 3-Ideal-Obvious link to get started, list of suggestions for things to do and detailed instructions.
- Link to get started-There is a Get Started page with links to what Mifos is, how to contribute, community processes, and tools used.
- List of suggestions for things to do - The Volunteer Project page provides a list of ways to contribute including testing, translation, development and documentation. There is also a volunteer bug queue listed as a good way for developers to get started.
- Detailed instructions- On the web site instructions and information are provided in many areas including process, architecture, licensing, product functionality, and developer documentation.
- Result-Scored a 3.
Evaluate Suitability
- Appropriate Artifacts - Since evaluation is dependent on class objectives, in this example we'll assume an objective is to learn the process of working in an authentic development project by contributing bug fixes.
- Opportunities to contribute bug fixes - Examined the volunteer bite-sized bug queue. There were 10 open bugs for new contributors. There were many more listed for more experienced contributors.
- Documentation on how to contribute bug fixes - From the Tech Overview page there are links to details on the code submission process.
- Result - May score a 1, 2 or 3 depending on the number of bugs suitable for students to tackle and class size.
- Contributor Support - Does the project offer a substantial amount of guidance to help students as they learn?
- Are communication tools documented?-Communication tools are documented under the Collaboration and Communication section of the Development Tools page. Instructions on how to access the mailing lists with tips on how to participate are available from the Communications page.
- Do developers have a web presence?-Examination of IRC logs shows scattered activity over the last week.
- Are operating processes documented?-Links to information about coding standards, the code submission process, and the commit privileges process can be found on the Tech Overview page. The process for making feature requests and for prioritizing feature requests is available on the Roadmap page.
- Do questions posed have timely and supportive answers?-Posts to the user and developer mailing lists over the last month received timely and supportive responses.
- Result - There is not a lot of activity on IRC, but the mailing lists show plenty of timely feedback, and the communication methods and operating procedures are well documented; score a 3.
Overall evaluation for mission-critical criteria - Since no mission-critical criterion was scored lower than a 2, the project is next evaluated against the secondary criteria. Otherwise, the project would have been considered not suitable for student participation.
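This gate can be read as a simple rule: every mission-critical criterion must score at least a 2 before the secondary criteria are considered at all. A minimal sketch of the check, using the scores from this walkthrough (the Appropriate Artifacts score, which the walkthrough left as 1, 2, or 3, is assumed to be a 2 here):

```python
# Mission-critical scores from the walkthrough, on the 1-3 scale.
mission_critical = {
    "size/scale/complexity": 2,
    "activity": 2,
    "community": 3,
    "approachability": 3,
    "appropriate artifacts": 2,   # assumed; the walkthrough scored this 1, 2, or 3
    "contributor support": 3,
}

# Proceed to the secondary criteria only if no mission-critical score is below 2.
if all(score >= 2 for score in mission_critical.values()):
    print("All mission-critical criteria scored 2 or higher: evaluate secondary criteria.")
else:
    print("A mission-critical criterion scored below 2: not suitable for student participation.")
```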
Evaluate Viability
- Domain
- Does this project require domain knowledge that may be difficult for students to learn? - Microfinance as a domain should be graspable enough for students to contribute a bug fix, which is the learning objective assumed in this example.
- Result - Score a 2, since the domain isn't as simple to grasp as, say, a desktop application for word processing or compressing files.
- Maturity
- To have the organization needed to support student learning, the project should have at least one stable production release - The Roadmap page lists releases.
- Result - The Download Mifos page says 2.6.0 is the 4th major community-supported release. Scored a 3.
- User Support
- The project should have clear instructions for downloading, installing, and using the project - There is a demo server, a video, and a slide presentation that explain system functionality; these can be found under the Product tab and used to learn about the system. There is also a user manual available. The Download Mifos page provides detailed instructions on installation, configuration, system requirements, and troubleshooting.
- Result - Given the wealth of detailed documentation, score a 3.
- Roadmap
- Student learning is best supported by projects that have a roadmap that includes new feature development, a method for users to submit new feature requests, and a process for identifying how new features are prioritized - The process for making feature requests and for prioritizing feature requests is available on the Roadmap page. The roadmap lists features that were implemented in the last release, but nothing for the next release.
- Result - Scored a 2 because there is no information listed for feature planning in the next release.
Evaluate Approachability
- Contribution Types
- Does the project contain opportunities for multiple types of contribution, including the type that fits the class? - There are multiple projects for testers, tech writers, and developers. These can be seen on the Volunteer Projects page.
- Result - May be a 1, 2 or 3 depending on whether the number of bugs suitable for students is enough given the class size.
- Openness to Contributions
- Acceptance of a student contribution to a project provides valuable affirmation to student learning. Determine whether the project accepts student patches. - The process for contribution is documented on the Tech Overview page.
- Result - Score a 3 because the contribution process is documented.
- Student Friendliness
- Do community members moderate the tone of communication? Review mailing lists and IRC to gauge tone - Review of the user and developer mailing lists and the IRC logs during the evaluation of contributor support showed a positive tone of communication.
- Result - Score a 3, no inappropriate or demeaning messages.
Evaluate Suitability
- Project Description
- Students must be able to understand the purpose of the project. Does the project clearly describe the product? Can students understand the intended uses of the product? - The About tab on the web page has links to the vision for the product and how it is used by microfinance institutions.
- Result - Score a 3; how the product is used and the vision for it are well documented and should be understandable by students.
- Platform
- What software and hardware platform does the FOSS project run on? - Development environment can be built on Windows, Ubuntu or Mac desktop completely with FOSS software. (Project development information found here)
- Are there resources to support these platforms? - In this example, yes.
- Are students familiar with the platforms? - In this example, yes.
- Result - Score a 2. The assumption in this example is that students all have newer personal computers, and the ability to set up a development environment on different operating systems makes more resources available to students. However, there is some risk because machine requirements for setting up the developer environment are not provided and some documentation may be out of date.
- Development Features - Is the class dependent on specific development features? (Project development information found here)
- Programming language - Primarily Java.
- Development environment - Can be built on Windows, Ubuntu, or Mac completely with FOSS software.
- Supporting technologies - The suggested IDE is Eclipse; the build requires Maven, Jetty, and MySQL.
- Result - Need to gauge this based on the knowledge of the students and the requirements of the class. The assumption here is that students know Java and are familiar with MySQL. While students are not familiar with Maven and Jetty, that may not be necessary for an introductory bug fix, and the community is very supportive, so assistance can be found there. Given there is some risk, score a 2.
Overall evaluation for secondary criteria - The total score for the secondary criteria is over 20, so the project passes. However, criteria that scored a 1 and criteria for which some risk was noted should be reexamined to see if steps can be taken to mitigate the risk.
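The secondary-criteria decision reduces to a sum checked against the threshold of 20 mentioned above. A minimal sketch, using the walkthrough's scores and assuming a 2 for Contribution Types, which the walkthrough left as 1, 2, or 3:

```python
# Secondary-criteria scores from the walkthrough, on the 1-3 scale.
secondary = {
    "domain": 2,
    "maturity": 3,
    "user support": 3,
    "roadmap": 2,
    "contribution types": 2,        # assumed; the walkthrough scored this 1, 2, or 3
    "openness to contributions": 3,
    "student friendliness": 3,
    "project description": 3,
    "platform": 2,
    "development features": 2,
}

total = sum(secondary.values())
print(f"Secondary criteria total: {total} -> {'passes' if total > 20 else 'does not pass'}")

# Criteria that scored a 1, or that carried noted risk, deserve a second look
# to see whether the risk can be mitigated.
low_scores = [name for name, score in secondary.items() if score == 1]
print("Criteria to reexamine:", low_scores if low_scores else "none scored a 1")
```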