Project Evaluation Activity V1
| Title | Project Evaluation Activity V1 |
|---|---|
| Overview | Learners will gain an understanding of the breadth of available FOSS projects. Learners will also gain an understanding of the identifying characteristics of FOSS projects, including patterns of contributions, patterns of commits, programming languages used, and more. |
| Prerequisites | Completion of the Browsing a Forge Activity or an understanding of SourceForge and Ohloh; understanding of the course in which students will be participating in an HFOSS project. |
| Learning Objectives | After successfully completing this activity, the learner should be able to: utilize the rubric to identify likely HFOSS projects. |
| Process Skills Practiced | |
Background
This activity is intended to give you an overview of what to consider when evaluating a FOSS project for student participation and to give you experience using the rubric.
Directions
Part 1-Learn about the rubric
- Watch the video describing mission critical criteria (http://youtu.be/MAGet2D5o2c)
- Watch the video describing secondary criteria (http://youtu.be/e4lnIXjqczU)
Part 2-Walkthrough of an evaluation of the Mifos project
- Mission Critical criteria-Viability
- Size/Scale/Complexity - An ideal project should be neither overly simple nor overly complex. One heuristic to use is the number of contributors as an indicator of project complexity.
- Go to Ohloh.net, type Mifos into the Search Projects box.
- On the results page click on Mifos (above the Mifos logo) to see the Project Summary page.
- Scroll down to the Community area and calculate the average number of contributors in the last 12 months -- note that the graph is interactive. The average was 9 so it passed the minimum average number of contributors metric of 6.
- Go to the Mifos web page (http://www.mifos.org) and choose Tech Overview from the Contributors tab. From examination of the technology stack, the architecture looks modular and further search shows it is documented elsewhere on the site.
- Result: Based on the modular design and meeting the minimum average number of contributors metric, the project is scored a 2 for size/scale/complexity.
- Activity - To support student participation, a project should be reasonably active. The number of commits can be used as an indicator of activity.
- Return to the Mifos project summary page in Ohloh. Scroll to the Activity area on the page.
- Compute the 12-month average of commits. The 12-month average was about 108, much higher than the recommended range of 10 - 30 commits per month.
- Result: Because the number of commits for this project exceeds the favorable level of activity, it may be a little large/complex. However, it still appears manageable, so the project is scored a 2 for activity. (The contributor and commit averages are sketched in the short script after this Viability section.)
- Community - A suitable project has an active user community. While it is difficult to quantitatively evaluate the activity of a user community, some indicators include a regular history of project downloads and documentation updates over time, current activity on user mailing lists, and testimonials on the project web site.
- Examine download activity
- Go to Sourceforge.net and enter Mifos into the search box.
- Choose Mifos-Microfinance Open Source from the search results.
- Click on the number of downloads that is listed on the project page.
- Change the date range to give a graph of downloads over the last year.
- Examine user mailing list activity
- Examine the IRC logs
- Result: Downloads appear steady, so the project has a community of users. Developers are responsive on the mailing list and have a presence on IRC. The project is scored a 3.
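If you prefer to script the arithmetic above rather than read it off the graphs, the following is a minimal sketch of the calculation. The monthly figures are hypothetical placeholders (in practice, read the real numbers from the Community and Activity graphs on the Ohloh project page), and the thresholds are the ones used in this walkthrough: a minimum average of 6 contributors and a recommended 10-30 commits per month.

```python
# Hypothetical monthly figures read off the Ohloh graphs for the last 12 months.
monthly_contributors = [8, 9, 10, 9, 8, 9, 10, 9, 9, 8, 10, 9]
monthly_commits = [95, 110, 120, 100, 105, 115, 98, 112, 108, 107, 111, 115]

avg_contributors = sum(monthly_contributors) / len(monthly_contributors)
avg_commits = sum(monthly_commits) / len(monthly_commits)

# Thresholds described in this walkthrough (assumed; adjust to your rubric).
MIN_AVG_CONTRIBUTORS = 6
COMMITS_PER_MONTH_RANGE = (10, 30)

print(f"Average contributors: {avg_contributors:.1f} (minimum {MIN_AVG_CONTRIBUTORS})")
print(f"Average commits: {avg_commits:.1f} "
      f"(recommended {COMMITS_PER_MONTH_RANGE[0]}-{COMMITS_PER_MONTH_RANGE[1]} per month)")

if avg_contributors < MIN_AVG_CONTRIBUTORS:
    print("Size/Scale/Complexity: below the contributor minimum")
if avg_commits > COMMITS_PER_MONTH_RANGE[1]:
    print("Activity: above the recommended range, so the project may be large/complex")
elif avg_commits < COMMITS_PER_MONTH_RANGE[0]:
    print("Activity: below the recommended range, so the project may be too quiet")
```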
- Mission Critical criteria-Approachability
- Here you are evaluating a project's on-ramp to contribution, scoring as follows:
- 1-Insufficient-Few or no pointers on how to become involved.
- 2-Sufficient-Suggestions about how to get involved (other than contributing money), with accompanying high-level instructions.
- 3-Ideal-Obvious link to get started, list of suggestions for things to do and detailed instructions.
- Link to get started - There is a Get Started page with links to what Mifos is, how to contribute, community processes, and tools used.
- List of suggestions for things to do - The Volunteer Projects page provides a list of ways to contribute, including testing, translation, development, and documentation. There is also a volunteer bug queue listed as a good way for developers to get started.
- Detailed instructions - On the web site, instructions and information are provided in many areas, including process, architecture, licensing, product functionality, and developer documentation.
- Result - Scored a 3.
- Mission Critical criteria-Suitability
- Appropriate Artifacts - Since evaluation is dependent on class objectives, in this example we'll assume an objective is to learn the process of working in an authentic development project by contributing bug fixes.
- Opportunities to contribute bug fixes - Examined the volunteer bite-sized bug queue. There were 10 open bugs for new contributors. There were many more listed for more experienced contributors.
- Documentation on how to contribute bug fixes - From the Tech Overview page there are links to details on the code submission process.
- Result - May score a 1, 2 or 3 depending on the number of bugs suitable for students to tackle and class size.
- Contributor Support-Does the project have a high volume of guidance to help students as they learn?
- Are communication tools documented? - Communication tools are documented under the Collaboration and Communication section of the Development Tools page. Instructions on how to access the mailing lists, with tips on how to participate, are available from the Communications page.
- Do developers have a web presence? - Examination of IRC logs shows scattered activity over the last week.
- Are operating processes documented? - Links to information about coding standards, the code submission process, and the commit privileges process can be found on the Tech Overview page. The process for making feature requests and for prioritizing feature requests is available on the Roadmap page.
- Do questions posed have timely and supportive answers? - Posts to the user mailing list and developer mailing list over the last month received timely and supportive responses.
- Result - Not a lot of activity on IRC, but the mailing lists show timely feedback, and communication methods and operating procedures are well documented; score a 3.
- Overall evaluation for Mission Critical criteria - Since no mission-critical criteria were scored lower than a 2, the project is then evaluated on the secondary criteria. Otherwise, the project would have been considered not suitable for student participation.
- Secondary criteria-Viability
- Domain
- Does this project require domain knowledge that may be difficult for students to learn? - Microfinance as a domain should be graspable by students well enough to contribute a bug fix, which is the learning objective assumed in this example.
- Result - Score a 2, since the domain isn't as simple to grasp as, say, a desktop application for word processing or compressing files.
- Maturity
- To have the organization needed to support student learning, the project should have at least one stable production release - The Roadmap page lists releases.
- Result - The Download Mifos page says 2.6.0 is the 4th major community-supported release. Scored a 3.
- User Support
- The project should have clear instructions for downloading, installing, and using the project - There is a demo server, a video, and a slide presentation that explain system functionality. This information can be found on the pages listed under the Product tab and can be used to learn about the system. There is also a user manual available. On the Download Mifos page, there are detailed instructions on installation, configuration, system requirements, and troubleshooting.
- Result - Given the wealth of detailed documentation, score a 3.
- Roadmap
- Student learning is best supported by projects that have a roadmap that includes new feature development, a method for users to submit new feature requests, and a process for identifying how new features are prioritized - The process for making feature requests and for prioritizing feature requests is available on the Roadmap page. The roadmap lists features that were implemented in the last release, but nothing for the next release.
- Result - Scored a 2 because there is no information listed for feature planning in the next release.
- Secondary criteria-Approachability
- Contribution Types
- Does the project contain opportunities for multiple types of contribution and of the type that fits the class? - There are multiple projects for testers, tech writers, and developers. These can be seen on the Volunteer Projects page.
- Result - May be a 1, 2 or 3 depending on whether the number of bugs suitable for students is enough given the class size.
- Openness to Contributions
- Acceptance of a student contribution to a project provides valuable affirmation to student learning. Determine whether the project accepts student patches. - The process for contribution is documented on the Tech Overview page.
- Result - Score a 3 because the contribution process is documented.
- Student Friendliness
- Do community members moderate the tone of communication? Review mailing lists and IRC to gauge tone - Review of the user mailing list, developer mailing list, and IRC logs during the evaluation of contributor support showed a positive tone in communication.
- Result - Score a 3, no inappropriate or demeaning messages.
- Secondary criteria-Suitability
- Project Description
- Students must be able to understand the purpose of the project. Does the project clearly describe the product? Can students understand the intended uses of the product? - The About tab on the web page has links to the vision for the product and how it is used by microfinance institutions.
- Result - Score a 3; how the product is used and the vision for it are well documented and should be understandable by students.
- Platform
- What software and hardware platform does the FOSS project run on? - The development environment can be built on a Windows, Ubuntu, or Mac desktop entirely with FOSS software. (Project development information found here)
- Are there resources to support these platforms? - In this example, yes.
- Are students familiar with the platforms? - In this example, yes.
- Result - Score a 2. The assumption in this example is that students all have newer personal computers, and the ability to set up a development environment on different operating systems increases the availability of student resources. However, there is some risk because machine requirements for setting up the developer environment are not provided and some documentation may be out of date.
- Development Features - Is the class dependent on specific development features? (Project development information found here)
- Programming language - Primarily Java.
- Development environment - Can be built on Windows, Ubuntu, or Mac entirely with FOSS software.
- Supporting technologies - The suggested IDE is Eclipse; the build requires Maven, Jetty, and MySQL.
- Result - This needs to be gauged against students' knowledge and the requirements of the class. The assumption here is that students know Java and are familiar with MySQL. While students are not familiar with Maven and Jetty, this may not be necessary for an introductory bug fix, and the community is very supportive, so assistance can be found there. Given there is some risk, score a 2.
- Overall evaluation for secondary criteria - The total score for the secondary criteria is over 20, so the project passes. However, low-scoring criteria and criteria for which some risk was noted should be reexamined to see if steps can be taken to mitigate the risk. (These decision rules are sketched in the short script below.)
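The scoring logic used in this walkthrough can be summarized in a short sketch. The rules follow the text above (every mission-critical criterion must score at least a 2, and the secondary criteria must total over 20), but the scores and criterion names below are this example's assumptions, with the "may be 1, 2 or 3" items set to 2 for illustration; adjust them to match the evaluation template you are using.

```python
# Scores assigned to Mifos in this walkthrough (assumed values for the
# "may be 1, 2 or 3" items). Adjust names and scores to your own template.
mission_critical = {
    "Size/Scale/Complexity": 2,
    "Activity": 2,
    "Community": 3,
    "Approachability": 3,
    "Appropriate Artifacts": 2,
    "Contributor Support": 3,
}
secondary = {
    "Domain": 2, "Maturity": 3, "User Support": 3, "Roadmap": 2,
    "Contribution Types": 2, "Openness to Contributions": 3,
    "Student Friendliness": 3, "Project Description": 3,
    "Platform": 2, "Development Features": 2,
}

# Rule 1: every mission-critical criterion must score at least a 2.
failed = [name for name, score in mission_critical.items() if score < 2]
if failed:
    print("Not suitable for student participation; failed:", ", ".join(failed))
else:
    # Rule 2: the secondary criteria must total over 20.
    total = sum(secondary.values())
    print(f"Secondary total: {total} (needs to be over 20)")
    print("Project passes" if total > 20 else "Project does not pass")
    # Low-scoring criteria deserve a second look even when the total passes.
    for name, score in secondary.items():
        if score == 1:
            print(f"Reexamine: {name} scored a 1")
```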
Part 3-Evaluate a project
- Choose a project to evaluate from the project list.
- Use the blank evaluation template to help record your results.
- Upload your evaluation.
- Add a link on your user page to the evaluation you uploaded.
Deliverables
Wiki posting of evaluation of a project from the list of HFOSS projects.
Assessment:
- How will the activity be graded?
- How will learning be measured?
- Include sample assessment questions/rubrics.
| Criteria | Level 1 (fail) | Level 2 (pass) | Level 3 (good) | Level 4 (exceptional) |
|---|---|---|---|---|
| The purpose of the project | | | | |
| Why the project is open source | | | | |
Comments
- What should the instructor know before using this activity?
- What are some likely difficulties that an instructor may encounter using this activity?
Additional Information:
| ACM BoK Area & Unit(s) | |
|---|---|
| ACM BoK Topic(s) | |
| Difficulty | |
| Estimated Time to Complete | 60-90 minutes |
| Environment / Materials | Access to Internet/Web and web browser, SIGCSE paper on evaluating FOSS projects, Blank evaluation template |
| Author(s) | |
| Source | |
| License | This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License |
Suggestions for Open Source Community:
- Suggestions for an open source community member who is working in conjunction with the instructor.