Friday, November 18, 2016

Planning the next semester

POSSE workshop, Friday, Nov 18: exercise for planning the upcoming semester.

1) First three things I want to accomplish in my course (491a in Spring 2017)

  • Teach GitHub version control and issue tracking (intro and use of tools in lab)
  • Reengineering (and maybe, later, refactoring) requirements and design documents from the roadmap, documentation, and issue tracker (lead; Nanette is interested in this)
  • Contributing to code and/or documentation in a repository (plenty of documentation on this in the foss2serve wiki)
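The GitHub issue-tracking lab could be introduced against GitHub's REST API. A minimal sketch of what students would query; the owner/repository names are placeholders, and the request is only built here (not sent), since the actual project has not been selected yet:

```python
import json
import urllib.request

# Hypothetical course repository -- replace with the project actually selected.
OWNER, REPO = "example-org", "example-project"

# GitHub's REST API lists a repository's issues at this endpoint.
url = f"https://api.github.com/repos/{OWNER}/{REPO}/issues?state=open"
req = urllib.request.Request(url, headers={"Accept": "application/vnd.github+json"})

# In lab, students would send the request and inspect each issue's
# number, title, and labels, e.g.:
#   with urllib.request.urlopen(req) as resp:
#       issues = json.load(resp)
#   for issue in issues:
#       print(issue["number"], issue["title"], [l["name"] for l in issue["labels"]])
print(req.full_url)
```

This keeps the lab focused on reading the tracker's data model (numbers, titles, labels, state) rather than on any particular client tool.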

2) Three things I need to do at POSSE to continue working when I get home
  • Find a sparring partner
  • Select a project
  • Define a plan, milestones, and action items

3) Next three things I need to do when I get home

  • Detailed description of the required student preparation and the activities (as on pogil.org)
  • Email class with detailed preparation plan
  • Set up surveys (reuse an existing standard to contribute to existing data collection, led by Grant Braught)

Friday, November 4, 2016

FOSS@CSULB Course Preparation

FOSS in Courses 2

Background

This activity is an extension of the FOSS in Courses Planning 1 activity. The goal of this activity is to have
1-3 topics and/or learning ideas to bring to the workshop for discussion and feedback.
We do not expect these activities/ideas to be completely fleshed out, but to serve as an initial starting
point to think about.
Here are ways that I can use FOSS activities:
  • Lectures: FOSS examples, FOSS as a hands-on experience
  • In-class activity: introductory POSSE assignments (see below)
  • Homework: solving an introductory issue
  • Stream of related activities: set up GitHub, learn bug tracking, install a FOSS project
  • Project: solving a set of related issues

*Questions for myself*: How do I include the software lifecycle here - or better, in CECS 343/543/542?
Take ideas from Heidie's course on that.
For design, what can I come up with? Documentation assignments in 343 to build understanding.
This was a major take-away for many students in my 343s, because they do little projects in other courses.
For testing?
For quality assurance?
Documentation is easier... there are lots of tasks there.

Revised activities

"Reengineering a Software Requirements Specification"

Description: Reengineering an SRS from a roadmap and issue database, coming up with a set of questions
about what the functionality should be and what is unclear, and then asking the community.

Learning outcomes: How to write a requirements specification, how to analyze an issue database, and
what to ask critically to make sure that an SRS is sufficiently complete.

Pre-requisite knowledge: Content model of a SRS, requirements notation techniques, how to read issue databases
Time required: for instructor prep (1h), for student completion (3h) and elapsed calendar time (2 weeks incl. feedback)
Input required from the HFOSS community: (optional) feedback on specs and list of questions.
Contribution and usefulness: Requirements documentation in case there wasn't any.
Assessment/grading approach: Based on the SRS & list of questions; team activity; (optional) co-rating of the spec by the community (?)
Questions or concerns I have: Does the community do things like that? How do I select a useful format of SRS?
Would 5 specs of the same system be useful?

Stumbling blocks or barriers: None so far.

"Understanding Design activity"

Description (to be written): Understanding the existing design? Not much to be graded, unless I quiz them on it...
Or let them first understand it and then reengineer an SDS? Is one not already in the documentation?
Learning outcomes: How to analyze and assess a design, how to develop or reengineer a design spec.
Pre-requisite knowledge: Content model of a SDS, design notation techniques, how to read documentation
Time required: for instructor prep (1h), for student completion (3h) and elapsed calendar time (2 weeks incl. feedback)
Input required from the HFOSS community: (Optional) feedback on specs and list of questions.
Contribution and usefulness: Design documentation in case there wasn't any.
Assessment/grading approach: Based on the SDS & assessment; team activity (the assessment could be individual); co-rating of the spec by the community (?)
Questions or concerns I have: Does the community do things like that? How do I select a useful format of SDS?
Would 5 specs of the same system be useful?
Stumbling blocks or barriers: None so far.


"Quality Assurance activity"

Description: Give students a checklist of what to look for: code documentation, version control documentation,
structure and modularization, "clean-ness" of interfaces, etc.
Learning outcomes: How to analyze an implementation and a code repository.
Pre-requisite knowledge: Code repositories, some coding skills (enough to read & understand code)
Time required: for instructor prep (1h), for student completion (3h) and elapsed calendar time (1 week)
Input required from the HFOSS community: None.
Contribution and usefulness: None.
Assessment/grading approach: Assessment, individual activity
Questions or concerns I have: None.
Stumbling blocks or barriers: None so far.

Bug tracking

Bug Tracker Activity

Background:

Bug tracking systems are a form of change management and organization used by FOSS projects.
Bug trackers do far more than simply keep track of bugs. They are also used to hold new feature requests,
patches, and some tasks. Bug trackers are also called issue trackers, request trackers, and
ticket systems. Please read the two readings below for a more complete treatment of bug trackers and their use
in FOSS projects.
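Bug trackers can also be read programmatically. Bugzilla installations from version 5 on expose a REST API; a hedged sketch of what a component query and its JSON response look like (the response payload below is an invented sample used only to illustrate the fields Part 1 asks about, not live GNOME data):

```python
import json
from urllib.parse import urlencode

# Bugzilla 5+ serves a REST API under /rest; open bugs for a product can
# be listed with a query like this (base URL shown for GNOME's Bugzilla).
params = urlencode({"product": "orca", "resolution": "---"})
url = "https://bugzilla.gnome.org/rest/bug?" + params

# The REST API returns JSON shaped like {"bugs": [...]}.  This payload is
# a hand-made sample illustrating the columns discussed below (severity,
# priority, status, summary) -- it is not real tracker data.
sample = json.loads("""
{"bugs": [
  {"id": 385900, "severity": "normal", "priority": "Normal",
   "status": "NEW", "summary": "Example summary"}
]}
""")
for bug in sample["bugs"]:
    print(bug["id"], bug["severity"], bug["status"])
```

Students could use a query like this to cross-check what they see in the web interface.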

Directions:

We will begin by looking at a typical Bugzilla instance for a project. We will be using GNOME's Bugzilla instance,
but specifically looking at the bugs for the Accessibility Team.

Part 1 - Bug Reports

  1. Open a browser and go to the GNOME Accessibility Bugs
  2. Define what each of the column names below indicates. Include the range of possible values for 2-7 below.
    Feel free to explore beyond the page to find more information.
    1. ID: unique identifier for each bug
    2. Sev: severity - how bad is it? From blocker ("application unusable") to trivial ("minor cosmetic issue")
    3. Pri: priority - how important is this to resolve? Assigned by developer.
    4. OS: Operating system with which the bug occurred
    5. Product: where the bug is observed.
    6. Status: How far along in the process is it?
      [UNCONFIRMED, NEW, ASSIGNED, REOPENED, NEEDINFO]
    7. Resolution: How the bug was resolved, e.g. FIXED, WONTFIX, DUPLICATE, or WORKSFORME.
    8. Summary: of the issue in one sentence.
  3. Describe how you discovered the definitions and how you found the information above
    (hint: the advanced search shows the options, or the Reports page has a link). Previous knowledge.
  4. Identify the order in which the bugs are initially displayed. According to Status.
  5. What is the meaning of the shading of some bug reports? Shaded ones are less critical or minor
  6. What is the meaning of the colors used when describing a bug (red, gray, black)?
    Red is critical, black normal, grey enhancement
  7. Select a bug that you think that you might be able to fix and look at it more closely - 385900 
    1. Identify when the bug was submitted. 2006-12-14
    2. Identify whether there has been recent discussion about the bug. Not since 2011.
    3. Is the bug current? Yes, seems to be according to status.
    4. Is the bug assigned? To whom? To a role (Panel Maintainer) but not an individual developer.
    5. Describe what you would need to do to fix the bug. Find the document or piece of code where this
      originates - nobody has been able to identify it so far.
  8. Repeat the previous step with a different kind of bug. Did it for a couple more, nothing out of the ordinary.
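The checks in step 7 (submission date, recency of discussion, status, assignee) map onto fields of the single-bug REST endpoint, /rest/bug/<id>. A sketch; the JSON below is a hand-made sample echoing the answers recorded above for bug 385900, not a live response, and the assignee address is invented:

```python
import json
from datetime import datetime

# Sample mirroring the observations above for bug 385900: submitted
# 2006-12-14, no discussion since 2011, assigned to a role rather than
# an individual.  Invented data for illustration only.
sample = json.loads("""
{"bugs": [
  {"id": 385900,
   "creation_time": "2006-12-14T00:00:00Z",
   "last_change_time": "2011-06-01T00:00:00Z",
   "status": "NEW",
   "assigned_to": "panel-maint@gnome.bugs"}
]}
""")
bug = sample["bugs"][0]
created = datetime.strptime(bug["creation_time"], "%Y-%m-%dT%H:%M:%SZ")
print(f"Submitted {created.date()}, status {bug['status']}, assignee {bug['assigned_to']}")
```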

Part 2 - Collective Reports

  1. Click on the “Reports” link on the top of the page.
  2. Click on the "Summary of Bug Activity for the last week".
  3. How many bug reports were opened in the last week? How many were closed? 18 opened, 13 closed.
  4. What was the general trend last week? Were more bugs opened than closed or vice versa? 5 more opened.
  5. Who were the top three bug closers? Why is this important to know? Sebastian, Michael, and Milan.
    They are probably product champions and likely to be able to help finding ways to resolve more of them.
  6. Who were the top three bug reporters? Are these the same as the top three bug closers?
    What is the overlap in these two lists? Vrishab, Mohammed, and Sebastian. Yes, the latter one.
  7. Who are the top three contributors of patches? Philip, Carlos, and Bastien.
  8. Who are the top three reviewers of patches?
    What is the overlap between these lists and the bug closers and bug reporters?
    What is the overlap between patch contributors and patch reviewers?
    Sebastian, Florian, Milan. Only one overlap.
  9. Click on the “Generate Graphical Reports” link.
  10. Plot a bar graph of the severity of bugs by component for Orca:
    1. Select "Severity" for the vertical axis
    2. Select "Component" for the horizontal axis
    3. Select "Bar Graph" for type of graph
    4. Leave the "Multiple Images" as <none>
    5. Scroll down and select Orca from the Product menu.
    6. Click "Generate Report".
  11. What severity class were the majority of the bugs for braille? Normal.
  12. What other reports can you generate? Tabular reports and change over time.
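The bookkeeping in Part 2 (net weekly trend, overlap between the closer, reporter, patcher, and reviewer lists) is plain set arithmetic. A small sketch using the first names noted in the answers above:

```python
# Weekly trend from the answers above: 18 opened, 13 closed.
opened, closed = 18, 13
net = opened - closed  # positive means more bugs were opened than closed
print(net)

# The overlap questions are set intersections.
closers   = {"Sebastian", "Michael", "Milan"}
reporters = {"Vrishab", "Mohammed", "Sebastian"}
patchers  = {"Philip", "Carlos", "Bastien"}
reviewers = {"Sebastian", "Florian", "Milan"}

print(closers & reporters)   # reporters who also close bugs
print(patchers & reviewers)  # patch contributors who also review
```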