CS2103/T Software Engineering

A balanced, iterative, and brown-field introduction to Software Engineering...

CS2103/T is an introductory Software Engineering module. It covers roughly a 50-50 balance of the basic SE theory and practice that a student needs to know before going for SE internships in the industry or taking higher-level project modules. The module follows an iterative approach, going progressively deeper into SE by exposing students to a series of increasingly bigger software projects. The module is notable as one of the rare SE modules that takes a brown-field approach to teaching SE.

On the theory side, this module is supported by a customized online textbook Software Engineering for Self-Directed Learners, integrated into this module website.

The practice side of this module is mainly covered by a team project. Students are expected to take over an existing project AddressBook-Level4 (AB4) -- a relatively small yet non-trivial (10 KLoC) generic product -- and enhance it into a better product or evolve it into a different product. To help students tackle the learning curve of working with 10 KLoC of code, the module takes them through a series of projects of increasing size, from AddressBook-Level1 (1 KLoC) to AddressBook-Level3 (4 KLoC).

Given below is a summary of what the module covers and does not cover.

Topic | Covered | Not covered
Java | Used heavily, but not taught | Syntax (reason: expected prerequisite knowledge)
OOP | Used in a non-trivial project, intermediate OOP principles | Basics (reason: expected prerequisite knowledge)
SE tools/practices | Those typically used in a mature, high-rigor SE project | Those specific to start-ups
Modeling | Some UML notations (sufficient to be able to describe SE artifacts using models, such as seen in this Developer Guide of AB4) | Intensive upfront design modeling
Requirements | Some lightweight techniques to gather and document project requirements | Rapid prototyping, heavy UI design, designing a product from scratch
Documentation | Documentation targeting end users (example) as well as documentation targeting developers (example) | Marketing materials
Project Management | Iterative delivery of a product, working collaboratively with team members, on-site as well as remotely | Setting up project infrastructure from scratch
Testing | Basic developer testing and user testing | Testing for non-functional aspects
Application domains | Cross-platform desktop applications | Web programming, Mobile programming, Database programming



Using this Website

The Schedule page is your main source of information for CS2103/T. You will need to refer to it weekly.

More details for the upcoming weeks will be added as the weeks progress. In general, information given for more than 1 week into the future should be treated as tentative.

💡 For those who don't like the nested style used by this website, we have also provided a flat version of the website. You can switch between the two versions using the top navigation bar of the website.

Browser Compatibility

Most of this website will work on most mainstream browsers, but embedded slides are best viewed using Chrome.

Information Layers

This book tries to layer information so that readers can decide to omit less important layers if they wish to.

More important information is in bold or highlighted, while less important information is dimmed or shown in collapsed panels such as the ones below.

Less important info

Less important info

Less important info

Tabs indicate alternative formats of the same content (e.g. video vs text). You can choose the one you like and ignore the other tabs.

    Some textual description of X

 

Video describing X

Dotted underlines indicate tooltips (activated by hovering over them) and dashed underlines indicate modal windows (activated by clicking) containing additional information.

Additional information
Additional information

This website uses a star rating system to indicate the priority level of contents.

Relevant: [Admin Module Expectations → Star Rating System ]

 


Conventions Used

Shorthand Headings

Meaning of some shortened headings:

  • What : the meaning of the concept in concern

  • Why : the motivation behind the concept in concern

  • How : the usage of the concept in concern

  • When : the pros and cons of the concept in concern, when to use the concept

Boxed-Text Styles

additional info, warning, positive message, important message, an error to avoid, tip, definition

Meaning of Icons

  • tangential : tangential info, can be ignored if not interested
  • direct link to the LO (Ctrl+Click to open the LO in a new window/tab)
  • learning outcomes
  • prerequisite learning outcome
  • examples
  • resources
  • exercises
  • printable version
  • preview/more info
  • video
  • >_ : a command to be run in a terminal
  • textual description
  • slides
  • output produced by running code
  • question without answer
  • question with answer
  • tasks to do
  • lecture
  • tutorial
  • evidence you can use to prove you have achieved a learning outcome
  • ⏰ : deadline

Searching for keywords

Use the search box in the top navigation bar to search for keywords in the website pages. If you cannot find the content related to a keyword, let us know by posting in the website issue tracker so that we can add the missing keyword to our search index.

Saving as PDF Files

  1. Use Chrome to load the page you want to save as pdf.

  2. Click on the Print option in Chrome’s menu.

  3. Set the destination to Save as PDF, then click Save to save a copy of the file in PDF format. For best results, use the settings indicated in the screenshot below.

Printing Textbook Content

Printer-friendly versions (indicated by the printable version icon) have been provided for each chapter and for the whole book. You can use them for saving as PDF files or printing.

Making this Website Better

This website was generated using the MarkBind software developed at NUS. We welcome bug reports, suggestions, and contributions, to be submitted at the website issue tracker.



Module Expectations

Prior Knowledge: Java and OOP

This module requires you to write Java code almost every week, starting from the very first week. If your Java skills are shaky, do brush them up.

In particular, you may want to have a look at the newer Java 8 features such as streams, lambdas, and Optionals, which may not have been covered in previous Java modules.
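
As a rough illustration (the class and data below are made up for this example, not module-provided code), here is what those three features look like together:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

// A minimal sketch of the Java 8 features mentioned above: streams, lambdas, and Optional.
public class Java8Demo {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("Alice", "Bob", "Charlie");

        // Stream + lambda: find the first name longer than three characters, if any.
        Optional<String> match = names.stream()
                .filter(name -> name.length() > 3)
                .findFirst();

        // Optional: supply a fallback value instead of risking a NullPointerException.
        System.out.println(match.orElse("no match found"));
    }
}
```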

CS2103 students: This module assumes a reasonable prior knowledge of Java and OOP because most students taking this module have taken two Java modules before. If you are totally new to Java, you may be better off switching to CS2113 (Software Engineering & Object-Oriented Programming) instead.

Workload

Given that 60% of this module is based on CA, it can appear to be heavy. However, you are not expected to spend more time on this module than on its peer modules (e.g., if this module is core for you, it should not take more time than other level 2 core modules in your program).

  • Note that the module contains more things than a typical student can do, in order to provide enough for even the strongest students to learn as much as they wish to.
  • This means it is perfectly OK if you don't have time to learn everything the module offers. Control your workload based on the time you spend on the module in a week, e.g., 1-1.5 days per week.
  • We have provided a star rating system to guide you when prioritizing which things to do.

Star rating system

Start with things that are rated one-star and progress to things with more stars. Things rated four stars are optional.

Star ratings for Learning Outcomes (and textbook sections):

  • One-star LOs : The LOs you need to achieve just to keep up with the module. We recommend you achieve these LOs if you want to pass the module (i.e., up to a C grade).

  • Two-star LOs : Can get you up to a B+.

  • Three-star LOs : Can get you up to an A.

  • Four-star LOs : Can be useful for getting an A+, tutor positions, and entry into downstream SE modules that have competitive entry requirements (e.g., CS3281&2, CS3217, CS3216). Four-star LOs are not examinable. Omitting them will not affect your CAP (as A+ carries the same CAP as an A grade).

  • LOs marked with a prerequisite icon in addition to a star rating are LOs you are expected to have achieved in prerequisite modules. They are given for reference, but are examinable. The number of stars indicates the progression of topics, similar to the star rating system above, i.e., one-star prerequisite LOs are the most basic and the most important; four-star prerequisite LOs can be ignored without affecting CAP.

Star ratings for other things, e.g., admin info sections:

  • The module uses a similar star rating system to indicate the importance of other info in this website, i.e., information rated one-star is the most essential. Info rated four stars is non-essential and can be ignored without affecting your ability to follow the module.


Module Structure


Weekly Schedule

📆 [Friday (previous week)]

Attend the lecture for,

  • a recap of the preceding week's Learning Outcomes (LOs)
  • an introduction to the current week's LOs

Relevant: [Admin Lectures ]

 


📆 [Saturday (previous week) - Tuesday]

  • Use the relevant learning resources to achieve the LOs.
  • Self-test your knowledge using exercises given in the learning resources.
  • If you don't have time to achieve all LOs assigned to the week, use the star rating system to decide which ones to do first.

Relevant: [Admin Learning Outcomes ]

 


📆 [Wednesday - Friday]

Attend the tutorial to,

  • demonstrate evidence of achieving the weekly LOs to the tutor
  • learn from peer demos of their own LO evidence

Relevant: [Admin Tutorials ]

 




Learning Outcomes

This module is organized primarily around a list of Learning Outcomes.

Each week has a suggested list of LOs. They are categorized using a star-rating system.

Relevant: [Admin Module Expectations → Star Rating System ]

 




Lectures

Timing/venue:

Semester | Venue | Time
Semester 1 (Aug-Nov) | ICube Auditorium | 1600-1800
Semester 2 (Jan-April) | ICube Auditorium | 1600-1800

Lectures start on time, sharp, and end around 15 minutes before the official end time.

CS2103T lectures are the same as those for CS2103. Please ignore the CS2101 session scheduled in the same slot; that is a dummy slot used to work around a limitation of the CORS IT system, which doesn't allow lectures of two modules to be scheduled in the same venue at the same time.

Attendance: Attendance for the first lecture is compulsory.

Webcast: All lectures will be webcast. However, some things are not captured well in the webcast recording. You are advised to treat the webcast as a 'backup' to catch up on anything missed during the lecture. Webcast lectures will be available on LumiNUS instead of IVLE (IVLE no longer supports webcasts).

Handouts: There are no handouts. All learning materials are organized around learning outcomes (not lectures or topics), are given in Web format, and can be found in the Textbook section; they are also hyperlinked from the Schedule page.

Slides: Our lecture slides are not suited for printing or for use as a reference during lectures/exams. They are only an aid for lecture delivery. Slides will be uploaded to IVLE after the lecture.



Tutorials #tutor role #times #venue

Tutorial Timetable

Our tutorials start in week 2 (even before CORS tutorial bidding is over), not in week 3 as in other modules. CS2103 (not CS2103T) students need to choose a temporary tutorial slot for the week 2 tutorial. We'll inform you of the procedure to do so in due course.

Our tutorial IDs are different from those in CORS. Format: W09 means Wednesday 0900, and so on.

Module | Tutorial ID (ID in CORS) | Time | Venue | Tutors (contact details)
CS2103 | W10 (T01) | Wed 1000 | COM1-B103 (ALL)* | TBD
CS2103T | W12 (T01) | Wed 1200 | COM1-0210 (SR10) | TBD
CS2103 | W13 (T02) | Wed 1300 | COM1-0210 (SR10) | TBD
CS2103T | W14 (T02) | Wed 1400 | COM1-0210 (SR10) | TBD
CS2103T | W16 (T03) | Wed 1600 | COM1-B103 (ALL) | TBD
CS2103T | W17 (T04) | Wed 1700 | COM1-B103 (ALL) | TBD
CS2103T | T09 (T06) | Thu 0900 | COM1-0210 (SR10) | TBD
CS2103 | T10 (T04) | Thu 1000 | COM1-0210 (SR10) | TBD
CS2103T | T12 (T07) | Thu 1200 | COM1-0210 (SR10) | TBD
CS2103 | T13 (T06) | Thu 1300 | COM1-0210 (SR10) | TBD
CS2103T | T16 (T08) | Thu 1600 | COM1-0210 (SR10) | TBD
CS2103T | F10 (T10) | Fri 1000 | COM1-0210 (SR10) | TBD
CS2103 | F11 (T09) | Fri 1100 | COM1-0210 (SR10) | TBD

*ALL: Active Learning Room

What happens during the tutorial:

  • A tutorial group is handled by two tutors. Each tutor will work with two teams.
  • The tutor will direct students to share/discuss evidence of achieving the weekly learning outcomes (LO).
  • If some students have met with difficulties while achieving an LO, the tutor can direct those students to get help from those who have achieved the LO. The number of LOs that can be covered in the tutorial session depends on how well-prepared you are.
  • The tutor will observe, and give feedback on how well you are achieving required LOs.
  • Please bring your laptop to tutorials. You often need it to show evidence of LOs you have achieved. At other times, we may ask you to work on project-related things with your team members, which may also require the laptop.

Relevant: [Admin Appendix C(FAQ): What if I don't carry around a laptop? ]

 

What if I don’t carry around a laptop?

If you do not have a laptop or prefer not to bring the laptop, it is up to you to show your work to the tutor in some way (e.g. by connecting to your home PC remotely), without requiring extra time/effort from the tutor or team members.

Reason: as you enjoy the benefits of not bringing the laptop, you (not others) should bear the cost too.


The role of our tutors is different from tutors in other modules.

  • No direct tech help: Tutors are prohibited from giving technical help. Rationale: We want you to learn the vital survival skill of troubleshooting technical problems.

Relevant: [Admin Appendix D: How to get Help in CS2103/T ]

 

This guide is mostly about getting tech help, but it applies to getting clarifications on module topics too, e.g., what is the difference between refactoring and rewriting?


We want to move you away from 'hand holding' and make you learn how to solve problems on your own. This is a vital survival skill in the industry and it needs practice.

Whether it is a technical problem (e.g. an error when using the IDE) or a doubt about a concept (e.g. what is the difference between scripted testing and exploratory testing?), the teaching team is happy to work with you while you look for a solution/answer, but we do not do it for you. We discourage unconditional direct help from tutors because we want you to learn to help yourself. Yes, we believe in ‘tough love’ 😝.

The question you should always ask yourself is, 'how do I solve this problem if the lecturer/tutors are not around to help me?'


What not to do:

  • When faced with a technical problem or a doubt about a concept, don't fire off an email to the lecturer/tutor immediately, unless it is something only the lecturer/tutor is supposed to know.

What to do:

  • Check what is given: Check if the problem/concept has been discussed in the lectures, textbook, or the list of resources given to you. Yes, it is easier for you to write an email to the tutor/lecturer instead, but that shouldn't be your default behavior. We know that sometimes it is difficult to find stuff in the resources we have provided. But you should try first.

  • Search: It is very likely the answer already exists somewhere in cyberspace. Almost every programming-related question has been answered in places like stackoverflow. Don't give anyone an opportunity to ask you to STFW.
    Pay attention to the error message you encounter. Sometimes it contains hints as to how to fix the problem. Even if not, a web search on the error message is a good starting point.

  • Ask peers:

    Ask your team members.

    Ask classmates using the module forum or the slack channel. Even if you have figured out one way to solve a problem, discussing it on a public forum might lead you to better ways of solving it, and will help other classmates who are facing similar problems too. If you are too shy to ask questions in the forum, you may use this form to submit your question anonymously, which we will then post in the forum.


    Rubber duck debugging is an informal term used in software engineering to refer to a method of debugging code. The name is a reference to a story in the book The Pragmatic Programmer in which a programmer would carry around a rubber duck and debug his code by forcing himself to explain it, line-by-line, to the duck.

    [for more, see wikipedia entry]

  • Ask the world using programming forums such as stackoverflow.

    Here are some tips for posting a help request:

    • PLEASE search for existing answers before you post your question in those public forums; you don't want to appear as a 'clueless' or 'too lazy to do your research' person in a public forum.

    • Learn to isolate the problem. "My code doesn't work" isn't going to help even if you post the whole code online. Others don't have time to go through all of your code. Isolate the part that doesn't work and strip it down to the bare minimum that is enough to reproduce the error. Sometimes, this process actually helps you figure out the problem yourself. If not, it at least increases the chance of someone else being able to help you.

      💡 How to isolate problematic code? Delete code (one bit at a time) that is confirmed as not related to the problem. Do that until you can still reproduce the problem with the least amount of code remaining.

    • Generalize the problem. "How to write tasks to a text file using Java" is too specific to what you are working on. You are more likely to find help if you post a thread called (or search for) "How to write to a file using Java".

    • Explain well. Conversations via online forums take time. If you post everything that is relevant to your problem, your chances of getting an answer on the first try are higher. If others have to ask you more questions before they can help you, it will take longer. But this doesn't mean you should dump too much information into the thread either.

      💡 Know what these stand for: RTFM, STFW, GIYF

  • Raise your question during a tutorial. Some questions can be discussed with the tutor and tutorial-mates. What kind of questions are suitable to discuss with the tutor? Consider these two questions you might want to ask a tutor:
    • Good This is how I understood/applied coupling. Is that correct? - Such questions are welcome. Reason: this question shows you have put in some effort to learn the topic and are seeking further clarification from the tutor.
    • Bad What is coupling? - Such questions are discouraged. Reason: this question implies you haven't done what you could to learn the topic in concern.
  • Talk to the lecturer before or after the lecture. The lecturer will be at the lecture venue from 30 minutes before the start of the lecture.

  • Request our help: Failing all the above, you can always request help by emailing the lecturer.

Resources


  • No ‘teaching’: Tutors are prohibited from “teaching” concepts that are covered in lectures or other learning resources given to you. Self-learning is a vital part of the module. But of course tutors can help you clarify doubts under the right circumstances.

Relevant: [Admin Appendix D (extract): Questions suitable for tutor ]

 
  • Raise your question during a tutorial. Some questions can be discussed with the tutor and tutorial-mates. What kind of questions are suitable to discuss with the tutor? Consider these two questions you might want to ask a tutor:
    • Good This is how I understood/applied coupling. Is that correct? - Such questions are welcome. Reason: this question shows you have put in some effort to learn the topic and are seeking further clarification from the tutor.
    • Bad What is coupling? - Such questions are discouraged. Reason: this question implies you haven't done what you could to learn the topic in concern.


  • No leading from the front: Tutors are not expected to lead your project effort. They will not tell you how to do project tasks or when to do project tasks. You have to figure those out yourselves. But tutors will give you feedback on how you are doing (or have done) project tasks so that you can improve further.

Timing/venue:

  • Please refer to the Schedule page for further details on each tutorial.
  • You are expected to arrive on time. Punctuality is considered for participation marks.
  • You may leave the class 15 minutes before the hour if you have another class right after; there is no need to wait till the tutor dismisses you. However, as a courtesy, inform the tutor before leaving if you leave before the class is dismissed.
  • Please make sure you vacate the table 5 minutes before the hour so that the next group can start on time.
  • In the past, many students have suggested increasing the tutorial duration because one hour is barely enough to get through all weekly LOs. Increasing the tutorial time is not possible due to a lack of venues and tutors. Instead, let's make the best of the one hour available by coming well prepared and starting on time.

Grading:

Tutorials are not graded. However, your conduct will be reviewed by your team members and the tutor, and this review will determine your participation marks.

[Exchange students only] Registering for tutorials:

  • Exchange students need to use the ORATUT system to register for tutorials. You should have received instructions from the UG office on how/when to go about the registration process; if not, please talk to the UG office. Once we can see your appeal on ORATUT, we can allocate you to the tutorial slot.


Instructors

Dev Team:

This module is supported by a number of software tools developed by past students. Here are some of them whose work was directly relevant to the module for this semester:

  • Aditya Agarwal
  • Chng Zhi Xuan
  • Devamanyu Hazarica
  • Eugene Peh
  • Jia Zhixin
  • Lai Hoang Dung (Louis)
  • Metta Ong
  • Sidhdharth Aravindan
  • Tan Jun An
  • Tan Wang Leng
  • Teng Yong Hao


Textbooks

This module is supported by a customized online textbook Software Engineering for Self-Directed Learners (CS2103 edition), integrated into this module website. While it is in a dynamic Web page format, there is a way to save the main text as pdf files. Printer-friendly versions have been provided too.

Relevant: [Admin Using this Website → Saving as PDF files ]

 




Programming Language

The main language used in this module is Java. You should use Java for all programming activities, the project, and exam answers.

The module doesn’t “teach” Java. We assume you already know Java basics. We expect you to learn on your own any Java constructs not covered in your previous modules.

Java coding standard

This module follows the OSS-NUS Java coding standard.

In the project, you are required to follow the basic and intermediate guidelines (those marked as ⭐️ and ⭐️⭐️). In other programming activities in the module, we recommend (but do not require) you to follow the coding standard.
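
To give a flavor of what following such a standard involves, here is a small illustrative snippet written using common Java conventions (naming, braces, indentation); the class is hypothetical, and the OSS-NUS coding standard linked above remains the authoritative reference for the actual rules.

```java
// Illustrative only: a hypothetical class formatted using common Java conventions
// (UPPER_SNAKE_CASE constants, camelCase members, braces on every block).
// Consult the OSS-NUS Java coding standard for the authoritative rules.
public class ContactCounter {
    private static final int MAX_CONTACTS = 1000;

    private int contactCount;

    public boolean canAddContact() {
        if (contactCount >= MAX_CONTACTS) { // braces even for single-statement blocks
            return false;
        }
        return true;
    }
}
```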



Project


Project: Overview

The high-level learning outcome of the project (and to a large degree, the entire module):

Can contribute production quality SE work to a small/medium software project

Accordingly, the module project is structured to resemble an intermediate stage of a non-trivial real-life software project. In this project you will,

  1. conceptualize and implement enhancements to a given product, and,
  2. have it ready to be continued by future developers.


Project: The product

This semester, we are going to enhance an AddressBook application.

This product is meant for users who can type fast and prefer typing over mouse/voice commands. Therefore, a Command Line Interface (CLI) is the primary mode of input.

Relevant: [Admin Project Constraints → More info about the 'CLI app' requirement ]

 



Project: Scope

project expectations

In general, each team is expected to take one of these two directions:
  • [Direction 1] Optimize AddressBook for a more specific target user group:

    An AddressBook,

    • for users in a specific profession  e.g. doctors, salesmen, teachers, etc.
    • based on the nature/scale of contacts  e.g. huge number of contacts (for HR admins, user group admins), mostly incomplete contacts, highly volatile contact details, contacts become inactive after a specific period (e.g. contract employees)
    • based on what users do with the contacts  e.g. organize group events, share info, do business, do analytics

  • [Direction 2] Morph AddressBook into a different product: Given that AddressBook is a generic app that manages a type of elements (i.e. contacts), you can use it as a starting point to create an app that manages something else.
    This is a high-risk high-reward option because morphing requires extra work but a morphed product may earn more marks than an optimized product of similar complexity.

    An app to manage,

    • Bookmarks of websites
    • Tasks/Schedule
    • Location info
    • Things to memorize, i.e. flash cards, trivia
    • Forum posts, news feeds, Social media feeds
    • Online projects or issue trackers that the user is interested in
    • Emails, possibly from different accounts
    • Multiple types of related things  e.g. Contacts and Tasks (if Tasks are allocated to Contacts)

For either direction, you need to define a target user profile and a value proposition:

  • Target user profile: Define a very specific target user profile.
    💡 We require you to narrow down the target user profile, as opposed to trying to make it as general as possible. Here is an example direction of narrowing down the target user: anybody → teachers → university teachers → tech savvy university teachers → CS2103/T instructors.

    Be careful not to contradict given project constraints when defining the user profile  e.g. the target user should still prefer typing over mouse actions.

    It is expected that your product will be optimized for the chosen target users, i.e., add features that are especially/only applicable to the target users (to make the app especially attractive to them). W.r.t. the example above, there can be features that are applicable to CS2103/T instructors only, such as the ability to navigate to a student's project on GitHub.
    💡 Your project will be graded based on how well the features match the target user profile and how well the features fit together.

    • It is an opportunity to exercise your product design skills, because optimizing the product for a very specific target user requires good product design skills.
    • It minimizes the overlap between features of different teams, which can cause plagiarism issues. Furthermore, the more teams that have the same feature, the less impressive your work becomes, especially if others have done a better job of implementing that feature.

  • Value proposition: Define a clear value proposition (what problem does the product solve? how does it make the user's life easier?) that matches the target user profile.

Individually, each student is expected to,

  1. Contribute one enhancement to the product
    Each enhancement should be stand-alone but,

    • it should be end-user visible and end-user testable.
    • should fit with the rest of the software (and the target user profile),
    • and should have the consent of the team members.
    1. Add a new feature
    2. Enhance an existing feature in a major way e.g. make the command syntax more user friendly and closer to natural language
    3. A major redesign of the GUI e.g. make it work like a chat application (note: chat is a form of CLI)
    4. Integrate with online services e.g. Google contacts, Facebook, GitHub, etc.

    Here are some examples of different major enhancements and the grade a student is likely to earn for the relevant parts of the project.

    In the initial stages of the project you are recommended to add minor enhancements in order to get familiar with the project. These minor enhancements are unlikely to earn marks. You are advised not to spend a lot of effort on minor enhancements.

    Here is a non-exhaustive list of minor enhancements:

    1. Support different themes for the Look & Feel  dark, light, etc.
    2. Support more fields  e.g. Birthday
    3. Load a different page instead of the default Google search page  e.g. Google Maps page or Twitter page
    4. Sort items
    5. Multiple panels  e.g. an additional panel to show recently accessed items
    6. Marking some items as favorites
    7. Ability to search by labels
    8. Ability to specify colors for labels

  2. Recommended: contribute to all aspects of the project: e.g. write backend code, frontend code, test code, user documentation, and developer documentation. If you limit yourself to certain aspects only, you will lose marks allocated for the aspects you did not do.
    In particular, you are required to divide work based on features rather than components:

    • By the end of this project each team member is expected to have implemented an enhancement end-to-end, doing required changes in almost all components.  Reason: to encourage you to learn all components of the software, instead of limiting yourself to just one/few components.
    • Nevertheless, you are still expected to divide the components of the product among team members so that each team member is in charge of one or more components. While others will be modifying those components as necessary for the features they are implementing, your role as the person in charge of a component is to guide others in modifying that component (reason: you are supposed to be the most knowledgeable about it) and to protect it from degrading, e.g., you can review others' changes to your component and suggest possible changes.
  3. Do a share of team-tasks: These are the tasks that someone in the team has to do. Marks allocated to team-tasks will be divided among team members based on how much each member contributed to those tasks.

    Here is a non-exhaustive list of team-tasks:

    1. Necessary general code enhancements e.g.,
      1. Work related to renaming the product
      2. Work related to changing the product icon
      3. Morphing the product into a different product
    2. Setting up GitHub, Travis, AppVeyor, etc.
    3. Maintaining the issue tracker
    4. Release management
    5. Updating user/developer docs that are not specific to a feature  e.g. documenting the target user profile
    6. Incorporating more useful tools/libraries/frameworks into the product or the project workflow (e.g. automate more aspects of the project workflow using a GitHub plugin)

  4. Share roles and responsibilities of the project.

    Roles indicate aspects you are in charge of and responsible for. E.g., if you are in charge of documentation, you are the person who should allocate which parts of the documentation are to be done by whom, ensure the documents are in the right format, ensure consistency, etc.

    This is a non-exhaustive list; you may define additional roles.

    • Team lead: Responsible for overall project coordination.
    • Documentation (short for ‘in charge of documentation’): Responsible for the quality of various project documents.
    • Testing: Ensures the testing of the project is done properly and on time.
    • Code quality: Looks after code quality, ensures adherence to coding standards, etc.
    • Deliverables and deadlines: Ensure project deliverables are done on time and in the right format.
    • Integration: In charge of versioning of the code, maintaining the code repository, integrating various parts of the software to create a whole.
    • Scheduling and tracking: In charge of defining, assigning, and tracking project tasks.
    • [Tool ABC] expert: e.g. Intellij expert, Git expert, etc. Helps other team member with matters related to the specific tool.
    • In charge of [Component XYZ]: e.g. in charge of Model, UI, Storage, etc. If you are in charge of a component, you are expected to know that component well, and review changes done to that component in v1.3-v1.4.

    Please make sure each of the important roles is assigned to one person in the team. It is OK to have a 'backup' for each role, but for each aspect there should be one person who is unequivocally the person responsible for it.

  5. Write ~300-500 LoC, on average.

Relevant: [Admin Project Assessment → Expectation on testing ]

 
  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests (a minimal example is sketched below).
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you do it the normal way, you often find that it is hard to test the functional code because the code has low testability.
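
As a point of reference, a minimal automated test could look like the sketch below (JUnit 4 style is assumed; the class under test is a hypothetical utility included only to keep the example self-contained, so adapt names and test framework version to your actual code base):

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

// A minimal JUnit 4 example. StringUtil here is a hypothetical class under test,
// included inline so the sketch compiles on its own (with JUnit 4 on the classpath).
public class StringUtilTest {

    static class StringUtil {
        static String capitalize(String s) {
            return s.isEmpty() ? s : Character.toUpperCase(s.charAt(0)) + s.substring(1);
        }
    }

    @Test
    public void capitalize_lowerCaseWord_firstLetterCapitalized() {
        assertEquals("Hello", StringUtil.capitalize("hello"));
    }

    @Test
    public void capitalize_emptyString_returnsEmptyString() {
        assertEquals("", StringUtil.capitalize(""));
    }
}
```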

As a team, you are expected to work together to,

  1. Preserve product integrity: i.e.
    1. Enhancements added fit together to form a cohesive product.
    2. Documentation follows a consistent style and presents a cohesive picture to the reader.
    3. Final project demo presents a cohesive picture to the audience.
  2. Maintain product quality: i.e. prevent breaking other parts of the product as it evolves. Note that bugs local to a specific feature will be counted against the author of that feature. However, if a new enhancement breaks the entire product, the whole team will have to share the penalty.
  3. Manage the project smoothly: i.e. ensure workflow, code maintenance, integration, releases, etc. are done smoothly.


Project: Constraints

Your project should comply with the following constraints. Reason: to increase comparability among projects and to maximize applicability of module learning outcomes in the project.

  • Constraint-Morph: The final product should be a result of morphing the given code base, i.e., enhance and/or evolve the given code to arrive at the new software. However, you are allowed to replace all existing code with new code, as long as it is done incrementally, e.g., one feature/component at a time.
    Reason: To ensure your code has a decent quality level from the start.

  • Constraint-Incremental: The product needs to be developed incrementally over the project duration. While it is fine to do less in some weeks and more in other weeks, a reasonably consistent delivery rate is expected. For example, it is not acceptable to do the entire project over the recess week and do almost nothing for the remainder of the semester. Reasons: 1. To simulate a real project where you have to work on a code base over a long period, possibly with breaks in the middle. 2. To learn how to deliver big features in small increments.

  • Constraint-CLI: Command Line Interface is the primary mode of input. The GUI should be used primarily to give visual feedback to the user rather than to collect input. Some minimal use of mouse is OK (e.g. to click the minimize button), but the primary input should be command-driven.
    • Mouse actions should have keyboard alternatives.
    • Typing is preferred over key combinations. Design the app in a way that you can do stuff faster by typing compared to mouse actions or key combinations.
    • One-shot commands are preferred over multi-step commands. If you provide a multi-step command to help new users, you should also provide a one-shot equivalent for regular users.  Reason: We want the user to be able to accomplish tasks faster using CLI than a GUI; having to enter commands part-by-part will slow down the user.
    • While we don't prohibit GUI-only features, such features will be ignored during grading.
  • Constraint-Human-Editable-File: The data should be stored locally and should be in a human-editable text file (a storage sketch is given after this list).
    Reason: To allow advanced users to manipulate the data by editing the data file.

  • Constraint-OO: The software should follow the Object-oriented paradigm.
    Reason: For you to practice using OOP in a non-trivial project.

  • Constraint-No-DBMS: Do not use a DBMS to store data.
    Reason: Using a DBMS to store data will reduce the room to apply OOP techniques to manage data. It is true that most real world systems use a DBMS, but given the small size of this project, we need to optimize it for CS2103/T module learning outcomes; covering DBMS-related LOs will have to be left to database modules or level 3 project modules.

  • Constraint-Platform-Independent: The software should work on the Windows, Linux, and OS-X platforms. Even if you are unable to manually test the app on all three platforms, consciously avoid using OS-dependent libraries and OS-specific features.
    Reason: Peer testers should be able to use any of these platforms.

  • Constraint-No-Installer: The software should work without requiring an installer. Having an optional installer is OK as long as the portable (non-installed) version has all the critical functionality.
    Reason: We do not want to install all your projects on our testing machines when we test them for grading.

  • Constraint-Minimal-Network:

    • The software should not depend on your own remote server. Reason: Anyone should be able to test your app any time, even after the semester is over.
    • It is OK to use a reliable public API, e.g., Google search, but we recommend that you have a fallback mechanism (e.g., the ability to load data from a data file if the network is down). Reason: During the mass peer-testing session, network access can be intermittent due to high load. If your feature cannot be tested due to lack of Internet access, that will have to be counted as a major bug, to be fair to the teams whose apps are being tested and penalized for bugs found.
    • Also be cautioned that automated testing of such features will be harder, and public APIs can block your access if they mistake your automated tests for a bot attack.
  • Constraint-External-Software: The use of third-party frameworks/libraries is allowed but only if they,

    • are free, open-source, and have permissive license terms (e.g., trial versions of libraries that require purchase after N days are not allowed).
    • do not require any installation by the user of your software.
    • do not violate other constraints.

    and their use is subject to prior approval by the teaching team.
    Reason: We will not allow third-party software that can interfere with the learning objectives of the module.

    Please post your request to use a third-party library in the forum before you start using the library. Once a specific library has been approved for one team, other teams may use it without requesting permission again.
    Reason: The whole class should know which external software is used by others so that they can do the same if they wish to.
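
To make the more technical constraints above concrete, here are two minimal, illustrative Java sketches. They are not taken from AB4; all class, method, and file names (OneShotCommandExample, TextFileStorageExample, the myapp/contacts.txt path, fetchFromPublicApi) are hypothetical, and the code only shows the general shape of one possible approach.

The first sketch relates to Constraint-CLI: a one-shot command carries all its arguments in a single typed line, so the user never has to answer a sequence of prompts.

    // Illustrative only: not AB4's actual parser.
    public class OneShotCommandExample {
        public static void main(String[] args) {
            String input = "add n/John Doe p/98765432";   // the full command the user typed
            String[] parts = input.split("\\s+", 2);      // command word + the rest
            String commandWord = parts[0];
            String arguments = parts.length > 1 ? parts[1] : "";
            switch (commandWord) {
            case "add":
                System.out.println("Adding contact with: " + arguments);
                break;
            default:
                System.out.println("Unknown command: " + commandWord);
            }
        }
    }

The second sketch relates to Constraint-Human-Editable-File, Constraint-No-DBMS, Constraint-Platform-Independent, and Constraint-Minimal-Network: data is kept in a plain text file (one contact per line) at a path resolved from the user's home directory, and a network-dependent feature falls back to that local file when the API is unreachable.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.List;

    // Illustrative only: a minimal sketch, not AB4's actual storage mechanism.
    public class TextFileStorageExample {

        // Resolve the data file relative to the user's home directory instead of
        // hard-coding an OS-specific path such as "C:\\data\\contacts.txt".
        private static final Path DATA_FILE =
                Paths.get(System.getProperty("user.home"), "myapp", "contacts.txt");

        public static void save(List<String> contacts) throws IOException {
            Files.createDirectories(DATA_FILE.getParent());
            Files.write(DATA_FILE, contacts);   // plain text: advanced users can edit it directly
        }

        public static List<String> load() throws IOException {
            return Files.readAllLines(DATA_FILE);
        }

        // If a feature depends on a public API, fall back to locally saved data
        // when the network is unavailable (e.g., during the mass peer-testing session).
        public static List<String> loadFromApiOrFile() {
            try {
                return fetchFromPublicApi();    // hypothetical remote fetch
            } catch (IOException e) {
                try {
                    return load();              // fallback: the local human-editable file
                } catch (IOException e2) {
                    return List.of();           // nothing cached either
                }
            }
        }

        private static List<String> fetchFromPublicApi() throws IOException {
            throw new IOException("network unavailable");   // stand-in for a real API call
        }
    }

A simple line-based (or JSON-like) text format keeps the data file readable by humans while still being easy to parse, which is usually sufficient at this project's scale.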



Forming Teams


[Picture: The team that was at the top of early Google]

When to form teams

  • CS2103T: Your team will be formed by the CS2101 side in week 1.
  • CS2103: Your team will be formed in week 3 tutorial.

Team size: The default team size is five.

Team ID: This will be given to you after forming teams. It has the form TUTORIAL_ID-TEAM_NUMBER, e.g., W14-2 means you are in tutorial W14 (i.e., Wed 1400-1500), team 2.

Relevant: [Admin Tutorials → Tutorial IDs ]

 

Our tutorials start in week 2 (even before CORS tutorial bidding is over), not in week 3 as in other modules. CS2103 (not CS2103T) students need to choose a temporary tutorial slot for the week 2 tutorial. We'll inform you of the procedure to do so in due course.

Our tutorial IDs are different from CORS. Format: W09 means Wednesday 0900 and so on.

Module Tutorial ID (ID in CORS) Time Venue Tutors (contact details)
CS2103 W10 (T01) Wed 1000 COM1-B103 (ALL)* TBD
CS2103T W12 (T01) Wed 1200 COM1-0210 (SR10) TBD
CS2103 W13 (T02) Wed 1300 COM1-0210 (SR10) TBD
CS2103T W14 (T02) Wed 1400 COM1-0210 (SR10) TBD
CS2103T W16 (T03) Wed 1600 COM1-B103 (ALL) TBD
CS2103T W17 (T04) Wed 1700 COM1-B103 (ALL) TBD
CS2103T T09 (T06) Thu 0900 COM1-0210 (SR10) TBD
CS2103 T10 (T04) Thu 1000 COM1-0210 (SR10) TBD
CS2103T T12 (T07) Thu 1200 COM1-0210 (SR10) TBD
CS2103 T13 (T06) Thu 1300 COM1-0210 (SR10) TBD
CS2103T T16 (T08) Thu 1600 COM1-0210 (SR10) TBD
CS2103T F10 (T10) Fri 1000 COM1-0210 (SR10) TBD
CS2103 F11 (T09) Fri 1100 COM1-0210 (SR10) TBD

*ALL: Active Learning Room

Team composition

We allow some freedom in choosing team members, subject to these constraints:

  • All team members should be in the same tutorial. Delay forming teams until your place in a tutorial is confirmed. We do not allow changing tutorials to team up with your preferred team mates.

  • Teams of a single nationality are not allowed.  Rationale: to train you to work in multicultural teams. However, we allow same-nationality teams if the only language common to all team members is English, e.g., an all-Singaporean team that includes both Chinese and Malay students.

  • No more than one exchange student per team. Rationale: to increase interaction between exchange students and NUS students.

  • Gender balanced teams are encouraged. While all-male teams may be unavoidable at times (due to high male percentage in the cohort), all-female teams are highly discouraged.

  • Also note that we may modify teams when circumstances call for it. There is no avenue for you to object. Staying with your preferred team is not guaranteed.



Project: Timeline

To expedite your project implementation, you will be given some sample code (AddressBook-Level1 to AddressBook-Level4, shown as AB1 to AB4 in the diagram above). You can use AB1 to AB3 to ramp up your tech skills in preparation for the project. AB4 is to be used as the basis for your project.

AB4 is the version you will use as the starting point for your final project. Some of the work you do in AB1 to AB3 can be ported over to AB4 and can be used to claim credit in the final project.

Given below is the high-level timeline of the project.

Week Stage Activities
3 inception Decide on an overall project direction (user profile, problem addressed, optimize or morph?).
4 mid-v1.0 Decide on requirements (user stories, use cases, non-functional requirements).
5 v1.0 Conceptualize product and document it as a user guide (draft), draft a rough project plan.
6 mid-v1.1 Set up project repo, start moving UG and DG to the repo, attempt to do local-impact changes to the code base.
7 v1.1 Update UG and DG in the repo, attempt to do global-impact changes to the code base.
8 mid-v1.2 Adjust project schedule/rigor as needed, start proper milestone management.
9 v1.2 Move code towards v2.0 in small steps, start documenting design/implementation details in DG.
10 mid-v1.3 Continue to enhance features. Make code RepoSense-compatible. Try doing a proper release.
11 v1.3 Release as a jar file, release updated user guide, peer-test released products, verify code authorship.
12 mid-v1.4 Tweak as per peer-testing results, draft Project Portfolio Page, practice product demo.
13 v1.4 Final tweaks to docs/product, release product, demo product, evaluate peer projects.

More details of each stage are provided elsewhere in this website.



Project: inception [week 3]

Decide on an overall project direction (user profile, problem addressed, optimize or morph?).

It is not too early to set an overall direction for your project.

  • Set up a weekly project meeting time/venue with your team members

    We recommend at least one face-to-face project meeting per week. The project meeting time can be used to discuss project-related matters, but it can also be used as a time for team members to work on project tasks individually (having all members in the same place will facilitate easier collaboration and more peer learning).

  • Play around with AB4

    Download the latest released version (i.e., the jar file) of AB4 from its upstream repo and play around with it to familiarize yourselves with its current features.

  • Decide project direction, target user profile, and problem addressed

    Use your first project meeting to discuss with your team members and decide your project direction, target user profile, and the value proposition of the product, as described in [Admin Project Scope]

 

In general, each team is expected to take one of these two directions:

  • [Direction 1] Optimize AddressBook for a more specific target user group:

    An AddressBook,

    • for users in a specific profession  e.g. doctors, salesmen, teachers, etc.
    • based on the nature/scale of contacts  e.g. huge number of contacts (for HR admins, user group admins), mostly incomplete contacts, highly volatile contact details, contacts become inactive after a specific period (e.g. contract employees)
    • based on what users do with the contacts  e.g. organize group events, share info, do business, do analytics

  • [Direction 2] Morph AddressBook into a different product: Given that AddressBook is a generic app that manages a type of elements (i.e. contacts), you can use it as a starting point to create an app that manages something else.
    This is a high-risk high-reward option because morphing requires extra work but a morphed product may earn more marks than an optimized product of similar complexity.

    An app to manage,

    • Bookmarks of websites
    • Tasks/Schedule
    • Location info
    • Things to memorize, e.g., flash cards, trivia
    • Forum posts, news feeds, Social media feeds
    • Online projects or issue trackers that the user is interested in
    • Emails, possibly from different accounts
    • Multiple types of related things  e.g. Contacts and Tasks (if Tasks are allocated to Contacts)

For either direction, you need to define a target user profile and a value proposition:

  • Target user profile: Define a very specific target user profile.
    💡 We require you to narrow down the target user profile  as opposed to trying to make it as general as possible. Here is an example direction of narrowing down target user: anybody → teachers → university teachers → tech savvy university teachers → CS2103/T instructors.

    Be careful not to contradict given project constraints when defining the user profile  e.g. the target user should still prefer typing over mouse actions.

    It is expected that your product will be optimized for the chosen target users, i.e., add features that are especially/only applicable to target users (to make the app especially attractive to them). With respect to the example above, there can be features that are applicable to CS2103/T instructors only, such as the ability to navigate to a student's project on GitHub.
    💡 Your project will be graded based on how well the features match the target user profile and how well the features fit together.

    • It is an opportunity to exercise your product design skills because optimizing the product for a very specific target user requires good product design skills.
    • It minimizes the overlap between features of different teams, which can cause plagiarism issues. Furthermore, the higher the number of other teams having the same feature, the less impressive your work becomes, especially if others have done a better job of implementing that feature.

  • Value proposition: Define a clear value proposition (what problem does the product solve? how does it make the user's life easier?) that matches the target user profile.


Project: mid-v1.0 [week 4]

Decide on requirements (user stories, use cases, non-functional requirements).

💡 Given below is some guidance on the recommended progress at this point of the project (i.e., at week 4, which is the midway point of the milestone v1.0).

This is a good time to analyze requirements with a view to conceptualizing the next version of the product (i.e. v2.0).

  • Step 1 : Brainstorm user stories

    Get together with your team members and brainstorm for user stories  for the v2.0 of the product. Note that in the module project you will deliver only up to v1.4 but here you should consider up to v2.0 (i.e. beyond the module).

    • It is OK to have more user stories than you can deliver in the project. Aim to create at least 30 user stories. Include all the 'obvious' ones you can think of, but also look for 'non-obvious' ones that you think are likely to be missed by other teams.

    • Refer to [Textbook Specifying Requirements → UserStories → Usage → (section) Tips] for tips on how to use user stories in this task.

    • You can write each user story on a piece of paper (e.g., a yellow sticky note, an index card, or just a piece of paper about the size of a playing card). Alternatively, you can use an online tool (some examples are given in [Textbook Specifying Requirements → UserStories → Usage → (panel) Tool Examples ]).

    • Note that you should not 'evaluate' the value of user stories while doing the above.  Reason: an important aspect of brainstorming is not judging the ideas generated.

 

Requirements → Gathering Requirements →

Brainstorming

Brainstorming: A group activity designed to generate a large number of diverse and creative ideas for the solution of a problem.

In a brainstorming session there are no "bad" ideas. The aim is to generate ideas; not to validate them. Brainstorming encourages you to "think outside the box" and put "crazy" ideas on the table without fear of rejection.

What is the key characteristic of brainstorming?

(b)

 

Requirements → Specifying Requirements → User Stories →

Introduction

User story: User stories are short, simple descriptions of a feature told from the perspective of the person who desires the new capability, usually a user or customer of the system. [Mike Cohn]

A common format for writing user stories is:

User story format: As a {user type/role} I can {function} so that {benefit}

Examples (from a Learning Management System):

  1. As a student, I can download files uploaded by lecturers, so that I can get my own copy of the files
  2. As a lecturer, I can create discussion forums, so that students can discuss things online
  3. As a tutor, I can print attendance sheets, so that I can take attendance during the class

We can write user stories on index cards or sticky notes, and arrange them on walls or tables, to facilitate planning and discussion. Alternatively, we can use software (e.g., GitHub Project Boards, Trello, Google Docs, ...) to manage user stories digitally.

[credit: https://www.flickr.com/photos/jakuza/2682466984/]

[credit: https://www.flickr.com/photos/jakuza/with/2726048607/]

[credit: https://commons.wikimedia.org/wiki/File:User_Story_Map_in_Action.png]

Which of the following statements about user stories is true?

  • a. They are based on stories users tell about similar systems
  • b. They are written from the user/customer perspective
  • c. They are always written in some physical medium such as index cards or sticky notes

Answer: (b)

  • a. Incorrect. Reason: Despite the name, user stories are not related to 'stories' about the software.
  • b. Correct.
  • c. Incorrect. Reason: It is possible to use software to record user stories. When the team members are not co-located, this may be the only option.

Critique the following user story taken from a software project to build an e-commerce website.

As a developer, I want to use Python to implement the software, so that we can reuse existing Python modules.

Refer to the definition of a user story.

User story: User stories are short, simple descriptions of a feature told from the perspective of the person who desires the new capability, usually a user or customer of the system. [Mike Cohn]

This user story is not written from the perspective of the user/customer.

Bill wants you to build a Human Resource Management (HRM) system. He mentions that the system will help employees to view their own leave balance. What are the user stories you can extract from that statement?

Remember to follow the correct format when writing user stories.

User story format: As a {user type/role} I can {function} so that {benefit}

As an employee, I can view my leave balance, so that I can know how many leave days I have left.

Note: the {benefit} part may vary as it is not specifically mentioned in the question.

 
 

You can create issues for each of the user stories and use a GitHub Project Board to sort them into categories.

Example Project Board:

Example Issue to represent a user story:

A video on GitHub Project Boards:


 

Example Google Sheet for recording user stories:


 

Example Trello Board for recording user stories:


 

Given their lightweight nature, user stories are quite handy for recording requirements during early stages of requirements gathering.

💡 Here are some tips for using user stories for early stages of requirement gathering:

  • Define the target user:
    Decide your target user's profile (e.g. a student, office worker, programmer, sales person) and work patterns (e.g. Does he work in groups or alone? Does he share his computer with others?). A clear understanding of the target user will help when deciding the importance of a user story. You can even give this user a name.  e.g. Target user Jean is a university student studying in a non-IT field. She interacts with a lot of people due to her involvement in university clubs/societies. ...
  • Define the problem scope: Decide the exact problem you are going to solve for the target user.  e.g. Help Jean keep track of all her school contacts
  • Don't be too hasty to discard 'unusual' user stories:
    Those might make your product unique and stand out from the rest, at least for the target users.
  • Don't go into too much detail:
    For example, consider this user story: As a user, I want to see a list of tasks that need my attention most at the present time, so that I pay attention to them first.
    When discussing this user story, don't worry about which tasks should be considered as needing my attention most at the present time. Those details can be worked out later.
  • Don't be biased by preconceived product ideas:
    When you are at the stage of identifying user needs, clear your mind of ideas you have about what your end product will look like.
  • Don't discuss implementation details or whether you are actually going to implement it:
    When gathering requirements, your decision is whether the user's need is important enough for you to want to fulfil it. Implementation details can be discussed later. If a user story turns out to be too difficult to implement later, you can always omit it from the implementation plan.

💡 Recommended: You can use the GitHub issue tracker to manage user stories, but for that you need to set up your team's GitHub organization, project fork, and issue tracker first. Instructions for those steps are in the panel below.

Organization setup

Please follow the organization/repo name format precisely; we use scripts to download your code, and they will not be able to detect your work if the names are different.

After receiving your team ID, one team member should do the following steps:

  • Create a GitHub organization with the following details:
    • Organization name : CS2103-AY1819S1-TEAM_ID. e.g.  CS2103-AY1819S1-W12-1
    • Plan:  Open Source ($0/month)
  • Add members to the organization:
    • Create a team called developers in your organization.
    • Add your team members to the developers team.

Repo setup

Only one team member:

  1. Fork Address Book Level 4 to your team org.
  2. Rename the forked repo as main. This repo (let's call it the team repo) is to be used as the repo for your project.
  3. Ensure the issue tracker of your team repo is enabled. Reason: our bots will be posting your weekly progress reports on the issue tracker of your team repo.
  4. Ensure your team members have the desired level of access to your team repo.
  5. Enable Travis CI for the team repo.
  6. Set up auto-publishing of docs. When set up correctly, your project website should be available via the URL https://nus-cs2103-ay1819s1-{team-id}.github.io/main e.g., https://cs2103-ay1819s1-w13-1.github.io/main/. This also requires you to enable the GitHub Pages feature of your team repo and configure it to serve the website from the gh-pages branch.
  7. Create a team PR for us to track your project progress: i.e., create a PR from your team repo's master branch to the [nus-cs2103-AY1819S1/addressbook-level4] master branch. PR name: [Team ID] Product Name, e.g., [T09-2] Contact List Pro.  As you merge code to your team repo's master branch, this PR will auto-update to reflect how much your team's product has progressed. In the PR description, @mention the other team members so that they get notified when the tutor adds comments to the PR.

All team members:

  1. Watch the main repo (created above), i.e., go to the repo and click on the watch button to subscribe to the activities of the repo.
  2. Fork the main repo to your personal GitHub account.
  3. Clone the fork to your computer.
  4. Recommended: Set it up as an IntelliJ project (follow the instructions in the Developer Guide carefully).
  5. Set up the developer environment on your computer. You are recommended to use JDK 9 for AB-4 as some of the libraries used in AB-4 have not been updated to support Java 10 yet. JDK 9 can be downloaded from the Java Archive.

Note that some of our download scripts depend on the following folder paths. Please do not alter those paths in your project.

  • /src/main
  • /src/test
  • /docs

Issue tracker setup

We recommend you configure the issue tracker of the main repo as follows:

  • Delete existing labels and add the following labels.
    💡 Issue type labels are useful from the beginning of the project. The other labels are needed only when you start implementing the features.

Issue type labels:

  • type.Epic : A big feature which can be broken down into smaller stories e.g. search
  • type.Story : A user story
  • type.Enhancement: An enhancement to an existing story
  • type.Task : Something that needs to be done, but not a story, bug, or an epic. e.g. Move testing code into a new folder
  • type.Bug : A bug

Status labels:

  • status.Ongoing : The issue is currently being worked on. note: remove this label before closing an issue.

Priority labels:

  • priority.High : Must do
  • priority.Medium : Nice to have
  • priority.Low : Unlikely to do

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • Create the following milestones: v1.0, v1.1, v1.2, v1.3, v1.4

  • You may configure other project settings as you wish. e.g. more labels, more milestones

Project Schedule Tracking

In general, use the issue tracker (Milestones, Issues, PRs, Tags, Releases, and Labels) for assigning, scheduling, and tracking all noteworthy project tasks, including user stories. Update the issue tracker regularly to reflect the current status of the project. You can also use GitHub's Projects feature to manage the project, but keep it linked to the issue tracker as much as you can.

Using Issues:

During the initial stages (latest by the start of v1.2):

  • Record each of the user stories you plan to deliver as an issue in the issue tracker. e.g. Title: As a user I can add a deadline
    Description: ... so that I can keep track of my deadlines

  • Assign the type.* and priority.* labels to those issues.

  • Formalize the project plan by assigning relevant issues to the corresponding milestone.

From milestone v1.2:

  • Define project tasks as issues. When you start implementing a user story (or a feature), break it down into smaller tasks if necessary. Define reasonably sized, standalone tasks. Create issues for each of those tasks so that they can be tracked. e.g.,

    • A typical task should be doable by one person in a few hours.

      • Bad (reasons: not a one-person task, not small enough): Write the Developer Guide
      • Good: Update class diagram in the Developer Guide for v1.4
    • There is no need to break things into VERY small tasks. Keep them as big as possible, but they should be no bigger than what you are going to assign a single person to do within a week. e.g.,

      • Bad: Implementing the parser (reason: too big).
      • Good: Implementing parser support for adding floating tasks
    • Do not track things taken for granted. e.g., push code to repo should not be a task to track. In the example given under the previous point, it is taken for granted that the owner will also (a) test the code and (b) push to the repo when it is ready. Those two need not be tracked as separate tasks.

    • Write a descriptive title for the issue. e.g. Add support for the 'undo' command to the parser

      • Omit redundant details. In some cases, the issue title is enough to describe the task. In that case, no need to repeat it in the issue description. There is no need for well-crafted and detailed descriptions for tasks. A minimal description is enough. Similarly, labels such as priority can be omitted if you think they don't help you.

  • Assign tasks (i.e., issues) to the corresponding team members using the assignees field. Normally, there should be some ongoing tasks and some pending tasks against each team member at any point.

  • Optionally, you can use the status.Ongoing label to indicate issues that are currently being worked on.

Using Milestones:

We recommend you do proper milestone management starting from v1.2. Given below are the conditions to satisfy for a milestone to be considered properly managed:

Planning a Milestone:

  • Issues assigned to the milestone, team members assigned to issues: Use GitHub milestones to indicate which issues are to be handled for which milestone by assigning issues to suitable milestones. Also make sure those issues are assigned to team members. Note that you can change the milestone plan along the way as necessary.

  • Deadline set for the milestones (in the GitHub milestone). Your internal milestones can be set earlier than the deadlines we have set, to give you a buffer.

Wrapping up a Milestone:

  • A working product tagged with the correct tag (e.g. v1.2) and is pushed to the main repo
    or a product release done on GitHub. A product release is optional for v1.2 but required from v1.3. Click here to see an example release.

  • All tests passing on Travis for the version tagged/released.

  • Milestone updated to match the product i.e. all issues completed and PRs merged for the milestone should be assigned to the milestone. Incomplete issues/PRs should be moved to a future milestone.

  • Milestone closed.

  • If necessary, future milestones are revised based on what you experienced in the current milestone  e.g. if you could not finish all issues assigned to the current milestone, it is a sign that you overestimated how much you can do in a week, which means you might want to reduce the issues assigned to future milestones to match that observation.

  • As a user I can add a task by specifying a task description only, so that I can record tasks that need to be done ‘some day’.
  • As a user I can find upcoming tasks, so that I can decide what needs to be done soon.
  • As a user I can delete a task, so that I can get rid of tasks that I no longer care to track.
  • As a new user I can view more information about a particular command, so that I can learn how to use various commands.
  • As an advanced user I can use shorter versions of a command, so that I can type commands faster.


  • Step 2: Prioritize the user stories

    Suggested workflow:

    • Take one user story at a time and get team member opinions about it.

    • Based on the team consensus, put the story (i.e. the piece of paper) onto one of these three piles:

      • Must-Have : The product will be practically useless to the target user without this feature.
      • Nice-To-Have : The target user can benefit from this user story significantly but you are not certain if you'll have time to implement it.
      • Not-Useful : No significant benefit to the target user, or does not fit into the product vision.
    • If using physical paper to record user stories: After all stories have been put in the above three piles, you can make a record of which stories are in the three piles.

  • Step 3: Document requirements of the product

    Based on your user story categorization in the step above, the given module requirements/constraints for the project, and the current state of the product, select which user stories you are likely to include in v2.0.

    Document the following items using a convenient format (e.g., a GoogleDoc). Do not spend time on formatting the content nicely; reason: these will be ported to the actual Developer Guide in your project repo later.
    💡 Some examples of these can be found in the AB4 Developer Guide.

    • Target user profile, value proposition, and user stories: Update the target user profile and value proposition to match the project direction you have selected. Give a list of the user stories (and update/delete existing ones, if applicable), including priorities. This can include user stories that were considered but will not be included in the final product.
    • Use cases: Give use cases (textual form) for a few representative user stories that need multiple steps to complete. e.g. Adding a tag to a person (assume the user needs to find the person first); an illustrative example is given after this list.
    • Non-functional requirements:
      Note: Many of the project constraints mentioned above are NFRs. You can add more. e.g. performance requirements, usability requirements, scalability requirements, etc.
    • Glossary: Define terms that are worth defining.
    • [Optional] Product survey: Explore a few similar/related products and describe your findings, i.e., pros and cons (from the target user's point of view).
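
    For reference, here is what a textual use case for the 'Adding a tag to a person' example above could look like; the steps and extensions below are purely illustrative, not a prescribed format or required behaviour.

    • System: AddressBook app
    • Actor: User
    • Use Case: Add a tag to a person
      1. User requests to find persons matching a keyword
      2. AddressBook shows the list of matching persons
      3. User requests to add a tag to a specific person in the list
      4. AddressBook adds the tag and confirms the change
      Extensions:
      2a. The list is empty: use case ends.
      3a. The given person index is invalid: AddressBook shows an error message; use case resumes at step 3.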
 

Requirements → Specifying Requirements → Use Cases →

Introduction

Use Case: A description of a set of sequences of actions, including variants, that a system performs to yield an observable result of value to an actor. [📖 The Unified Modeling Language User Guide, 2e, G Booch, J Rumbaugh, and I Jacobson]

Actor: An actor (in a use case) is a role played by a user. An actor can be a human or another system. Actors are not part of the system; they reside outside the system.

A use case describes an interaction between the user and the system for a specific functionality of the system.

  • System: ATM
  • Actor: Customer
  • Use Case: Withdraw cash
    1. User inserts an ATM card
    2. ATM prompts for PIN
    3. User enters PIN
    4. ATM prompts for withdrawal amount
    5. User enters the amount
    6. ATM ejects the ATM card and issues cash
    7. User collects the card and the cash.
  • System: A Learning Management System (LMS)
  • Actor: Student
  • Use Case: Upload file
    1. Student requests to upload file
    2. LMS requests for the file location
    3. Student specifies the file location
    4. LMS uploads the file

UML includes a diagram type called use case diagrams that can illustrate use cases of a system visually, providing a visual ‘table of contents’ of the use cases of a system. In the example below, note how use cases are shown as ovals and user roles relevant to each use case are shown as stick figures connected to the corresponding ovals.

Unified Modeling Language (UML) is a graphical notation to describe various aspects of a software system. UML is the brainchild of three software modeling specialists: James Rumbaugh, Grady Booch, and Ivar Jacobson (also known as the Three Amigos). Each of them had developed his own notation for modeling software systems before they joined forces to create a unified modeling language (hence the term ‘Unified’ in UML). UML is currently the de facto modeling notation used in the software industry.

Use cases capture the functional requirements of a system.

 

Requirements → Requirements →

Non-Functional Requirements

There are two kinds of requirements:

  1. Functional requirements specify what the system should do.
  2. Non-functional requirements specify the constraints under which the system is developed and operated.

Some examples of non-functional requirement categories:

  • Data requirements e.g. size, volatility, persistency, etc.
  • Environment requirements e.g. technical environment in which system would operate or need to be compatible with.
  • Accessibility, Capacity, Compliance with regulations, Documentation, Disaster recovery, Efficiency, Extensibility, Fault tolerance, Interoperability, Maintainability, Privacy, Portability, Quality, Reliability, Response time, Robustness, Scalability, Security, Stability, Testability, and more ...
  • Business/domain rules: e.g. the size of the minefield cannot be smaller than five.
  • Constraints: e.g. the system should be backward compatible with data produced by earlier versions of the system; system testers are available only during the last month of the project; the total project cost should not exceed $1.5 million.
  • Technical requirements: e.g. the system should work on both 32-bit and 64-bit environments.
  • Performance requirements: e.g. the system should respond within two seconds.
  • Quality requirements: e.g. the system should be usable by a novice who has never carried out an online purchase.
  • Process requirements: e.g. the project is expected to adhere to a schedule that delivers a feature set every one month.
  • Notes about project scope: e.g. the product is not required to handle the printing of reports.
  • Any other noteworthy points: e.g. the game should not use images deemed offensive to those injured in real mine clearing activities.

We may have to spend extra effort digging out NFRs as early as possible because,

  1. NFRs are easier to miss  e.g., stakeholders tend to think of functional requirements first
  2. sometimes NFRs are critical to the success of the software.  E.g. A web application that is too slow or that has low security is unlikely to succeed even if it has all the right functionality.

Given below are some requirements of TEAMMATES (an online peer evaluation system for education). Which of these are non-functional requirements?

  • a. The response to any user action should become visible within 5 seconds.
  • b. The application admin should be able to view a log of user activities.
  • c. The source code should be open source.
  • d. A course should be able to have up to 2000 students.
  • e. As a student user, I can view details of my team members so that I can know who they are.
  • f. The user interface should be intuitive enough for users who are not IT-savvy.
  • g. The product is offered as a free online service.

(a), (c), (d), (f), (g)

Explanation: (b) and (e) are functions available to specific user types. Therefore, they are functional requirements. (a), (c), (d), (f) and (g) are either constraints on functionality or constraints on how the project is done, both of which are considered non-functional requirements.

 

Requirements → Specifying Requirements → Glossary →

What

Glossary: A glossary serves to ensure that all stakeholders have a common understanding of the noteworthy terms, abbreviations, acronyms, etc.

Here is a partial glossary from a variant of the Snakes and Ladders game:

  • Conditional square: A square that specifies a specific face value which a player has to throw before his/her piece can leave the square.
  • Normal square: A normal square does not have any conditions, snakes, or ladders in it.
 

Requirements → Gathering Requirements →

Product Surveys

Studying existing products can unearth shortcomings of existing solutions that can be addressed by a new product. Product manuals and other forms of technical documentation of an existing system can be a good way to learn about how the existing solutions work.

When developing a game for a mobile device, a look at a similar PC game can give insight into the kind of features and interactions the mobile game can offer.



Project: v1.0 [week 5]

Conceptualize product and document it as a user guide (draft), draft a rough project plan.

v1.0 Summary of Milestone

Here is a summary of items you need to deliver to reach the v1.0 individual and team milestones. See the sections below for more details of each item.

Milestone Minimum acceptable performance to consider as 'reached'
requirements documented a draft of v2.0 requirements in some form
[optional] product survey documented none
v2.0 conceptualized a draft of v2.0 user guide in some form
feature releases planned a rough feature release plan

Reaching individual and team milestones is considered when grading the project management component of your project grade (expand the panel below for more info).

The deadline for reaching a milestone is the midnight before your tutorial e.g., if your tutorial is on Wednesday, you need to reach the milestone by Tuesday midnight.

Relevant: [Admin Project Assessment → Project Management ]

 

A. Process:

Evaluates: How well you did in project-management-related aspects of the project, as an individual and as a team

Based on: Supervisor observations of project milestones and GitHub data.

Milestones need to be reached by the midnight before the tutorial for them to be counted as achieved. To get a good grade for this aspect, achieve at least 60% of the recommended milestone progress.

Other criteria:

  • Good use of GitHub milestones
  • Good use of GitHub release mechanism
  • Good version control, based on the repo
  • Reasonable attempt to use the forking workflow
  • Good task definition, assignment and tracking, based on the issue tracker
  • Good use of buffers (opposite: everything at the last minute)
  • Project done iteratively and incrementally (opposite: doing most of the work in one big burst)

B. Team-based tasks:

Evaluates: how much you contributed to common team-based tasks

Based on: peer evaluations and tutor observations

Relevant: [Admin Project Scope → Examples of team tasks ]

 

Here is a non-exhaustive list of team-tasks:

  1. Necessary general code enhancements e.g.,
    1. Work related to renaming the product
    2. Work related to changing the product icon
    3. Morphing the product into a different product
  2. Setting up GitHub, Travis, AppVeyor, etc.
  3. Maintaining the issue tracker
  4. Release management
  5. Updating user/developer docs that are not specific to a feature  e.g. documenting the target user profile
  6. Incorporating more useful tools/libraries/frameworks into the product or the project workflow (e.g. automate more aspects of the project workflow using a GitHub plugin)

v1.0 Documentation

  • Developer Guide: Have a draft of the requirements of your project, as described in mid-v1.0 progress guide.
 

Decide on requirements (user stories, use cases, non-functional requirements).

💡 Given below are some guidance on the recommended progress at this point of the project (i.e., at week 4, which is the midway point of the milestone v1.0)

This is a good time to analyze requirements with a view to conceptualizing the next version of the product (i.e. v2.0).

  • Step 1 : Brainstorm user stories

    Get together with your team members and brainstorm for user stories  for the v2.0 of the product. Note that in the module project you will deliver only up to v1.4 but here you should consider up to v2.0 (i.e. beyond the module).

    • It is ok to have more user stories than you can deliver in the project. Aim to create at least 30 user stories. Include all 'obvious' ones you can think of but also look for 'non obvious' ones that you think are likely to be missed by other teams.

    • Refer [Textbook Specifying Requirements → UserStories → Usage → (section) Tips] for tips on how to use user stories in this task.

    • You can write each user story in a piece of paper (e.g. yellow sticky note, index card, or just pieces of paper about the size of a playing card). Alternatively you can use an online tool (some examples given in [Textbook Specifying Requirements → UserStories → Usage → (panel) Tool Examples ]).

    • Note that you should not 'evaluate' the value of user stories while doing the above.  Reason: an important aspect of brainstorming is not judging the ideas generated.

 

Requirements → Gathering Requirements →

Brainstorming

Brainstorming: A group activity designed to generate a large number of diverse and creative ideas for the solution of a problem.

In a brainstorming session there are no "bad" ideas. The aim is to generate ideas; not to validate them. Brainstorming encourages you to "think outside the box" and put "crazy" ideas on the table without fear of rejection.

What is the key characteristic about brainstorming?

(b)

 

Requirements → Specifying Requirements → User Stories →

Introduction

User story: User stories are short, simple descriptions of a feature told from the perspective of the person who desires the new capability, usually a user or customer of the system. [Mike Cohn]

A common format for writing user stories is:

User story format: As a {user type/role} I can {function} so that {benefit}

Examples (from a Learning Management System):

  1. As a student, I can download files uploaded by lecturers, so that I can get my own copy of the files
  2. As a lecturer, I can create discussion forums, so that students can discuss things online
  3. As a tutor, I can print attendance sheets, so that I can take attendance during the class

We can write user stories on index cards or sticky notes, and arrange on walls or tables, to facilitate planning and discussion. Alternatively, we can use a software (e.g., GitHub Project Boards, Trello, Google Docs, ...) to manage user stories digitally.

[credit: https://www.flickr.com/photos/jakuza/2682466984/]

[credit: https://www.flickr.com/photos/jakuza/with/2726048607/]

[credit: https://commons.wikimedia.org/wiki/File:User_Story_Map_in_Action.png]

  • a. They are based on stories users tell about similar systems
  • b. They are written from the user/customer perspective
  • c. They are always written in some physical medium such as index cards or sticky notes
  • a. Reason: Despite the name, user stories are not related to 'stories' about the software.
  • b.
  • c. Reason: It is possible to use software to record user stories. When the team members are not co-located this may be the only option.

Critique the following user story taken from a software project to build an e-commerce website.

As a developer, I want to use Python to implement the software, so that we can resue existing Python modules.

Refer to the definition of a user story.

User story: User stories are short, simple descriptions of a feature told from the perspective of the person who desires the new capability, usually a user or customer of the system. [Mike Cohn]

This user story is not written from the perspective of the user/customer.

Bill wants you to build a Human Resource Management (HRM) system. He mentions that the system will help employees to view their own leave balance. What are the user stories you can extract from that statement?

Remember to follow the correct format when writing user stories.

User story format: As a {user type/role} I can {function} so that {benefit}

As an employee, I can view my leave balance, so that I can know how many leave days I have left.

Note: the {benefit} part may vary as it is not specifically mentioned in the question.

 
 

You can create issues for each of the user stories and use a GitHub Project Board to sort them into categories.

Example Project Board:

Example Issue to represent a user story:

A video on GitHub Project Boards:


 

Example Google Sheet for recording user stories:


 

Example Trello Board for recording user stories:


 

Given their lightweight nature, user stories are quite handy for recording requirements during early stages of requirements gathering.

💡 Here are some tips for using user stories for early stages of requirement gathering:

  • Define the target user:
    Decide your target user's profile (e.g. a student, office worker, programmer, sales person) and work patterns (e.g. Does he work in groups or alone? Does he share his computer with others?). A clear understanding of the target user will help when deciding the importance of a user story. You can even give this user a name.  e.g. Target user Jean is a university student studying in a non-IT field. She interacts with a lot of people due to her involvement in university clubs/societies. ...
  • Define the problem scope: Decide that exact problem you are going to solve for the target user.  e.g. Help Jean keep track of all her school contacts
  • Don't be too hasty to discard 'unusual' user stories:
    Those might make your product unique and stand out from the rest, at least for the target users.
  • Don't go into too much details:
    For example, consider this user story: As a user, I want to see a list of tasks that needs my attention most at the present time, so that I pay attention to them first.
    When discussing this user story, don't worry about what tasks should be considered needs my attention most at the present time. Those details can be worked out later.
  • Don't be biased by preconceived product ideas:
    When you are at the stage of identifying user needs, clear your mind of ideas you have about what your end product will look like.
  • Don't discuss implementation details or whether you are actually going to implement it:
    When gathering requirements, your decision is whether the user's need is important enough for you to want to fulfil it. Implementation details can be discussed later. If a user story turns out to be too difficult to implement later, you can always omit it from the implementation plan.

💡 Recommended: You can use GitHub issue tracker to manage user stories, but for that you need to set up your team's GitHub organization, project fork, and issue tracker first. Instructions for doing those steps are in the panel below.

Organization setup

Please follow the organization/repo name format precisely because we use scripts to download your code or else our scripts will not be able to detect your work.

After receiving your team ID, one team member should do the following steps:

  • Create a GitHub organization with the following details:
    • Organization name : CS2103-AY1819S1-TEAM_ID. e.g.  CS2103-AY1819S1-W12-1
    • Plan:  Open Source ($0/month)
  • Add members to the organization:
    • Create a team called developers to your organization.
    • Add your team members to the developers team.

Repo setup

Only one team member:

  1. Fork Address Book Level 4 to your team org.
  2. Rename the forked repo as main. This repo (let's call it the team repo) is to be used as the repo for your project.
  3. Ensure the issue tracker of your team repo is enabled. Reason: our bots will be posting your weekly progress reports on the issue tracker of your team repo.
  4. Ensure your team members have the desired level of access to your team repo.
  5. Enable Travis CI for the team repo.
  6. Set up auto-publishing of docs. When set up correctly, your project website should be available via the URL https://nus-cs2103-ay1819s1-{team-id}.github.io/main e.g., https://cs2103-ay1819s1-w13-1.github.io/main/. This also requires you to enable the GitHub Pages feature of your team repo and configure it to serve the website from the gh-pages branch.
  7. create a team PR for us to track your project progress: i.e., create a PR from your team repo master branch to [nus-cs2103-AY1819S1/addressbook-level4] master branch. PR name: [Team ID] Product Name e.g., [T09-2] Contact List Pro.  As you merge code to your team repo's master branch, this PR will auto-update to reflect how much your team's product has progressed. In the PR description @mention the other team members so that they get notified when the tutor adds comments to the PR.

All team members:

  1. Watchthe main repo (created above) i.e., go to the repo and click on the watch button to subscribe to activities of the repo
  2. Fork the main repo to your personal GitHub account.
  3. Clone the fork to your Computer.
  4. Recommended: Set it up as an Intellij project (follow the instructions in the Developer Guide carefully).
  5. Set up the developer environment in your computer. You are recommended to use JDK 9 for AB-4 as some of the libraries used in AB-4 have not updated to support Java 10 yet. JDK 9 can be downloaded from the Java Archive.

Note that some of our download scripts depend on the following folder paths. Please do not alter those paths in your project.

  • /src/main
  • /src/test
  • /docs

Issue tracker setup

We recommend you configure the issue tracker of the main repo as follows:

  • Delete existing labels and add the following labels.
    💡 Issue type labels are useful from the beginning of the project. The other labels are needed only when you start implementing the features.

Issue type labels:

  • type.Epic : A big feature which can be broken down into smaller stories e.g. search
  • type.Story : A user story
  • type.Enhancement: An enhancement to an existing story
  • type.Task : Something that needs to be done, but not a story, bug, or an epic. e.g. Move testing code into a new folder)
  • type.Bug : A bug

Status labels:

  • status.Ongoing : The issue is currently being worked on. note: remove this label before closing an issue.

Priority labels:

  • priority.High : Must do
  • priority.Medium : Nice to have
  • priority.Low : Unlikely to do

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • Create following milestones : v1.0v1.1v1.2v1.3v1.4,

  • You may configure other project settings as you wish. e.g. more labels, more milestones

Project Schedule Tracking

In general, use the issue tracker (Milestones, Issues, PRs, Tags, Releases, and Labels) for assigning, scheduling, and tracking all noteworthy project tasks, including user stories. Update the issue tracker regularly to reflect the current status of the project. You can also use GitHub's Projects feature to manage the project, but keep it linked to the issue tracker as much as you can.

Using Issues:

During the initial stages (latest by the start of v1.2):

  • Record each of the user stories you plan to deliver as an issue in the issue tracker. e.g. Title: As a user I can add a deadline
    Description: ... so that I can keep track of my deadlines

  • Assign the type.* and priority.* labels to those issues.

  • Formalize the project plan by assigning relevant issues to the corresponding milestone.

From milestone v1.2:

  • Define project tasks as issues. When you start implementing a user story (or a feature), break it down to smaller tasks if necessary. Define reasonable sized, standalone tasks. Create issues for each of those tasks so that they can be tracked.e.g.

    • A typical task should be able to done by one person, in a few hours.

      • Bad (reasons: not a one-person task, not small enough): Write the Developer Guide
      • Good: Update class diagram in the Developer Guide for v1.4
    • There is no need to break things into VERY small tasks. Keep them as big as possible, but they should be no bigger than what you are going to assign a single person to do within a week. eg.,

      • Bad:Implementing parser (reason: too big).
      • Good:Implementing parser support for adding of floating tasks
    • Do not track things taken for granted. e.g., push code to repo should not be a task to track. In the example given under the previous point, it is taken for granted that the owner will also (a) test the code and (b) push to the repo when it is ready. Those two need not be tracked as separate tasks.

    • Write a descriptive title for the issue. e.g. Add support for the 'undo' command to the parser

      • Omit redundant details. In some cases, the issue title is enough to describe the task. In that case, no need to repeat it in the issue description. There is no need for well-crafted and detailed descriptions for tasks. A minimal description is enough. Similarly, labels such as priority can be omitted if you think they don't help you.

  • Assign tasks (i.e., issues) to the corresponding team members using the assignees field. Normally, there should be some ongoing tasks and some pending tasks against each team member at any point.

  • Optionally, you can use status.ongoing label to indicate issues currently ongoing.

Using Milestones:

We recommend you do proper milestone management starting from v1.2. Given below are the conditions to satisfy for a milestone to be considered properly managed:

Planning a Milestone:

  • Issues assigned to the milestone, team members assigned to issues: Used GitHub milestones to indicate which issues are to be handled for which milestone by assigning issues to suitable milestones. Also make sure those issues are assigned to team members. Note that you can change the milestone plan along the way as necessary.

  • Deadline set for the milestones (in the GitHub milestone). Your internal milestones can be set earlier than the deadlines we have set, to give you a buffer.

Wrapping up a Milestone:

  • A working product tagged with the correct tag (e.g. v1.2) and is pushed to the main repo
    or a product release done on GitHub. A product release is optional for v1.2 but required from from v1.3. Click here to see an example release.

  • All tests passing on Travis for the version tagged/released.

  • Milestone updated to match the product i.e. all issues completed and PRs merged for the milestone should be assigned to the milestone. Incomplete issues/PRs should be moved to a future milestone.

  • Milestone closed.

  • If necessary, future milestones are revised based on what you experienced in the current milestone  e.g. if you could not finish all issues assigned to the current milestone, it is a sign that you overestimated how much you can do in a week, which means you might want to reduce the issues assigned to future milestones to match that observation.

As a user I can add a task by specifying a task description only, so that I can record tasks that need to be done ‘some day’. As a user I can find upcoming tasks, so that I can decide what needs to be done soon. As a user I can delete a task, so that I can get rid of tasks that I no longer care to track. As a new user I can view more information about a particular command, so that I can learn how to use various commands. As an advanced user I can use shorter versions of a command, so that type a command faster.


  • Step 2: Prioritize the user stories

    Suggested workflow:

    • Take one user story at a time and get team member opinions about it.

    • Based on the team consensus, put the story (i.e. the piece of paper) onto one of these three piles:

      • Must-Have : The product will be practically useless to the target user without this feature.
      • Nice-To-Have : The target user can benefit from this user story significantly but you are not certain if you'll have time to implement it.
      • Not-Useful : No significant benefit to the target user, or does not fit into the product vision.
    • If using physical paper to record user stories: After all stories have been put in the above three piles, you can make a record of which stories are in the three piles.

  • Step 3: Document requirements of the product

    Based on your user story categorization in the step above, given module requirements/constraints for the project, and the current state of the product, select which user stories you are likely to include in v2.0.

    Document the following items using a convenient format (e.g., a GoogleDoc). Do not spend time on formatting the content nicely; reason: these will be ported to the actual Developer Guide in your project repo later.
    💡 Some examples of these can be found in the AB4 Developer Guide.

    • Target user profile, value proposition, and user stories: Update the target user profile and value proposition to match the project direction you have selected. Give a list of the user stories (and update/delete existing ones, if applicable), including priorities. This can include user stories considered but will not be included in the final product.
    • Use cases: Give use cases (textual form) for a few representative user stories that need multiple steps to complete. e.g. Adding a tag to a person (assume the user needs to find the person first)
    • Non-functional requirements:
      Note: Many of the project constraints mentioned above are NFRs. You can add more. e.g. performance requirements, usability requirements, scalability requirements, etc.
    • Glossary: Define terms that are worth defining.
    • [Optional]Product survey: Explore a few similar/related products and describe your findings i.e. Pros, cons, (from the target user's point of view).
 

Requirements → Specifying Requirements → Use Cases →

Introduction

Use Case: A description of a set of sequences of actions, including variants, that a system performs to yield an observable result of value to an actor.[ 📖 : uml-user-guideThe Unified Modeling Language User Guide, 2e, G Booch, J Rumbaugh, and I Jacobson ]

Actor: An actor (in a use case) is a role played by a user. An actor can be a human or another system. Actors are not part of the system; they reside outside the system.

A use case describes an interaction between the user and the system for a specific functionality of the system.

  • System: ATM
  • Actor: Customer
  • Use Case: Check account balance
    1. User inserts an ATM card
    2. ATM prompts for PIN
    3. User enters PIN
    4. ATM prompts for withdrawal amount
    5. User enters the amount
    6. ATM ejects the ATM card and issues cash
    7. User collects the card and the cash.
  • System: A Learning Management System (LMS)
  • Actor: Student
  • Use Case: Upload file
    1. Student requests to upload file
    2. LMS requests for the file location
    3. Student specifies the file location
    4. LMS uploads the file

UML includes a diagram type called use case diagrams that can illustrate use cases of a system visually , providing a visual ‘table of contents’ of the use cases of a system. In the example below, note how use cases are shown as ovals and user roles relevant to each use case are shown as stick figures connected to the corresponding ovals.

Unified Modeling Language (UML) is a graphical notation to describe various aspects of a software system. UML is the brainchild of three software modeling specialists: James Rumbaugh, Grady Booch, and Ivar Jacobson (also known as the Three Amigos). Each of them had developed their own notation for modeling software systems before joining forces to create a unified modeling language (hence, the term ‘Unified’ in UML). UML is currently the de facto modeling notation used in the software industry.

Use cases capture the functional requirements of a system.

 

Requirements → Requirements →

Non-Functional Requirements

There are two kinds of requirements:

  1. Functional requirements specify what the system should do.
  2. Non-functional requirements specify the constraints under which the system is developed and operated.

Some examples of non-functional requirement categories:

  • Data requirements e.g. size, volatility, persistency etc.,
  • Environment requirements e.g. technical environment in which system would operate or need to be compatible with.
  • Accessibility, Capacity, Compliance with regulations, Documentation, Disaster recovery, Efficiency, Extensibility, Fault tolerance, Interoperability, Maintainability, Privacy, Portability, Quality, Reliability, Response time, Robustness, Scalability, Security, Stability, Testability, and more ...
  • Business/domain rules: e.g. the size of the minefield cannot be smaller than five.
  • Constraints: e.g. the system should be backward compatible with data produced by earlier versions of the system; system testers are available only during the last month of the project; the total project cost should not exceed $1.5 million.
  • Technical requirements: e.g. the system should work on both 32-bit and 64-bit environments.
  • Performance requirements: e.g. the system should respond within two seconds.
  • Quality requirements: e.g. the system should be usable by a novice who has never carried out an online purchase.
  • Process requirements: e.g. the project is expected to adhere to a schedule that delivers a feature set every one month.
  • Notes about project scope: e.g. the product is not required to handle the printing of reports.
  • Any other noteworthy points: e.g. the game should not use images deemed offensive to those injured in real mine clearing activities.

We may have to spend extra effort digging out NFRs as early as possible because:

  1. NFRs are easier to miss  e.g., stakeholders tend to think of functional requirements first
  2. sometimes NFRs are critical to the success of the software.  E.g. A web application that is too slow or that has low security is unlikely to succeed even if it has all the right functionality.

Given below are some requirements of TEAMMATES (an online peer evaluation system for education). Which of these are non-functional requirements?

  • a. The response to any user action should become visible within 5 seconds.
  • b. The application admin should be able to view a log of user activities.
  • c. The source code should be open source.
  • d. A course should be able to have up to 2000 students.
  • e. As a student user, I can view details of my team members so that I can know who they are.
  • f. The user interface should be intuitive enough for users who are not IT-savvy.
  • g. The product is offered as a free online service.

(a)(c)(d)(f)(g)

Explanation: (b) and (e) are functions available to specific user types. Therefore, they are functional requirements. (a), (c), (d), (f) and (g) are either constraints on functionality or constraints on how the project is done, both of which are considered non-functional requirements.

 

Requirements → Specifying Requirements → Glossary →

What

Glossary: A glossary serves to ensure that all stakeholders have a common understanding of the noteworthy terms, abbreviations, acronyms, etc.

Here is a partial glossary from a variant of the Snakes and Ladders game:

  • Conditional square: A square that specifies a specific face value which a player has to throw before his/her piece can leave the square.
  • Normal square: A square that does not have any conditions, snakes, or ladders in it.
 

Requirements → Gathering Requirements →

Product Surveys

Studying existing products can unearth shortcomings of existing solutions that can be addressed by a new product. Product manuals and other forms of technical documentation of an existing system can be a good way to learn about how the existing solutions work.

When developing a game for a mobile device, a look at a similar PC game can give insight into the kind of features and interactions the mobile game can offer.

  • User Guide:
    Draft a user guide in a convenient medium (e.g., a GoogleDoc) to describe what the product would be like when it is at v2.0.

    • We recommend that you follow the existing AB4 User Guide in terms of structure and format.
    • As this is a very rough draft and the final version will be in a different format altogether (i.e., in asciidoc format), don't waste time on formatting, copy editing, etc. It is fine as long as the tutor can get a rough idea of the features from this draft. You can also do just the 'Features' section and omit the other parts.
    • Do try to come up with concrete command syntax for each feature that you would implement (at least for those that you will implement by v1.4).
    • Consider including some UI mock-ups too (they can be hand-drawn or created using a tool such as PowerPoint or Balsamiq).

    💡 It is highly recommended that you divide documentation work (in the User Guide and the Developer Guide) among team members based on the enhancements/features each person would be adding, e.g., if you are the person planning to add a feature X, you should be the person to describe feature X in the User Guide and in the Developer Guide. For features that are not planned to be implemented by v1.4, you can divide them based on who would be implementing them if the project were to continue until v2.0 (hypothetically).

    Reason: In the final project evaluation your documentation skills will be graded based on sections of the User/Developer Guide you have written.

Suggested length: Follow the existing user guide and developer guide in terms of the level of detail.

Submission: Save your draft as a single pdf file, name it {Your Team ID}.pdf e.g., W09-3.pdf and upload to IVLE.

v1.0 Project Management

  • After v2.0 is conceptualized, decide which features each member will do by v1.4. We realize that it will be hard for you to estimate the effort required for each feature as you are not familiar with the code base. Nevertheless, come up with a project plan as per your best estimate; this plan can be revised at later stages. It is better to start with some plan rather than no plan at all. If in doubt, choose to do less rather than more; we don't expect you to deliver a lot of big features.

  • Divide each of those features into three increments, to be released at v1.1, v1.2, v1.3 (v1.4 omitted deliberately as a buffer). Ideally, each increment should deliver an end-user-testable enhancement.

  • Document the above two items somewhere e.g., in a Google doc/sheet. An example is given below:

    * Jake Woo: Profile photo feature
      * v1.1: show a place holder for photo, showing a generic default image
      * v1.2: can specify photo location if it is in local hard disk,
              show photo from local hard disk
      * v1.3: auto-copy the photo to app folder, support using online photo
              as profile pic, stylize photo e.g., round frame
    

Submission : Include in the pdf file you upload to IVLE.



Project: mid-v1.1 [week 6]

Set up project repo, start moving UG and DG to the repo, attempt to do local-impact changes to the code base.

Project Management:

Set up the team org and the team repo as explained below:

Relevant: [Admin Appendix E(extract): Organization setup ]

 

Organization setup

Please follow the organization/repo name format precisely; we use scripts to download your code, and if the names deviate from the format, our scripts will not be able to detect your work.

After receiving your team ID, one team member should do the following steps:

  • Create a GitHub organization with the following details:
    • Organization name : CS2103-AY1819S1-TEAM_ID. e.g.  CS2103-AY1819S1-W12-1
    • Plan:  Open Source ($0/month)
  • Add members to the organization:
    • Create a team called developers in your organization.
    • Add your team members to the developers team.

Relevant: [Admin Appendix E(extract): Repo setup ]

 

Repo setup

Only one team member:

  1. Fork Address Book Level 4 to your team org.
  2. Rename the forked repo as main. This repo (let's call it the team repo) is to be used as the repo for your project.
  3. Ensure the issue tracker of your team repo is enabled. Reason: our bots will be posting your weekly progress reports on the issue tracker of your team repo.
  4. Ensure your team members have the desired level of access to your team repo.
  5. Enable Travis CI for the team repo.
  6. Set up auto-publishing of docs. When set up correctly, your project website should be available via the URL https://cs2103-ay1819s1-{team-id}.github.io/main e.g., https://cs2103-ay1819s1-w13-1.github.io/main/. This also requires you to enable the GitHub Pages feature of your team repo and configure it to serve the website from the gh-pages branch.
  7. Create a team PR for us to track your project progress: i.e., create a PR from your team repo's master branch to the [nus-cs2103-AY1819S1/addressbook-level4] master branch. PR name: [Team ID] Product Name e.g., [T09-2] Contact List Pro. As you merge code to your team repo's master branch, this PR will auto-update to reflect how much your team's product has progressed. In the PR description, @mention the other team members so that they get notified when the tutor adds comments to the PR.

All team members:

  1. Watch the main repo (created above), i.e., go to the repo and click on the Watch button to subscribe to activities of the repo
  2. Fork the main repo to your personal GitHub account.
  3. Clone the fork to your Computer (see the command sketch after this list).
  4. Recommended: Set it up as an Intellij project (follow the instructions in the Developer Guide carefully).
  5. Set up the developer environment in your computer. You are recommended to use JDK 9 for AB-4 as some of the libraries used in AB-4 have not been updated to support Java 10 yet. JDK 9 can be downloaded from the Java Archive.
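A minimal sketch of steps 2-3 above, assuming your fork keeps the repo name main (replace yourGithubUsername with your GitHub username):

    git clone https://github.com/yourGithubUsername/main.git
    cd main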

Note that some of our download scripts depend on the following folder paths. Please do not alter those paths in your project.

  • /src/main
  • /src/test
  • /docs

When updating code in the repo, follow the workflow explained below:

Relevant: [Admin Appendix E(extract): Workflow ]

 

Workflow

Before you do any coding for the project,

  • Ensure you have set the Git username correctly (as explained in Appendix E) in all Computers you use for coding.
  • Read our reuse policy (in Admin: Appendix B), in particular, how to give credit when you reuse code from the Internet or classmates:
 

Setting Git Username to Match GitHub Username

We use various tools to analyze your code. For us to be able to identify your commits, you should use the GitHub username as your Git username as well. If there is a mismatch, or if you use multiple user names for Git, our tools might miss some of your work and as a result you might not get credit for some of your work.

In each Computer you use for coding, after installing Git, you should set the Git username as follows.

  1. Open a command window that can run Git commands (e.g., Git bash window)
  2. Run the command git config --global user.name YOUR_GITHUB_USERNAME
    e.g., git config --global user.name JohnDoe
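  To verify the setting, you can run the same command without a value; Git then prints the currently configured name:

    git config --global user.name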

More info about setting Git username is here.

 

Policy on reuse

Reuse is encouraged. However, note that reuse has its own costs (such as the learning curve, additional complexity, usage restrictions, and unknown bugs). Furthermore, you will not be given credit for work done by others. Rather, you will be given credit for using work done by others.

  • You are allowed to reuse work from your classmates, subject to the following conditions:
    • The work has been published by us or the authors.
    • You clearly give credit to the original author(s).
  • You are allowed to reuse work from external sources, subject to the following conditions:
    • The work comes from a source of 'good standing' (such as an established open source project). This means you cannot reuse code written by an outside 'friend'.
    • You clearly give credit to the original author. Acknowledge use of third party resources clearly e.g. in the welcome message, splash screen (if any) or under the 'about' menu. If you are open about reuse, you are less likely to get into trouble if you unintentionally reused something copyrighted.
    • You do not violate the license under which the work has been released. Please do not use 3rd-party images/audio in your software unless they have been specifically released to be used freely. Just because you found it on the Internet does not mean it is free for reuse.
    • Always get permission from us before you reuse third-party libraries. Please post your 'request to use 3rd party library' in our forum. That way, the whole class gets to see what libraries are being used by others.

Giving credit for reused work

Given below is how to give credit for things you reuse from elsewhere. These requirements are specific to this module, i.e., not applicable outside the module (outside the module, you should follow the rules specified by your employer and the license of the reused work).

If you used a third party library:

  • Mention in the README.adoc (under the Acknowledgements section)
  • Mention in the Project Portfolio Page if the library has significant relevance to the features you implemented

If you reused code snippets found on the Internet (e.g., from StackOverflow answers), or referred to code in another software, or referred to project code by a current/past student:

  • If you read the code to understand the approach and implemented it yourself, mention it as a comment
    Example:
    //Solution below adapted from https://stackoverflow.com/a/16252290
    {Your implementation of the reused solution here ...}
    
  • If you copy-pasted a non-trivial code block (possibly with minor modifications  renaming, layout changes, changes to comments, etc.), also mark the code block as reused code (using @@author tags)
    Format:
    //@@author {yourGithubUsername}-reused
    //{Info about the source...}
    
    {Reused code (possibly with minor modifications) here ...}
    
    //@@author
    
    Example of reusing a code snippet (with minor modifications):
    persons = getList();
    //@@author johndoe-reused
    //Reused from https://stackoverflow.com/a/34646172 with minor modifications
    Collections.sort(persons, new Comparator<CustomData>() {
        @Override
        public int compare(CustomData lhs, CustomData rhs) {
            return lhs.customInt > rhs.customInt ? -1 : (lhs.customInt < rhs.customInt) ? 1 : 0;
        }
    });
    //@@author
    return persons;
    
 

Adding @@author tags to indicate authorship

  • Mark your code with a //@@author {yourGithubUsername}. Note the double @.
    The //@@author tag indicates the beginning of the code you wrote. The code up to the next //@@author tag or the end of the file (whichever comes first) will be considered as written by that author. Here is a sample code file:

    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author sarahkhoo
    method 3 ...
    //@@author johndoe
    method 4 ...
    
  • If you don't know who wrote the code segment below yours, you may put an empty //@@author (i.e. no GitHub username) to indicate the end of the code segment you wrote. The author of code below yours can add the GitHub username to the empty tag later. Here is a sample code with an empty author tag:

    method 0 ...
    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author
    method 3 ...
    method 4 ...
    
  • The author tag syntax varies based on file type e.g. for java, css, fxml. Use the corresponding comment syntax for non-Java files.
    Here is an example code from an xml/fxml file.

    <!-- @@author sereneWong -->
    <textbox>
      <label>...</label>
      <input>...</input>
    </textbox>
    ...
    
  • Do not put the //@@author inside java header comments.
    👎

    /**
      * Returns true if ...
      * @@author johndoe
      */
    

    👍

    //@@author johndoe
    /**
      * Returns true if ...
      */
    

What to and what not to annotate

  • Annotate both functional and test code. There is no need to annotate documentation files.

  • Annotate only significant-size code blocks that can be reviewed on their own, e.g., a class, a sequence of methods, a method.
    Claiming credit for code blocks smaller than a method is discouraged but allowed. If you do, do it sparingly and only claim meaningful blocks of code such as a block of statements, a loop, or an if-else statement.

    • If an enhancement required you to do tiny changes in many places, there is no need to annotate all those tiny changes; you can describe those changes in the Project Portfolio page instead.
    • If a code block was touched by more than one person, either let the person who wrote most of it (e.g. more than 80%) take credit for the entire block, or leave it as 'unclaimed' (i.e., no author tags).
    • Related to the above point, if you claim a code block as your own, more than 80% of the code in that block should have been written by yourself. For example, no more than 20% of it can be code you reused from somewhere.
    • 💡 GitHub has a blame feature and a history feature that can help you determine who wrote a piece of code.
  • Do not try to boost the quantity of your contribution using unethical means such as duplicating the same code in multiple places. In particular, do not copy-paste test cases to create redundant tests. Even repetitive code blocks within test methods should be extracted out as utility methods to reduce code duplication. Individual members are responsible for making sure code attributed to them is correct. If you notice a team member claiming credit for code that he/she did not write or using other questionable tactics, you can email us (after the final submission) to let us know.

  • If you wrote a significant amount of code that was not used in the final product,

    • Create a folder called {project root}/unused
    • Move unused files (or copies of files containing unused code) to that folder
    • Use //@@author {yourGithubUsername}-unused to mark unused code in those files (note the suffix unused) e.g.
    //@@author johndoe-unused
    method 1 ...
    method 2 ...
    

    Please put a comment in the code to explain why it was not used.

  • If you reused code from elsewhere, mark such code as //@@author {yourGithubUsername}-reused (note the suffix reused) e.g.

    //@@author johndoe-reused
    method 1 ...
    method 2 ...
    
  • You can use empty @@author tags to mark code as not yours when RepoSense attributes it to you incorrectly.

    • Code generated by the IDE/framework should not be annotated as your own.

    • Code you modified in minor ways, e.g., adding a parameter: these should not be claimed as yours, but you can mention these additional contributions in the Project Portfolio Page if you want to claim credit for them.

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate your:
      • contributions to the project
      • documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on the Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=github_username_in_lower_case (replace github_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so-called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other teams' products
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to give you more flexibility to show your documentation skills, e.g., you can bring in a proposed feature just to get an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/github_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use Asciidoc's include feature to include sections from the developer guide or the user guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section was written by someone else, e.g., someone else described the feature in the User Guide but you implemented the feature, clearly state that the section was written by someone else (e.g., Start of Extract [from: User Guide] written by Jane Doe). Reason: your writing skills will be evaluated based on the PPP.

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The actual amount of content you need is less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content                                  Page limit
      Overview + Summary of contributions      0.5-1
      Contributions to the User Guide          1-3
      Contributions to the Developer Guide     3-6
      Total                                    5-10

Follow the forking workflow in your project up to v1.1. In particular,

  • Get team members to review PRs. A workflow without PR reviews is a risky workflow.
  • Do not merge PRs failing CI. After setting up Travis, the CI status of a PR is reported at the bottom of the PR page. The screenshot below shows the status of a PR that is passing all CI checks.

    If there is a failure, you can click on the Details link in the corresponding line to find out more about the failure. Once you figure out the cause of the failure, push a fix to the PR.
  • After setting up Netlify, you can use Netlify PR Preview to preview changes to documentation files, if the PR contains updates to documentation. To see the preview, click on the Details link in front of the Netlify status reported (refer to the screenshot above).

After completing v1.1, you can adjust process rigor to suit your team's pace, as explained below.

  • Reduce automated tests: Automated tests have benefits, but they can be a pain to write/maintain; GUI tests are especially hard to maintain because their behavior can sometimes depend on things such as the OS, resolution, etc.
    It is OK to get rid of some of the troublesome tests and rely more on manual testing instead. The fewer automated tests you have, the higher the risk of regressions; but it may be an acceptable trade-off under the circumstances if tests are slowing you down too much.
    There is no direct penalty for removing GUI tests. Also note our expectation on test code.

  • Reduce automated checks: You can also reduce the rigor of checkstyle checks to expedite PR processing.

  • Switch to a lighter workflow: While the forking workflow is the safest, it is also rather heavy. You can switch to a simpler workflow if the forking workflow is slowing you down. Refer to the textbook to find out more about alternative workflows: branching workflow, centralized workflow. However, we still recommend that you use PR reviews, at least for PRs affecting others' features.

You can also increase the rigor/safety of your workflow in the following ways:

  • Use GitHub's Protected Branches feature to protect your master branch against rogue PRs.
 
  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests.
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you do it the normal way, you often find that it is hard to test the functional code because the code has low testability.
 

Project Management → Revision Control →

Forking Flow

In the forking workflow, the 'official' version of the software is kept in a remote repo designated as the 'main repo'. All team members fork the main repo and create pull requests from their forks to the main repo.

To illustrate how the workflow goes, let’s assume Jean wants to fix a bug in the code. Here are the steps:

  1. Jean creates a separate branch in her local repo and fixes the bug in that branch.
  2. Jean pushes the branch to her fork.
  3. Jean creates a pull request from that branch in her fork to the main repo.
  4. Other members review Jean’s pull request.
  5. If reviewers suggested any changes, Jean updates the PR accordingly.
  6. When reviewers are satisfied with the PR, one of the members (usually the team lead or a designated 'maintainer' of the main repo) merges the PR, which brings Jean’s code to the main repo.
  7. Other members, realizing there is new code in the upstream repo, sync their forks with the new upstream repo (i.e. the main repo). This is done by pulling the new code to their own local repo and pushing the updated code to their own fork.
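To make the steps above concrete, here is a minimal sketch of the corresponding Git commands (the branch name fix-sort-order and the team org CS2103-AY1819S1-W12-1 are hypothetical; steps 3-6 happen on the GitHub website):

    # Step 1: create a branch in the local repo and commit the fix in it
    git checkout -b fix-sort-order
    git add .
    git commit -m "Fix sort order of person list"

    # Step 2: push the branch to your own fork (assumed to be the 'origin' remote)
    git push origin fix-sort-order

    # Step 7: sync your fork with the upstream (main) repo after the PR is merged
    git remote add upstream https://github.com/CS2103-AY1819S1-W12-1/main.git   # one-time setup
    git checkout master
    git pull upstream master
    git push origin master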

Documentation:

Recommended procedure for updating docs:

  1. Divide among yourselves who will update which parts of the document(s).
  2. Update the team repo by following the workflow mentioned above.

Update the following pages in your project repo:

  • About Us page: This page is used for module admin purposes. Please follow the format closely or else our scripts will not be able to give credit for your work.
    • Replace info of SE-EDU developers with info of your team, including a suitable photo as described here.
    • Including the name/photo of the supervisor/lecturer is optional.
    • The photo of a team member should be docs/images/github_username_in_lower_case.png e.g. docs/images/damithc.png. If your photo is in jpg format, name the file as .png anyway.
    • Indicate the different roles played and responsibilities held by each team member. You can reassign these roles and responsibilities (as explained in Admin Project Scope) later in the project, if necessary.
 
  • The purpose of the profile photo is for the teaching team to identify you. Therefore, you should choose a recent individual photo showing your face clearly (i.e., not too small) -- somewhat similar to a passport photo. Some examples can be seen in the 'Teaching team' page. Given below are some examples of good and bad profile photos.

  • If you are uncomfortable posting your photo due to security reasons, you can post a lower resolution image so that it is hard for someone to misuse that image for fraudulent purposes. If you are concerned about privacy, you can request permission to omit your photo from the page by writing to prof.

 

Roles indicate aspects you are in charge of and responsible for. E.g., if you are in charge of documentation, you are the person who should allocate which parts of the documentation are to be done by whom, ensure the documents are in the right format, ensure consistency, etc.

This is a non-exhaustive list; you may define additional roles.

  • Team lead: Responsible for overall project coordination.
  • Documentation (short for ‘in charge of documentation’): Responsible for the quality of various project documents.
  • Testing: Ensures the testing of the project is done properly and on time.
  • Code quality: Looks after code quality, ensures adherence to coding standards, etc.
  • Deliverables and deadlines: Ensures project deliverables are done on time and in the right format.
  • Integration: In charge of versioning of the code, maintaining the code repository, integrating various parts of the software to create a whole.
  • Scheduling and tracking: In charge of defining, assigning, and tracking project tasks.
  • [Tool ABC] expert: e.g. Intellij expert, Git expert, etc. Helps other team members with matters related to the specific tool.
  • In charge of [Component XYZ]: e.g. In charge of Model, UI, Storage, etc. If you are in charge of a component, you are expected to know that component well, and review changes done to that component in v1.3-v1.4.

Please make sure each of the important roles is assigned to one person in the team. It is OK to have a 'backup' for each role, but for each aspect there should be one person who is unequivocally the person responsible for it.

  • Contact Us Page: Update to match your product.

  • README.adoc page: Update it to match your project.

    • Add a UI mockup of your intended final product.
      Note that the image of the UI should be docs/images/Ui.png so that it can be downloaded by our scripts. Limit the file to contain one screenshot/mockup only and ensure the new image is roughly the same height x width proportions as the original one. Reason: when we compile these images from all teams into one page (example), yours should not look out of place.

    • The original README.adoc file (which doubles as the landing page of your project website) is written to read like the introduction to an SE learning/teaching resource. You should restructure this page to look like the home page of a real product (not a school project) targeting real users, e.g., remove references to addressbook-level3, Learning Outcomes, etc.; mention target users, add a marketing blurb, etc. On a related note, also remove the Learning Outcomes link and related pages.

    • Update the link of the Travis build status badge (Build Status) so that it reflects the build status of your team repo.
      For the other badges,

      • either set up the respective tool for your project (AB-4 Developer Guide has instructions on how to set up AppVeyor and Coveralls) and update the badges accordingly,
      • or remove the badge.
    • Acknowledge the original source of the code, i.e., the AddressBook-Level4 project created by the SE-EDU initiative at https://github.com/se-edu/

  • User Guide: Start moving the content from your User Guide (draft created in previous weeks) into the User Guide page in your repository. If a feature is not implemented, mark it as 'Coming in v2.0' (example).

  • Developer Guide: Similar to the User Guide, start moving the content from your Developer Guide (draft created in previous weeks) into the Developer Guide page in your team repository.

Product:

  • Each member can attempt to do a local-impact change to the code base.

    Objective: To familiarize yourself with at least one component of the product.

    Description: Divide the components among yourselves. Each member can do some small enhancements to their component(s) to learn the code of that component. Some suggested enhancements are given in the AddressBook-Level4 developer guide.

    Submission: Create PRs from your own fork to your team repo. Get it merged by following your team's workflow.



Project: v1.1 [week 7]

Update UG and DG in the repo, attempt to do global-impact changes to the code base.

Milestone progress is graded. Be reminded that reaching individual and team milestones is considered when grading the project management component of your project grade.

Most aspects of project progress are tracked using automated scripts. Please follow our instructions closely, or else the scripts will not be able to detect your progress. We prefer not to spend admin resources processing requests for partial credit for work that did not follow the instructions precisely, unless the progress was not detected due to a bug in the script.

Milestone requirements are cumulative. The recommended progress for the mid-milestone is an implicit requirement for the actual milestone unless a milestone requirement overrides a mid-milestone requirement e.g., mid-milestone requires a document to be in a temp format while the actual milestone requires it to be in the proper format. Similarly, a requirement for milestone n is also an implicit requirement for milestone n+1 unless n+1 overrides the n requirement. This means if you miss some requirement at milestone n, you should try to achieve it before milestone n+1 or else it could be noted again as a 'missed requirement' at milestone n+1.

v1.1 Summary of Milestone

Milestone                      Minimum acceptable performance to consider as 'reached'
Team org/repo set up           as stated in mid-v1.1 (i.e., team PR created, auto-publishing of docs set up)
Some code enhancements done    created PRs to do local/global changes
Photo uploaded                 a photo complying with our guidelines is in the master branch of your team repo
Project docs updated           updated docs are merged to the master branch
Milestone wrapped up           a commit in the master branch tagged as v1.1
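For the last item, one way to wrap up the milestone is to tag the relevant commit on the master branch and push the tag (a sketch; you can also create the tag while making a GitHub release):

    git checkout master
    git tag -a v1.1 -m "v1.1"
    git push origin v1.1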
 

Set up project repo, start moving UG and DG to the repo, attempt to do local-impact changes to the code base.

Project Management:

Set up the team org and the team repo as explained below:

Relevant: [Admin Appendix E(extract): Organization setup ]

 

Organization setup

Please follow the organization/repo name format precisely because we use scripts to download your code or else our scripts will not be able to detect your work.

After receiving your team ID, one team member should do the following steps:

  • Create a GitHub organization with the following details:
    • Organization name : CS2103-AY1819S1-TEAM_ID. e.g.  CS2103-AY1819S1-W12-1
    • Plan:  Open Source ($0/month)
  • Add members to the organization:
    • Create a team called developers to your organization.
    • Add your team members to the developers team.

Relevant: [Admin Appendix E(extract): Repo setup ]

 

Repo setup

Only one team member:

  1. Fork Address Book Level 4 to your team org.
  2. Rename the forked repo as main. This repo (let's call it the team repo) is to be used as the repo for your project.
  3. Ensure the issue tracker of your team repo is enabled. Reason: our bots will be posting your weekly progress reports on the issue tracker of your team repo.
  4. Ensure your team members have the desired level of access to your team repo.
  5. Enable Travis CI for the team repo.
  6. Set up auto-publishing of docs. When set up correctly, your project website should be available via the URL https://nus-cs2103-ay1819s1-{team-id}.github.io/main e.g., https://cs2103-ay1819s1-w13-1.github.io/main/. This also requires you to enable the GitHub Pages feature of your team repo and configure it to serve the website from the gh-pages branch.
  7. create a team PR for us to track your project progress: i.e., create a PR from your team repo master branch to [nus-cs2103-AY1819S1/addressbook-level4] master branch. PR name: [Team ID] Product Name e.g., [T09-2] Contact List Pro.  As you merge code to your team repo's master branch, this PR will auto-update to reflect how much your team's product has progressed. In the PR description @mention the other team members so that they get notified when the tutor adds comments to the PR.

All team members:

  1. Watchthe main repo (created above) i.e., go to the repo and click on the watch button to subscribe to activities of the repo
  2. Fork the main repo to your personal GitHub account.
  3. Clone the fork to your Computer.
  4. Recommended: Set it up as an Intellij project (follow the instructions in the Developer Guide carefully).
  5. Set up the developer environment in your computer. You are recommended to use JDK 9 for AB-4 as some of the libraries used in AB-4 have not updated to support Java 10 yet. JDK 9 can be downloaded from the Java Archive.

Note that some of our download scripts depend on the following folder paths. Please do not alter those paths in your project.

  • /src/main
  • /src/test
  • /docs

When updating code in the repo, follow the workflow explained below:

Relevant: [Admin Appendix E(extract): Workflow ]

 

Workflow

Before you do any coding for the project,

  • Ensure you have set the Git username correctly (as explained in Appendix E) in all Computers you use for coding.
  • Read our reuse policy (in Admin: Appendix B), in particular, how to give credit when you reuse code from the Internet or classmates:
 

Setting Git Username to Match GitHub Username

We use various tools to analyze your code. For us to be able to identify your commits, you should use the GitHub username as your Git username as well. If there is a mismatch, or if you use multiple user names for Git, our tools might miss some of your work and as a result you might not get credit for some of your work.

In each Computer you use for coding, after installing Git, you should set the Git username as follows.

  1. Open a command window that can run Git commands (e.g., Git bash window)
  2. Run the command git config --global user.name YOUR_GITHUB_USERNAME
    e.g., git config --global user.name JohnDoe

More info about setting Git username is here.

 

Policy on reuse

Reuse is encouraged. However, note that reuse has its own costs (such as the learning curve, additional complexity, usage restrictions, and unknown bugs). Furthermore, you will not be given credit for work done by others. Rather, you will be given credit for using work done by others.

  • You are allowed to reuse work from your classmates, subject to following conditions:
    • The work has been published by us or the authors.
    • You clearly give credit to the original author(s).
  • You are allowed to reuse work from external sources, subject to following conditions:
    • The work comes from a source of 'good standing' (such as an established open source project). This means you cannot reuse code written by an outside 'friend'.
    • You clearly give credit to the original author. Acknowledge use of third party resources clearly e.g. in the welcome message, splash screen (if any) or under the 'about' menu. If you are open about reuse, you are less likely to get into trouble if you unintentionally reused something copyrighted.
    • You do not violate the license under which the work has been released. Please  do not use 3rd-party images/audio in your software unless they have been specifically released to be used freely. Just because you found it in the Internet does not mean it is free for reuse.
    • Always get permission from us before you reuse third-party libraries. Please post your 'request to use 3rd party library' in our forum. That way, the whole class get to see what libraries are being used by others.

Giving credit for reused work

Given below are how to give credit for things you reuse from elsewhere. These requirements are specific to this module  i.e., not applicable outside the module (outside the module you should follow the rules specified by your employer and the license of the reused work)

If you used a third party library:

  • Mention in the README.adoc (under the Acknowledgements section)
  • mention in the Project Portfolio Page if the library has a significant relevance to the features you implemented

If you reused code snippets found on the Internet  e.g. from StackOverflow answers or
referred code in another software or
referred project code by current/past student:

  • If you read the code to understand the approach and implemented it yourself, mention it as a comment
    Example:
    //Solution below adapted from https://stackoverflow.com/a/16252290
    {Your implmentation of the reused solution here ...}
    
  • If you copy-pasted a non-trivial code block (possibly with minor modifications  renaming, layout changes, changes to comments, etc.), also mark the code block as reused code (using @@author tags)
    Format:
    //@@author {yourGithubUsername}-reused
    //{Info about the source...}
    
    {Reused code (possibly with minor modifications) here ...}
    
    //@@author
    
    Example of reusing a code snippet (with minor modifications):
    persons = getList()
    //@@author johndoe-reused
    //Reused from https://stackoverflow.com/a/34646172 with minor modifications
    Collections.sort(persons, new Comparator<CustomData>() {
        @Override
        public int compare(CustomData lhs, CustomData rhs) {
            return lhs.customInt > rhs.customInt ? -1 : (lhs.customInt < rhs.customInt) ? 1 : 0;
        }
    });
    //@@author
    return persons;
    
 

Adding @@author tags indicate authorship

  • Mark your code with a //@@author {yourGithubUsername}. Note the double @.
    The //@@author tag should indicates the beginning of the code you wrote. The code up to the next //@@author tag or the end of the file (whichever comes first) will be considered as was written by that author. Here is a sample code file:

    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author sarahkhoo
    method 3 ...
    //@@author johndoe
    method 4 ...
    
  • If you don't know who wrote the code segment below yours, you may put an empty //@@author (i.e. no GitHub username) to indicate the end of the code segment you wrote. The author of code below yours can add the GitHub username to the empty tag later. Here is a sample code with an empty author tag:

    method 0 ...
    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author
    method 3 ...
    method 4 ...
    
  • The author tag syntax varies based on file type e.g. for java, css, fxml. Use the corresponding comment syntax for non-Java files.
    Here is an example code from an xml/fxml file.

    <!-- @@author sereneWong -->
    <textbox>
      <label>...</label>
      <input>...</input>
    </textbox>
    ...
    
  • Do not put the //@@author inside java header comments.
    👎

    /**
      * Returns true if ...
      * @@author johndoe
      */
    

    👍

    //@@author johndoe
    /**
      * Returns true if ...
      */
    

What to and what not to annotate

  • Annotate both functional and test code There is no need to annotate documentation files.

  • Annotate only significant size code blocks that can be reviewed on its own  e.g., a class, a sequence of methods, a method.
    Claiming credit for code blocks smaller than a method is discouraged but allowed. If you do, do it sparingly and only claim meaningful blocks of code such as a block of statements, a loop, or an if-else statement.

    • If an enhancement required you to do tiny changes in many places, there is no need to annotate all those tiny changes; you can describe those changes in the Project Portfolio page instead.
    • If a code block was touched by more than one person, either let the person who wrote most of it (e.g. more than 80%) take credit for the entire block, or leave it as 'unclaimed' (i.e., no author tags).
    • Related to the above point, if you claim a code block as your own, more than 80% of the code in that block should have been written by yourself. For example, no more than 20% of it can be code you reused from somewhere.
    • 💡 GitHub has a blame feature and a history feature that can help you determine who wrote a piece of code.
  • Do not try to boost the quantity of your contribution using unethical means such as duplicating the same code in multiple places. In particular, do not copy-paste test cases to create redundant tests. Even repetitive code blocks within test methods should be extracted out as utility methods to reduce code duplication. Individual members are responsible for making sure code attributed to them are correct. If you notice a team member claiming credit for code that he/she did not write or use other questionable tactics, you can email us (after the final submission) to let us know.

  • If you wrote a significant amount of code that was not used in the final product,

    • Create a folder called {project root}/unused
    • Move unused files (or copies of files containing unused code) to that folder
    • use //@@author {yourGithubUsername}-unused to mark unused code in those files (note the suffix unused) e.g.
    //@@author johndoe-unused
    method 1 ...
    method 2 ...
    

    Please put a comment in the code to explain why it was not used.

  • If you reused code from elsewhere, mark such code as //@@author {yourGithubUsername}-reused (note the suffix reused) e.g.

    //@@author johndoe-reused
    method 1 ...
    method 2 ...
    
  • You can use empty @@author tags to mark code as not yours when RepoSense attribute the to you incorrectly.

    • Code generated by the IDE/framework, should not be annotated as your own.

    • Code you modified in minor ways e.g. adding a parameter. These should not be claimed as yours but you can mention these additional contributions in the Project Portfolio page if you want to claim credit for them.

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate your,
      • contributions to the project
      • your documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=githbub_username_in_lower_case (replace githbub_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other team's products,
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/githbub_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use the Asciidoc's include feature to include sections from the developer guide or the user guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section is written by someone else  e.g. someone else wrote described the feature in the User Guide but you implemented the feature, clearly state that the section was written by someone else  (e.g. Start of Extract [from: User Guide] written by Jane Doe).  Reason: Your writing skills will be evaluated based on the PPP

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The actual amount of content you require is actually less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content Limit
      Overview + Summary of contributions 0.5-1
      Contributions to the User Guide 1-3
      Contributions to the Developer Guide 3-6
      Total 5-10

Follow the forking workflow in your project up to v1.1. In particular,

  • Get team members to review PRs. A workflow without PR reviews is a risky workflow.
  • Do not merge PRs failing CI. After setting up Travis, the CI status of a PR is reported at the bottom of the PR page. The screenshot below shows the status of a PR that is passing all CI checks.

    If there is a failure, you can click on the Details link in corresponding line to find out more about the failure. Once you figure out the cause of the failure, push the a fix to the PR.
  • After setting up Netlify, you can use Netlify PR Preview to preview changes to documentation files, if the PR contains updates to documentation. To see the preview, click on the Details link in front of the Netlify status reported (refer screenshot above).

After completing v1.1, you can adjust process rigor to suit your team's pace, as explained below.

  • Reduce automated tests have benefits, but they can be a pain to write/maintain; GUI tests are especially hard to maintain because their behavior can sometimes depend on things such as the OS, resolution etc.
    It is OK to get rid of some of the troublesome tests and rely more on manual testing instead. The less automated tests you have, the higher the risk of regressions; but it may be an acceptable trade-off under the circumstances if tests are slowing you down too much.
    There is no direct penalty for removing GUI tests. Also note our expectation on test code.

  • Reduce automated checks: You can also reduce the rigor of checkstyle checks to expedite PR processing.

  • Switch to a lighter workflow: While forking workflow is the safest, it is also rather heavy. You an switch to a simpler workflow if the forking workflow is slowing you down. Refer the textbook to find more about alternative workflows: branching workflow, centralized workflow. However, we still recommend that you use PR reviews, at least for PRs affecting others' features.

You can also increase the rigor/safety of your workflow in the following ways:

  • Use GitHub's Protected Branches feature to protect your master branch against rogue PRs.
 
  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests.
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you do it the normal way, you often find that it is hard to test the functional code because the code has low testability.
 

Project Management → Revision Control →

Forking Flow

In the forking workflow, the 'official' version of the software is kept in a remote repo designated as the 'main repo'. All team members fork the main repo create pull requests from their fork to the main repo.

To illustrate how the workflow goes, let’s assume Jean wants to fix a bug in the code. Here are the steps:

  1. Jean creates a separate branch in her local repo and fixes the bug in that branch.
  2. Jean pushes the branch to her fork.
  3. Jean creates a pull request from that branch in her fork to the main repo.
  4. Other members review Jean’s pull request.
  5. If reviewers suggested any changes, Jean updates the PR accordingly.
  6. When reviewers are satisfied with the PR, one of the members (usually the team lead or a designated 'maintainer' of the main repo) merges the PR, which brings Jean’s code to the main repo.
  7. Other members, realizing there is new code in the upstream repo, sync their forks with the new upstream repo (i.e. the main repo). This is done by pulling the new code to their own local repo and pushing the updated code to their own fork.

Documentation:

Recommended procedure for updating docs:

  1. Divide among yourselves who will update which parts of the document(s).
  2. Update the team repo by following the workflow mentioned above.

Update the following pages in your project repo:

  • About Us page: This page is used for module admin purposes. Please follow the format closely or else our scripts will not be able to give credit for your work.
    • Replace info of SE-EDU developers with info of your team, including a suitable photo as described here.
    • Including the name/photo of the supervisor/lecturer is optional.
    • The photo of a team member should be doc/images/githbub_username_in_lower_case.png e.g. docs/images/damithc.png. If you photo is in jpg format, name the file as .png anyway.
    • Indicate the different roles played and responsibilities held by each team member. You can reassign these roles and responsibilities (as explained in Admin Project Scope) later in the project, if necessary.
 
  • The purpose of the profile photo is for the teaching team to identify you. Therefore, you should choose a recent individual photo showing your face clearly (i.e., not too small) -- somewhat similar to a passport photo. Some examples can be seen in the 'Teaching team' page. Given below are some examples of good and bad profile photos.

  • If you are uncomfortable posting your photo due to security reasons, you can post a lower resolution image so that it is hard for someone to misuse that image for fraudulent purposes. If you are concerned about privacy, you can request permission to omit your photo from the page by writing to prof.

 

Roles indicate aspects you are in charge of and responsible for. E.g., if you are in charge of documentation, you are the person who should decide who updates which parts of the documentation, ensure the documents are in the right format, ensure consistency, etc.

This is a non-exhaustive list; you may define additional roles.

  • Team lead: Responsible for overall project coordination.
  • Documentation (short for ‘in charge of documentation’): Responsible for the quality of various project documents.
  • Testing: Ensures the testing of the project is done properly and on time.
  • Code quality: Looks after code quality, ensures adherence to coding standards, etc.
  • Deliverables and deadlines: Ensures project deliverables are done on time and in the right format.
  • Integration: In charge of versioning of the code, maintaining the code repository, integrating various parts of the software to create a whole.
  • Scheduling and tracking: In charge of defining, assigning, and tracking project tasks.
  • [Tool ABC] expert: e.g. Intellij expert, Git expert, etc. Helps other team members with matters related to the specific tool.
  • In charge of [Component XYZ]: e.g. In charge of Model, UI, Storage, etc. If you are in charge of a component, you are expected to know that component well, and review changes done to that component in v1.3-v1.4.

Please make sure each of the important roles is assigned to one person in the team. It is OK to have a 'backup' for each role, but for each aspect there should be one person who is unequivocally the person responsible for it.

  • Contact Us Page: Update to match your product.

  • README.adoc page: Update it to match your project.

    • Add a UI mockup of your intended final product.
      Note that the image of the UI should be docs/images/Ui.png so that it can be downloaded by our scripts. Limit the file to one screenshot/mockup only and ensure the new image has roughly the same height-to-width proportions as the original one. Reason: when we compile these images from all teams into one page (example), yours should not look out of place.

    • The original README.adoc file (which doubles as the landing page of your project website) is written to read like the introduction to an SE learning/teaching resource. You should restructure this page to look like the home page of a real product (not a school project) targeting real users, e.g., remove references to addressbook-level3, Learning Outcomes, etc.; mention target users, add a marketing blurb, etc. On a related note, also remove the Learning Outcomes link and related pages.

    • Update the link of the Travis build status badge (Build Status) so that it reflects the build status of your team repo.
      For the other badges,

      • either set up the respective tool for your project (AB-4 Developer Guide has instructions on how to set up AppVeyor and Coveralls) and update the badges accordingly,
      • or remove the badge.
    • Acknowledge the original source of the code, i.e., the AddressBook-Level4 project created by the SE-EDU initiative at https://github.com/se-edu/

  • User Guide: Start moving the content from your User Guide (draft created in previous weeks) into the User Guide page in your repository. If a feature is not implemented, mark it as 'Coming in v2.0' (example).

  • Developer Guide: Similar to the User Guide, start moving the content from your Developer Guide (draft created in previous weeks) into the Developer Guide page in your team repository.

Product:

  • Each member can attempt to do a local-impact change to the code base.

    Objective: To familiarize yourself with at least one component of the product.

    Description: Divide the components among yourselves. Each member can do some small enhancements to their component(s) to learn the code of that component. Some suggested enhancements are given in the AddressBook-Level4 developer guide.

    Submission: Create PRs from your own fork to your team repo. Get them merged by following your team's workflow.

 

A. Process:

Evaluates: How well you did in project management related aspects of the project, as an individual and as a team

Based on: Supervisor observations of project milestones and GitHub data.

Milestones need to be reached by the midnight before the tutorial for them to be counted as achieved. To get a good grade for this aspect, achieve at least 60% of the recommended milestone progress.

Other criteria:

  • Good use of GitHub milestones
  • Good use of GitHub release mechanism
  • Good version control, based on the repo
  • Reasonable attempt to use the forking workflow
  • Good task definition, assignment and tracking, based on the issue tracker
  • Good use of buffers (opposite: everything at the last minute)
  • Project done iteratively and incrementally (opposite: doing most of the work in one big burst)

B. Team-based tasks:

Evaluates: how much you contributed to common team-based tasks

Based on: peer evaluations and tutor observations

Relevant: [Admin Project Scope → Examples of team tasks ]

 

Here is a non-exhaustive list of team tasks:

  1. Necessary general code enhancements e.g.,
    1. Work related to renaming the product
    2. Work related to changing the product icon
    3. Morphing the product into a different product
  2. Setting up GitHub, Travis, AppVeyor, etc.
  3. Maintaining the issue tracker
  4. Release management
  5. Updating user/developer docs that are not specific to a feature  e.g. documenting the target user profile
  6. Incorporating more useful tools/libraries/frameworks into the product or the project workflow (e.g. automate more aspects of the project workflow using a GitHub plugin)

 

v1.1 Project Management

  • Fix any errors in org/repo set up  (e.g. wrong repo name).
  • Wrap up the milestone using a git tag v1.1 as explained below:
    • When the milestone deadline is near (e.g., 0.5 days before the deadline), if you think some of the ongoing work intended for the current milestone may not finish in time, reassign them to a future milestone.
    • After all changes that can be merged before the milestone deadline have been merged, use the git tag feature to tag the current version with the milestone tag (i.e., v1.1) and push the tag to the team repo (a command sketch is given after this list).
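For example, the tagging step could look like the following. This is a sketch, assuming all approved PRs have already been merged to the master branch of the team repo, and that the team repo is reachable from your clone as a remote named upstream (adjust the remote name to match your own setup).

    git checkout master
    git pull upstream master        # make sure the local master has all the merged changes
    git tag v1.1                    # tag the current version with the milestone tag
    git push upstream v1.1          # push the tag to the team repo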

v1.1 Documentation

  • Update the User Guide, Developer Guide, README, and About Us pages as described earlier in the mid-v1.1 progress guide.

    Submission: merge your changes to the master branch of your repo.

 

Set up project repo, start moving UG and DG to the repo, attempt to do local-impact changes to the code base.

Project Management:

Set up the team org and the team repo as explained below:

Relevant: [Admin Appendix E(extract): Organization setup ]

 

Organization setup

Please follow the organization/repo name format precisely; we use scripts to download your code, and if the names are different, our scripts will not be able to detect your work.

After receiving your team ID, one team member should do the following steps:

  • Create a GitHub organization with the following details:
    • Organization name: CS2103-AY1819S1-TEAM_ID, e.g., CS2103-AY1819S1-W12-1
    • Plan:  Open Source ($0/month)
  • Add members to the organization:
    • Create a team called developers in your organization.
    • Add your team members to the developers team.

Relevant: [Admin Appendix E(extract): Repo setup ]

 

Repo setup

Only one team member:

  1. Fork Address Book Level 4 to your team org.
  2. Rename the forked repo as main. This repo (let's call it the team repo) is to be used as the repo for your project.
  3. Ensure the issue tracker of your team repo is enabled. Reason: our bots will be posting your weekly progress reports on the issue tracker of your team repo.
  4. Ensure your team members have the desired level of access to your team repo.
  5. Enable Travis CI for the team repo.
  6. Set up auto-publishing of docs. When set up correctly, your project website should be available via the URL https://nus-cs2103-ay1819s1-{team-id}.github.io/main e.g., https://cs2103-ay1819s1-w13-1.github.io/main/. This also requires you to enable the GitHub Pages feature of your team repo and configure it to serve the website from the gh-pages branch.
  7. Create a team PR for us to track your project progress: i.e., create a PR from your team repo master branch to [nus-cs2103-AY1819S1/addressbook-level4] master branch. PR name: [Team ID] Product Name e.g., [T09-2] Contact List Pro.  As you merge code to your team repo's master branch, this PR will auto-update to reflect how much your team's product has progressed. In the PR description @mention the other team members so that they get notified when the tutor adds comments to the PR.

All team members:

  1. Watch the main repo (created above) i.e., go to the repo and click on the watch button to subscribe to activities of the repo
  2. Fork the main repo to your personal GitHub account.
  3. Clone the fork to your computer (a typical command sequence is sketched after this list).
  4. Recommended: Set it up as an Intellij project (follow the instructions in the Developer Guide carefully).
  5. Set up the developer environment on your computer. You are recommended to use JDK 9 for AB-4 as some of the libraries used in AB-4 have not been updated to support Java 10 yet. JDK 9 can be downloaded from the Java Archive.
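To illustrate steps 2-3 above, the clone and remote setup could look like the following. This is a minimal sketch: the URLs contain placeholders for your own GitHub username and your team ID, and adding the team repo as an 'upstream' remote is an optional convenience (not a module requirement) that makes syncing your fork easier later.

    # clone your own fork (not the team repo) to your computer
    git clone https://github.com/{your_github_username}/main.git
    cd main

    # optional: add the team repo as the 'upstream' remote, to make syncing easier later
    git remote add upstream https://github.com/CS2103-AY1819S1-{TEAM_ID}/main.git
    git remote -v    # verify that both remotes are set up correctly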

Note that some of our download scripts depend on the following folder paths. Please do not alter those paths in your project.

  • /src/main
  • /src/test
  • /docs

When updating code in the repo, follow the workflow explained below:

Relevant: [Admin Appendix E(extract): Workflow ]

 

Workflow

Before you do any coding for the project,

  • Ensure you have set the Git username correctly (as explained in Appendix E) on all computers you use for coding.
  • Read our reuse policy (in Admin: Appendix B), in particular, how to give credit when you reuse code from the Internet or classmates:
 

Setting Git Username to Match GitHub Username

We use various tools to analyze your code. For us to be able to identify your commits, you should use the GitHub username as your Git username as well. If there is a mismatch, or if you use multiple user names for Git, our tools might miss some of your work and as a result you might not get credit for some of your work.

On each computer you use for coding, after installing Git, you should set the Git username as follows.

  1. Open a command window that can run Git commands (e.g., Git bash window)
  2. Run the command git config --global user.name YOUR_GITHUB_USERNAME
    e.g., git config --global user.name JohnDoe

More info about setting Git username is here.
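To double-check that the setting took effect, you can print the currently configured value; it should match your GitHub username exactly:

    git config --global user.name    # prints the currently configured Git username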

 

Policy on reuse

Reuse is encouraged. However, note that reuse has its own costs (such as the learning curve, additional complexity, usage restrictions, and unknown bugs). Furthermore, you will not be given credit for work done by others. Rather, you will be given credit for using work done by others.

  • You are allowed to reuse work from your classmates, subject to the following conditions:
    • The work has been published by us or the authors.
    • You clearly give credit to the original author(s).
  • You are allowed to reuse work from external sources, subject to the following conditions:
    • The work comes from a source of 'good standing' (such as an established open source project). This means you cannot reuse code written by an outside 'friend'.
    • You clearly give credit to the original author. Acknowledge use of third party resources clearly e.g. in the welcome message, splash screen (if any) or under the 'about' menu. If you are open about reuse, you are less likely to get into trouble if you unintentionally reused something copyrighted.
    • You do not violate the license under which the work has been released. Please do not use 3rd-party images/audio in your software unless they have been specifically released to be used freely. Just because you found it on the Internet does not mean it is free for reuse.
    • Always get permission from us before you reuse third-party libraries. Please post your 'request to use 3rd party library' in our forum. That way, the whole class gets to see what libraries are being used by others.

Giving credit for reused work

Given below is how to give credit for things you reuse from elsewhere. These requirements are specific to this module, i.e., not applicable outside the module (outside the module you should follow the rules specified by your employer and the license of the reused work).

If you used a third party library:

  • Mention in the README.adoc (under the Acknowledgements section)
  • Mention in the Project Portfolio Page if the library has a significant relevance to the features you implemented

If you reused code snippets found on the Internet (e.g. from StackOverflow answers), or referred to code in another software, or referred to project code by a current/past student:

  • If you read the code to understand the approach and implemented it yourself, mention it as a comment
    Example:
    //Solution below adapted from https://stackoverflow.com/a/16252290
    {Your implementation of the reused solution here ...}
    
  • If you copy-pasted a non-trivial code block (possibly with minor modifications such as renaming, layout changes, changes to comments, etc.), also mark the code block as reused code (using @@author tags)
    Format:
    //@@author {yourGithubUsername}-reused
    //{Info about the source...}
    
    {Reused code (possibly with minor modifications) here ...}
    
    //@@author
    
    Example of reusing a code snippet (with minor modifications):
    persons = getList()
    //@@author johndoe-reused
    //Reused from https://stackoverflow.com/a/34646172 with minor modifications
    Collections.sort(persons, new Comparator<CustomData>() {
        @Override
        public int compare(CustomData lhs, CustomData rhs) {
            return lhs.customInt > rhs.customInt ? -1 : (lhs.customInt < rhs.customInt) ? 1 : 0;
        }
    });
    //@@author
    return persons;
    
 

Adding @@author tags to indicate authorship

  • Mark your code with a //@@author {yourGithubUsername}. Note the double @.
    The //@@author tag indicates the beginning of the code you wrote. The code up to the next //@@author tag or the end of the file (whichever comes first) will be considered as written by that author. Here is a sample code file:

    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author sarahkhoo
    method 3 ...
    //@@author johndoe
    method 4 ...
    
  • If you don't know who wrote the code segment below yours, you may put an empty //@@author (i.e. no GitHub username) to indicate the end of the code segment you wrote. The author of code below yours can add the GitHub username to the empty tag later. Here is a sample code with an empty author tag:

    method 0 ...
    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author
    method 3 ...
    method 4 ...
    
  • The author tag syntax varies based on file type e.g. for java, css, fxml. Use the corresponding comment syntax for non-Java files.
    Here is an example code from an xml/fxml file.

    <!-- @@author sereneWong -->
    <textbox>
      <label>...</label>
      <input>...</input>
    </textbox>
    ...
    
  • Do not put the //@@author inside java header comments.
    👎

    /**
      * Returns true if ...
      * @@author johndoe
      */
    

    👍

    //@@author johndoe
    /**
      * Returns true if ...
      */
    

What to and what not to annotate

  • Annotate both functional and test code. There is no need to annotate documentation files.

  • Annotate only code blocks of significant size that can be reviewed on their own, e.g., a class, a sequence of methods, a method.
    Claiming credit for code blocks smaller than a method is discouraged but allowed. If you do, do it sparingly and only claim meaningful blocks of code such as a block of statements, a loop, or an if-else statement.

    • If an enhancement required you to do tiny changes in many places, there is no need to annotate all those tiny changes; you can describe those changes in the Project Portfolio page instead.
    • If a code block was touched by more than one person, either let the person who wrote most of it (e.g. more than 80%) take credit for the entire block, or leave it as 'unclaimed' (i.e., no author tags).
    • Related to the above point, if you claim a code block as your own, more than 80% of the code in that block should have been written by yourself. For example, no more than 20% of it can be code you reused from somewhere.
    • 💡 GitHub has a blame feature and a history feature that can help you determine who wrote a piece of code.
  • Do not try to boost the quantity of your contribution using unethical means such as duplicating the same code in multiple places. In particular, do not copy-paste test cases to create redundant tests. Even repetitive code blocks within test methods should be extracted out as utility methods to reduce code duplication. Individual members are responsible for making sure code attributed to them is correct. If you notice a team member claiming credit for code that he/she did not write or using other questionable tactics, you can email us (after the final submission) to let us know.

  • If you wrote a significant amount of code that was not used in the final product,

    • Create a folder called {project root}/unused
    • Move unused files (or copies of files containing unused code) to that folder
    • Use //@@author {yourGithubUsername}-unused to mark unused code in those files (note the suffix unused) e.g.
    //@@author johndoe-unused
    method 1 ...
    method 2 ...
    

    Please put a comment in the code to explain why it was not used.

  • If you reused code from elsewhere, mark such code as //@@author {yourGithubUsername}-reused (note the suffix reused) e.g.

    //@@author johndoe-reused
    method 1 ...
    method 2 ...
    
  • You can use empty @@author tags to mark code as not yours when RepoSense attributes it to you incorrectly.

    • Code generated by the IDE/framework should not be annotated as your own.

    • Code you modified in minor ways, e.g. adding a parameter, should not be claimed as yours, but you can mention these additional contributions in the Project Portfolio page if you want to claim credit for them.

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate your:
      • contributions to the project
      • documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on the Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=github_username_in_lower_case (replace github_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so-called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other teams' products, etc.
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/github_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use Asciidoc's include feature to include sections from the developer guide or the user guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section is written by someone else, e.g. someone else described the feature in the User Guide but you implemented the feature, clearly state that the section was written by someone else (e.g. Start of Extract [from: User Guide] written by Jane Doe). Reason: Your writing skills will be evaluated based on the PPP.

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The amount of content you need is actually less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content: Page limit (in pages)
      Overview + Summary of contributions: 0.5-1
      Contributions to the User Guide: 1-3
      Contributions to the Developer Guide: 3-6
      Total: 5-10

Follow the forking workflow in your project up to v1.1. In particular,

  • Get team members to review PRs. A workflow without PR reviews is a risky workflow.
  • Do not merge PRs failing CI. After setting up Travis, the CI status of a PR is reported at the bottom of the PR page. The screenshot below shows the status of a PR that is passing all CI checks.

    If there is a failure, you can click on the Details link in the corresponding line to find out more about the failure. Once you figure out the cause of the failure, push a fix to the PR (a command sketch is given after this list).
  • After setting up Netlify, you can use Netlify PR Preview to preview changes to documentation files, if the PR contains updates to documentation. To see the preview, click on the Details link in front of the Netlify status reported (refer to the screenshot above).
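For instance, pushing a fix to a PR that is failing CI could look like the following. This is a sketch; the branch name and commit message are illustrative, and origin is assumed to be the fork the PR was created from.

    # switch to the same branch the PR was created from
    git checkout fix-empty-list-crash
    # ... fix the cause of the CI failure (e.g., a failing test or a checkstyle violation) ...
    git add .
    git commit -m "Fix checkstyle violations"
    git push origin fix-empty-list-crash    # the open PR updates automatically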

After completing v1.1, you can adjust process rigor to suit your team's pace, as explained below.

  • Reduce automated tests: Automated tests have benefits, but they can be a pain to write/maintain; GUI tests are especially hard to maintain because their behavior can sometimes depend on things such as the OS, resolution, etc.
    It is OK to get rid of some of the troublesome tests and rely more on manual testing instead. The less automated tests you have, the higher the risk of regressions; but it may be an acceptable trade-off under the circumstances if tests are slowing you down too much.
    There is no direct penalty for removing GUI tests. Also note our expectation on test code.

  • Reduce automated checks: You can also reduce the rigor of checkstyle checks to expedite PR processing.

  • Switch to a lighter workflow: While the forking workflow is the safest, it is also rather heavy. You can switch to a simpler workflow if the forking workflow is slowing you down. Refer to the textbook to find out more about alternative workflows: branching workflow, centralized workflow. However, we still recommend that you use PR reviews, at least for PRs affecting others' features.

You can also increase the rigor/safety of your workflow in the following ways:

  • Use GitHub's Protected Branches feature to protect your master branch against rogue PRs.
 

v1.1 Product

  • Each member should try to add some enhancements that are in line with the vision for v2.0. After adding some local-impact changes as recommended in the mid-v1.1 progress guide, attempt to do some global-impact enhancements, touching as many other components as possible. The AddressBook-Level4 Developer Guide has some guidance on how to implement a new feature end-to-end.


Project: mid-v1.2 [week 8]

Adjust project schedule/rigor as needed, start proper milestone management.

Project Management:

💡 You are free to adjust process rigor and project plan at any future time in the project, starting from v1.2. If you are not sure if a certain adjustment is allowed, you can check with the teaching team first.

  • Switch to AB-3 or AB-2 if AB-4 is not working out for you.

Relevant: [Admin Project Deliverables → Notes for Those Using AB-2 or AB-3 for the Project ]

 

There is no explicit penalty for switching to a lower level AB. All projects are evaluated based on the same yardstick irrespective of which AB they are based on. As an AB is given to you as a 'free' head-start, a lower level AB gives you a shorter head-start, which means your final product is likely to be less functional than those from teams using AB-4 unless you progress faster than them. Nevertheless, you should switch to AB2/3 if you feel you can learn more from the project that way, as our goal is to maximize learning, not features.
If your team wants to stay with AB-4 but you want to switch to a lower level AB, let us know so that we can work something out for you.

If you have opted to use AB-2 or AB-3 instead of AB-4 as the basis of your product, please note the following points:

 

Set up project repo, start moving UG and DG to the repo, attempt to do local-impact changes to the code base.

Project Management:

Set up the team org and the team repo as explained below:

Relevant: [Admin Appendix E(extract): Organization setup ]

 

Organization setup

Please follow the organization/repo name format precisely because we use scripts to download your code or else our scripts will not be able to detect your work.

After receiving your team ID, one team member should do the following steps:

  • Create a GitHub organization with the following details:
    • Organization name : CS2103-AY1819S1-TEAM_ID. e.g.  CS2103-AY1819S1-W12-1
    • Plan:  Open Source ($0/month)
  • Add members to the organization:
    • Create a team called developers to your organization.
    • Add your team members to the developers team.

Relevant: [Admin Appendix E(extract): Repo setup ]

 

Repo setup

Only one team member:

  1. Fork Address Book Level 4 to your team org.
  2. Rename the forked repo as main. This repo (let's call it the team repo) is to be used as the repo for your project.
  3. Ensure the issue tracker of your team repo is enabled. Reason: our bots will be posting your weekly progress reports on the issue tracker of your team repo.
  4. Ensure your team members have the desired level of access to your team repo.
  5. Enable Travis CI for the team repo.
  6. Set up auto-publishing of docs. When set up correctly, your project website should be available via the URL https://nus-cs2103-ay1819s1-{team-id}.github.io/main e.g., https://cs2103-ay1819s1-w13-1.github.io/main/. This also requires you to enable the GitHub Pages feature of your team repo and configure it to serve the website from the gh-pages branch.
  7. create a team PR for us to track your project progress: i.e., create a PR from your team repo master branch to [nus-cs2103-AY1819S1/addressbook-level4] master branch. PR name: [Team ID] Product Name e.g., [T09-2] Contact List Pro.  As you merge code to your team repo's master branch, this PR will auto-update to reflect how much your team's product has progressed. In the PR description @mention the other team members so that they get notified when the tutor adds comments to the PR.

All team members:

  1. Watchthe main repo (created above) i.e., go to the repo and click on the watch button to subscribe to activities of the repo
  2. Fork the main repo to your personal GitHub account.
  3. Clone the fork to your Computer.
  4. Recommended: Set it up as an Intellij project (follow the instructions in the Developer Guide carefully).
  5. Set up the developer environment in your computer. You are recommended to use JDK 9 for AB-4 as some of the libraries used in AB-4 have not updated to support Java 10 yet. JDK 9 can be downloaded from the Java Archive.

Note that some of our download scripts depend on the following folder paths. Please do not alter those paths in your project.

  • /src/main
  • /src/test
  • /docs

When updating code in the repo, follow the workflow explained below:

Relevant: [Admin Appendix E(extract): Workflow ]

 

Workflow

Before you do any coding for the project,

  • Ensure you have set the Git username correctly (as explained in Appendix E) in all Computers you use for coding.
  • Read our reuse policy (in Admin: Appendix B), in particular, how to give credit when you reuse code from the Internet or classmates:
 

Setting Git Username to Match GitHub Username

We use various tools to analyze your code. For us to be able to identify your commits, you should use the GitHub username as your Git username as well. If there is a mismatch, or if you use multiple user names for Git, our tools might miss some of your work and as a result you might not get credit for some of your work.

In each Computer you use for coding, after installing Git, you should set the Git username as follows.

  1. Open a command window that can run Git commands (e.g., Git bash window)
  2. Run the command git config --global user.name YOUR_GITHUB_USERNAME
    e.g., git config --global user.name JohnDoe

More info about setting Git username is here.

 

Policy on reuse

Reuse is encouraged. However, note that reuse has its own costs (such as the learning curve, additional complexity, usage restrictions, and unknown bugs). Furthermore, you will not be given credit for work done by others. Rather, you will be given credit for using work done by others.

  • You are allowed to reuse work from your classmates, subject to following conditions:
    • The work has been published by us or the authors.
    • You clearly give credit to the original author(s).
  • You are allowed to reuse work from external sources, subject to following conditions:
    • The work comes from a source of 'good standing' (such as an established open source project). This means you cannot reuse code written by an outside 'friend'.
    • You clearly give credit to the original author. Acknowledge use of third party resources clearly e.g. in the welcome message, splash screen (if any) or under the 'about' menu. If you are open about reuse, you are less likely to get into trouble if you unintentionally reused something copyrighted.
    • You do not violate the license under which the work has been released. Please  do not use 3rd-party images/audio in your software unless they have been specifically released to be used freely. Just because you found it in the Internet does not mean it is free for reuse.
    • Always get permission from us before you reuse third-party libraries. Please post your 'request to use 3rd party library' in our forum. That way, the whole class get to see what libraries are being used by others.

Giving credit for reused work

Given below are how to give credit for things you reuse from elsewhere. These requirements are specific to this module  i.e., not applicable outside the module (outside the module you should follow the rules specified by your employer and the license of the reused work)

If you used a third party library:

  • Mention in the README.adoc (under the Acknowledgements section)
  • mention in the Project Portfolio Page if the library has a significant relevance to the features you implemented

If you reused code snippets found on the Internet  e.g. from StackOverflow answers or
referred code in another software or
referred project code by current/past student:

  • If you read the code to understand the approach and implemented it yourself, mention it as a comment
    Example:
    //Solution below adapted from https://stackoverflow.com/a/16252290
    {Your implmentation of the reused solution here ...}
    
  • If you copy-pasted a non-trivial code block (possibly with minor modifications  renaming, layout changes, changes to comments, etc.), also mark the code block as reused code (using @@author tags)
    Format:
    //@@author {yourGithubUsername}-reused
    //{Info about the source...}
    
    {Reused code (possibly with minor modifications) here ...}
    
    //@@author
    
    Example of reusing a code snippet (with minor modifications):
    persons = getList()
    //@@author johndoe-reused
    //Reused from https://stackoverflow.com/a/34646172 with minor modifications
    Collections.sort(persons, new Comparator<CustomData>() {
        @Override
        public int compare(CustomData lhs, CustomData rhs) {
            return lhs.customInt > rhs.customInt ? -1 : (lhs.customInt < rhs.customInt) ? 1 : 0;
        }
    });
    //@@author
    return persons;
    
 

Adding @@author tags indicate authorship

  • Mark your code with a //@@author {yourGithubUsername}. Note the double @.
    The //@@author tag should indicates the beginning of the code you wrote. The code up to the next //@@author tag or the end of the file (whichever comes first) will be considered as was written by that author. Here is a sample code file:

    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author sarahkhoo
    method 3 ...
    //@@author johndoe
    method 4 ...
    
  • If you don't know who wrote the code segment below yours, you may put an empty //@@author (i.e. no GitHub username) to indicate the end of the code segment you wrote. The author of code below yours can add the GitHub username to the empty tag later. Here is a sample code with an empty author tag:

    method 0 ...
    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author
    method 3 ...
    method 4 ...
    
  • The author tag syntax varies based on file type e.g. for java, css, fxml. Use the corresponding comment syntax for non-Java files.
    Here is an example code from an xml/fxml file.

    <!-- @@author sereneWong -->
    <textbox>
      <label>...</label>
      <input>...</input>
    </textbox>
    ...
    
  • Do not put the //@@author inside java header comments.
    👎

    /**
      * Returns true if ...
      * @@author johndoe
      */
    

    👍

    //@@author johndoe
    /**
      * Returns true if ...
      */
    

What to and what not to annotate

  • Annotate both functional and test code There is no need to annotate documentation files.

  • Annotate only significant size code blocks that can be reviewed on its own  e.g., a class, a sequence of methods, a method.
    Claiming credit for code blocks smaller than a method is discouraged but allowed. If you do, do it sparingly and only claim meaningful blocks of code such as a block of statements, a loop, or an if-else statement.

    • If an enhancement required you to do tiny changes in many places, there is no need to annotate all those tiny changes; you can describe those changes in the Project Portfolio page instead.
    • If a code block was touched by more than one person, either let the person who wrote most of it (e.g. more than 80%) take credit for the entire block, or leave it as 'unclaimed' (i.e., no author tags).
    • Related to the above point, if you claim a code block as your own, more than 80% of the code in that block should have been written by yourself. For example, no more than 20% of it can be code you reused from somewhere.
    • 💡 GitHub has a blame feature and a history feature that can help you determine who wrote a piece of code.
  • Do not try to boost the quantity of your contribution using unethical means such as duplicating the same code in multiple places. In particular, do not copy-paste test cases to create redundant tests. Even repetitive code blocks within test methods should be extracted out as utility methods to reduce code duplication. Individual members are responsible for making sure code attributed to them are correct. If you notice a team member claiming credit for code that he/she did not write or use other questionable tactics, you can email us (after the final submission) to let us know.

  • If you wrote a significant amount of code that was not used in the final product,

    • Create a folder called {project root}/unused
    • Move unused files (or copies of files containing unused code) to that folder
    • use //@@author {yourGithubUsername}-unused to mark unused code in those files (note the suffix unused) e.g.
    //@@author johndoe-unused
    method 1 ...
    method 2 ...
    

    Please put a comment in the code to explain why it was not used.

  • If you reused code from elsewhere, mark such code as //@@author {yourGithubUsername}-reused (note the suffix reused) e.g.

    //@@author johndoe-reused
    method 1 ...
    method 2 ...
    
  • You can use empty @@author tags to mark code as not yours when RepoSense attribute the to you incorrectly.

    • Code generated by the IDE/framework, should not be annotated as your own.

    • Code you modified in minor ways e.g. adding a parameter. These should not be claimed as yours but you can mention these additional contributions in the Project Portfolio page if you want to claim credit for them.

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate your,
      • contributions to the project
      • your documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=githbub_username_in_lower_case (replace githbub_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other team's products,
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/githbub_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use the Asciidoc's include feature to include sections from the developer guide or the user guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section is written by someone else  e.g. someone else wrote described the feature in the User Guide but you implemented the feature, clearly state that the section was written by someone else  (e.g. Start of Extract [from: User Guide] written by Jane Doe).  Reason: Your writing skills will be evaluated based on the PPP

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The actual amount of content you require is actually less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content Limit
      Overview + Summary of contributions 0.5-1
      Contributions to the User Guide 1-3
      Contributions to the Developer Guide 3-6
      Total 5-10

Follow the forking workflow in your project up to v1.1. In particular,

  • Get team members to review PRs. A workflow without PR reviews is a risky workflow.
  • Do not merge PRs failing CI. After setting up Travis, the CI status of a PR is reported at the bottom of the PR page. The screenshot below shows the status of a PR that is passing all CI checks.

    If there is a failure, you can click on the Details link in corresponding line to find out more about the failure. Once you figure out the cause of the failure, push the a fix to the PR.
  • After setting up Netlify, you can use Netlify PR Preview to preview changes to documentation files, if the PR contains updates to documentation. To see the preview, click on the Details link in front of the Netlify status reported (refer screenshot above).

After completing v1.1, you can adjust process rigor to suit your team's pace, as explained below.

  • Reduce automated tests have benefits, but they can be a pain to write/maintain; GUI tests are especially hard to maintain because their behavior can sometimes depend on things such as the OS, resolution etc.
    It is OK to get rid of some of the troublesome tests and rely more on manual testing instead. The less automated tests you have, the higher the risk of regressions; but it may be an acceptable trade-off under the circumstances if tests are slowing you down too much.
    There is no direct penalty for removing GUI tests. Also note our expectation on test code.

  • Reduce automated checks: You can also reduce the rigor of checkstyle checks to expedite PR processing.

  • Switch to a lighter workflow: While forking workflow is the safest, it is also rather heavy. You an switch to a simpler workflow if the forking workflow is slowing you down. Refer the textbook to find more about alternative workflows: branching workflow, centralized workflow. However, we still recommend that you use PR reviews, at least for PRs affecting others' features.

You can also increase the rigor/safety of your workflow in the following ways:

  • Use GitHub's Protected Branches feature to protect your master branch against rogue PRs.
 
  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests.
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you do it the normal way, you often find that it is hard to test the functional code because the code has low testability.
 

Project Management → Revision Control →

Forking Flow

In the forking workflow, the 'official' version of the software is kept in a remote repo designated as the 'main repo'. All team members fork the main repo create pull requests from their fork to the main repo.

To illustrate how the workflow goes, let’s assume Jean wants to fix a bug in the code. Here are the steps:

  1. Jean creates a separate branch in her local repo and fixes the bug in that branch.
  2. Jean pushes the branch to her fork.
  3. Jean creates a pull request from that branch in her fork to the main repo.
  4. Other members review Jean’s pull request.
  5. If reviewers suggested any changes, Jean updates the PR accordingly.
  6. When reviewers are satisfied with the PR, one of the members (usually the team lead or a designated 'maintainer' of the main repo) merges the PR, which brings Jean’s code to the main repo.
  7. Other members, realizing there is new code in the upstream repo, sync their forks with the new upstream repo (i.e. the main repo). This is done by pulling the new code to their own local repo and pushing the updated code to their own fork.

Documentation:

Recommended procedure for updating docs:

  1. Divide among yourselves who will update which parts of the document(s).
  2. Update the team repo by following the workflow mentioned above.

Update the following pages in your project repo:

  • About Us page: This page is used for module admin purposes. Please follow the format closely or else our scripts will not be able to give credit for your work.
    • Replace info of SE-EDU developers with info of your team, including a suitable photo as described here.
    • Including the name/photo of the supervisor/lecturer is optional.
    • The photo of a team member should be doc/images/githbub_username_in_lower_case.png e.g. docs/images/damithc.png. If you photo is in jpg format, name the file as .png anyway.
    • Indicate the different roles played and responsibilities held by each team member. You can reassign these roles and responsibilities (as explained in Admin Project Scope) later in the project, if necessary.
 
  • The purpose of the profile photo is for the teaching team to identify you. Therefore, you should choose a recent individual photo showing your face clearly (i.e., not too small) -- somewhat similar to a passport photo. Some examples can be seen in the 'Teaching team' page. Given below are some examples of good and bad profile photos.

  • If you are uncomfortable posting your photo due to security reasons, you can post a lower resolution image so that it is hard for someone to misuse that image for fraudulent purposes. If you are concerned about privacy, you can request permission to omit your photo from the page by writing to prof.

 

Roles indicate aspects you are in charge of and responsible for. E.g., if you are in charge of documentation, you are the person who should allocate which parts of the documentation is to be done by who, ensure the document is in right format, ensure consistency etc.

This is a non-exhaustive list; you may define additional roles.

  • Team lead: Responsible for overall project coordination.
  • Documentation (short for ‘in charge of documentation’): Responsible for the quality of various project documents.
  • Testing: Ensures the testing of the project is done properly and on time.
  • Code quality: Looks after code quality, ensures adherence to coding standards, etc.
  • Deliverables and deadlines: Ensures project deliverables are done on time and in the right format.
  • Integration: In charge of versioning of the code, maintaining the code repository, integrating various parts of the software to create a whole.
  • Scheduling and tracking: In charge of defining, assigning, and tracking project tasks.
  • [Tool ABC] expert: e.g. Intellij expert, Git expert, etc. Helps other team members with matters related to the specific tool.
  • In charge of [Component XYZ]: e.g. In charge of Model, UI, Storage, etc. If you are in charge of a component, you are expected to know that component well, and review changes done to that component in v1.3-v1.4.

Please make sure each of the important roles is assigned to one person in the team. It is OK to have a 'backup' for each role, but for each aspect there should be one person who is unequivocally the person responsible for it.

  • Contact Us Page: Update to match your product.

  • README.adoc page: Update it to match your project.

    • Add a UI mockup of your intended final product.
      Note that the image of the UI should be docs/images/Ui.png so that it can be downloaded by our scripts. Limit the file to one screenshot/mockup only and ensure the new image has roughly the same height-to-width proportions as the original one. Reason: when we compile these images from all teams into one page (example), yours should not look out of place.

    • The original README.adoc file (which doubles as the landing page of your project website) is written to read like the introduction to an SE learning/teaching resource. You should restructure this page to look like the home page of a real product (not a school project) targeting real users, e.g., remove references to addressbook-level3, Learning Outcomes, etc.; mention target users, add a marketing blurb, etc. On a related note, also remove the Learning Outcomes link and related pages.

    • Update the link of the Travis build status badge (Build Status) so that it reflects the build status of your team repo.
      For the other badges,

      • either set up the respective tool for your project (AB-4 Developer Guide has instructions on how to set up AppVeyor and Coveralls) and update the badges accordingly,
      • or remove the badge.
    • Acknowledge the original source of the code i.e. AddressBook-Level4 project created by SE-EDU initiative at https://github.com/se-edu/

  • User Guide: Start moving the content from your User Guide (draft created in previous weeks) into the User Guide page in your repository. If a feature is not implemented, mark it as 'Coming in v2.0' (example).

  • Developer Guide: Similar to the User Guide, start moving the content from your Developer Guide (draft created in previous weeks) into the Developer Guide page in your team repository.

Product:

  • Each member can attempt to do a local-impact change to the code base.

    Objective: To familiarize yourself with at least one component of the product.

    Description: Divide the components among yourselves. Each member can do some small enhancements to their component(s) to learn the code of that component. Some suggested enhancements are given in the AddressBook-Level4 developer guide.

    Submission: Create PRs from your own fork to your team repo. Get it merged by following your team's workflow.

  • Adjust process rigor to suit your team's pace, as explained in the panel below.

Relevant: [Admin Appendix E(extract): Workflow (after v1.1) ]

 

After completing v1.1, you can adjust process rigor to suit your team's pace, as explained below.

  • Reduce automated tests: Automated tests have benefits, but they can be a pain to write/maintain; GUI tests are especially hard to maintain because their behavior can sometimes depend on things such as the OS, resolution, etc.
    It is OK to get rid of some of the troublesome tests and rely more on manual testing instead. The less automated tests you have, the higher the risk of regressions; but it may be an acceptable trade-off under the circumstances if tests are slowing you down too much.
    There is no direct penalty for removing GUI tests. Also note our expectation on test code.

  • Reduce automated checks: You can also reduce the rigor of checkstyle checks to expedite PR processing.

  • Switch to a lighter workflow: While the forking workflow is the safest, it is also rather heavy. You can switch to a simpler workflow if the forking workflow is slowing you down. Refer to the textbook to find out more about alternative workflows: branching workflow, centralized workflow. However, we still recommend that you use PR reviews, at least for PRs affecting others' features. A sketch of the branching workflow is given below.
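In the branching workflow, for example, everyone pushes branches directly to the team repo instead of going through forks. A minimal sketch, assuming the remote origin points to the team repo; the repo path and branch name below are placeholders:

    # Clone the team repo (once) and create a branch for each piece of work
    git clone https://github.com/{your-org}/{your-repo}.git
    git checkout -b 1234-fix-overdue-bug
    # ... edit, commit ...
    git push origin 1234-fix-overdue-bug
    # Then open a PR within the same repo (branch -> master) so the change can still be reviewed.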

You can also increase the rigor/safety of your workflow in the following ways:

  • Use GitHub's Protected Branches feature to protect your master branch against rogue PRs.
 
  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests.
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you do it the normal way, you often find that it is hard to test the functional code because the code has low testability.

  • Adjust project plan if necessary. Now that you have some idea about the code base, revisit the feature release plan and adjust it if necessary.

  • Set up the issue tracker as described in the panel below, if you haven't done so already.

Relevant: [Admin Appendix E(extract): Setting up the issue tracker ]

 

Issue tracker setup

We recommend you configure the issue tracker of the main repo as follows:

  • Delete existing labels and add the following labels.
    💡 Issue type labels are useful from the beginning of the project. The other labels are needed only when you start implementing the features.

Issue type labels:

  • type.Epic : A big feature which can be broken down into smaller stories e.g. search
  • type.Story : A user story
  • type.Enhancement: An enhancement to an existing story
  • type.Task : Something that needs to be done, but not a story, bug, or an epic. e.g. Move testing code into a new folder
  • type.Bug : A bug

Status labels:

  • status.Ongoing : The issue is currently being worked on. note: remove this label before closing an issue.

Priority labels:

  • priority.High : Must do
  • priority.Medium : Nice to have
  • priority.Low : Unlikely to do

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • Create the following milestones: v1.0, v1.1, v1.2, v1.3, v1.4

  • You may configure other project settings as you wish. e.g. more labels, more milestones
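If you prefer to script the label/milestone setup described above instead of clicking through the GitHub web UI, the GitHub REST API (v3) can be used. The sketch below is only an illustration; {owner}, {repo}, and {token} are placeholders, and the color value is an arbitrary example.

    # Create one label (repeat for each label listed above)
    curl -X POST https://api.github.com/repos/{owner}/{repo}/labels \
         -H "Authorization: token {token}" \
         -d '{"name": "type.Story", "color": "00bfff"}'
    # Milestones can be created the same way
    curl -X POST https://api.github.com/repos/{owner}/{repo}/milestones \
         -H "Authorization: token {token}" \
         -d '{"title": "v1.1"}'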

  • Start proper schedule tracking and milestone management as explained in the panel below.

Relevant: [Admin Appendix E(extract): Project schedule tracking ]

 

Project Schedule Tracking

In general, use the issue tracker (Milestones, Issues, PRs, Tags, Releases, and Labels) for assigning, scheduling, and tracking all noteworthy project tasks, including user stories. Update the issue tracker regularly to reflect the current status of the project. You can also use GitHub's Projects feature to manage the project, but keep it linked to the issue tracker as much as you can.

Using Issues:

During the initial stages (latest by the start of v1.2):

  • Record each of the user stories you plan to deliver as an issue in the issue tracker. e.g. Title: As a user I can add a deadline
    Description: ... so that I can keep track of my deadlines

  • Assign the type.* and priority.* labels to those issues.

  • Formalize the project plan by assigning relevant issues to the corresponding milestone.

From milestone v1.2:

  • Define project tasks as issues. When you start implementing a user story (or a feature), break it down into smaller tasks if necessary. Define reasonably sized, standalone tasks. Create issues for each of those tasks so that they can be tracked. e.g.,

    • A typical task should be doable by one person, in a few hours.

      • Bad (reasons: not a one-person task, not small enough): Write the Developer Guide
      • Good: Update class diagram in the Developer Guide for v1.4
    • There is no need to break things into VERY small tasks. Keep them as big as possible, but they should be no bigger than what you are going to assign a single person to do within a week. e.g.,

      • Bad: Implementing the parser (reason: too big).
      • Good: Implementing parser support for adding of floating tasks
    • Do not track things taken for granted. e.g., push code to repo should not be a task to track. In the example given under the previous point, it is taken for granted that the owner will also (a) test the code and (b) push to the repo when it is ready. Those two need not be tracked as separate tasks.

    • Write a descriptive title for the issue. e.g. Add support for the 'undo' command to the parser

      • Omit redundant details. In some cases, the issue title is enough to describe the task. In that case, no need to repeat it in the issue description. There is no need for well-crafted and detailed descriptions for tasks. A minimal description is enough. Similarly, labels such as priority can be omitted if you think they don't help you.

  • Assign tasks (i.e., issues) to the corresponding team members using the assignees field. Normally, there should be some ongoing tasks and some pending tasks against each team member at any point.

  • Optionally, you can use the status.Ongoing label to indicate issues currently ongoing.

Using Milestones:

We recommend you do proper milestone management starting from v1.2. Given below are the conditions to satisfy for a milestone to be considered properly managed:

Planning a Milestone:

  • Issues assigned to the milestone, team members assigned to issues: Use GitHub milestones to indicate which issues are to be handled for which milestone by assigning issues to suitable milestones. Also make sure those issues are assigned to team members. Note that you can change the milestone plan along the way as necessary.

  • Deadline set for the milestones (in the GitHub milestone). Your internal milestones can be set earlier than the deadlines we have set, to give you a buffer.

Wrapping up a Milestone:

  • A working product tagged with the correct tag (e.g. v1.2) is pushed to the main repo
    or a product release is done on GitHub (see the command sketch at the end of this list). A product release is optional for v1.2 but required from v1.3. Click here to see an example release.

  • All tests passing on Travis for the version tagged/released.

  • Milestone updated to match the product i.e. all issues completed and PRs merged for the milestone should be assigned to the milestone. Incomplete issues/PRs should be moved to a future milestone.

  • Milestone closed.

  • If necessary, future milestones are revised based on what you experienced in the current milestone  e.g. if you could not finish all issues assigned to the current milestone, it is a sign that you overestimated how much you can do in a week, which means you might want to reduce the issues assigned to future milestones to match that observation.
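A minimal sketch of the tagging/release step mentioned above; the remote name upstream (pointing to the main repo) is an assumption, so adjust it to your own setup:

    # Tag the commit that represents the milestone and push the tag to the main repo
    git tag -a v1.2 -m "v1.2"
    git push upstream v1.2
    # For a GitHub release (required from v1.3), create a release from this tag via the
    # 'Releases' tab of the main repo and attach the jar file.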

Product:

  • From v1.2 onwards each member is expected to contribute some code to each milestone, preferably each week; only merged code is considered as contributions (Reason) .
    If an enhancement is too big to complete in one milestone, try to deliver it in smaller incremental steps e.g. deliver a basic version of the enhancement first.


Project: v1.2 [week 9]

Move code towards v2.0 in small steps, start documenting design/implementation details in DG.

v1.2 Summary of Milestone

Milestone | Minimum acceptable performance to consider as 'reached'
Contributed code to the product as described in mid-v1.2 progress guide | some code merged
Described implementation details in the Developer Guide | some text and some diagrams added to the developer guide (at least in a PR), comprising at least one page worth of content
Issue tracker set up | As explained in [Admin Appendix E: GitHub: Issue Tracker Setup].
v1.2 managed using GitHub features (issue tracker, milestones, etc.) | Milestone v1.2 managed as explained in [Admin Appendix E: GitHub: Project Schedule Tracking].
 


v1.2 Project Management

  • Manage the milestone v1.2 as explained in [Admin Appendix E: GitHub: Project Schedule Tracking].

v1.2 Product

  • Merge some code into the master branch of your team repo.

v1.2 Documentation

  • User Guide: Update as necessary.

    • If a feature has been released in this version, remove the Coming in v2.0 annotation from that feature. Also replace UI mock-ups with actual screenshots.
    • If a feature design has changed, update the descriptions accordingly.
  • Developer Guide:

    • Each member should describe the implementation of at least one enhancement she has added (or is planning to add).
      Expected length: 1+ page per person
    • The description can contain things such as,
      • How the feature is implemented.
      • Why it is implemented that way.
      • Alternatives considered.
    • The stated objective is to explain the implementation to a future developer, but a hidden objective is to show evidence that you can document deeply-technical content using prose, examples, diagrams, code snippets, etc. appropriately. To that end, you may also describe features that you plan to implement in the future, even beyond v1.4 (hypothetically).
    • For an example, see the description of the undo/redo feature implementation in the AddressBook-Level4 developer guide.

v1.2 Demo

Do an informal demo of the new feature during the tutorial. To save time, we recommend that one member demos all new features, using the commit tagged as v1.2 in the master branch  i.e. only features included in the current release should be demoed.
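One way to get exactly that version for the demo (a sketch; assumes your team kept AB4's Gradle setup, otherwise run the product however your build is set up):

    # Check out the commit tagged v1.2 and run the product from it
    git checkout v1.2
    ./gradlew run
    # Afterwards, return to the latest code
    git checkout master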



Project: mid-v1.3 [week 10]

Continue to enhance features. Make code RepoSense-compatible. Try doing a proper release.

Project Management:

Ensure your code is RepoSense-compatible, as explained below:

In previous semesters we asked students to annotate all their code using special @@author tags so that we can extract each student's code for grading. This semester, we are trying out a new tool called RepoSense that is expected to reduce the need for such tagging, and also make it easier for you to see (and learn from) code written by others.

Figure: RepoSense Report Features

1. View the current status of code authorship data:

  • The report generated by the tool is available at Project Code Dashboard (BETA). The feature that is most relevant to you is the Code Panel (shown on the right side of the screenshot above). It shows the code attributed to a given author. You are welcome to play around with the other features (they are still under development and will not be used for grading this semester).
  • Click on your name to load the code attributed to you (based on Git blame/log data) onto the code panel on the right.
  • If the code shown roughly matches the code you wrote, all is fine and there is nothing for you to do.

2. If the code does not match:

  • Here are the possible reasons for the code shown not to match the code you wrote:

    • the git username in some of your commits does not match your GitHub username (perhaps you missed our instructions to set your Git username to match GitHub username earlier in the project, or GitHub did not honor your Git username for some reason)
    • the actual authorship does not match the authorship determined by git blame/log e.g., another student touched your code after you wrote it, and Git log attributed the code to that student instead
  • In those cases,

    • Install RepoSense (see the Getting Started section of the RepoSense User Guide)
    • Use the two methods described in the RepoSense User Guide section Configuring a Repo to Provide Additional Data to RepoSense to provide additional data to the authorship analysis to make it more accurate.
    • If you add a config.json file to your repo (as specified by one of the two methods),
      • Please use the template json file given in the module website so that your display name matches the name we expect it to be.
      • If your commits have multiple author names, specify all of them e.g., "authorNames": ["theMyth", "theLegend", "theGary"]
      • Update the config.json line in the .gitignore file of your repo to /config.json so that it ignores the config.json produced by the app but not _reposense/config.json.
    • If you add @@author annotations, please follow the guidelines below:

Adding @@author tags to indicate authorship

  • Mark your code with a //@@author {yourGithubUsername}. Note the double @.
    The //@@author tag indicates the beginning of the code you wrote. The code up to the next //@@author tag or the end of the file (whichever comes first) will be considered as written by that author. Here is a sample code file:

    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author sarahkhoo
    method 3 ...
    //@@author johndoe
    method 4 ...
    
  • If you don't know who wrote the code segment below yours, you may put an empty //@@author (i.e. no GitHub username) to indicate the end of the code segment you wrote. The author of code below yours can add the GitHub username to the empty tag later. Here is a sample code with an empty author tag:

    method 0 ...
    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author
    method 3 ...
    method 4 ...
    
  • The author tag syntax varies based on file type e.g. for java, css, fxml. Use the corresponding comment syntax for non-Java files.
    Here is an example code from an xml/fxml file.

    <!-- @@author sereneWong -->
    <textbox>
      <label>...</label>
      <input>...</input>
    </textbox>
    ...
    
  • Do not put the //@@author inside java header comments.
    👎

    /**
      * Returns true if ...
      * @@author johndoe
      */
    

    👍

    //@@author johndoe
    /**
      * Returns true if ...
      */
    

What to and what not to annotate

  • Annotate both functional and test code. There is no need to annotate documentation files.

  • Annotate only significantly sized code blocks that can be reviewed on their own  e.g., a class, a sequence of methods, a method.
    Claiming credit for code blocks smaller than a method is discouraged but allowed. If you do, do it sparingly and only claim meaningful blocks of code such as a block of statements, a loop, or an if-else statement.

    • If an enhancement required you to do tiny changes in many places, there is no need to annotate all those tiny changes; you can describe those changes in the Project Portfolio page instead.
    • If a code block was touched by more than one person, either let the person who wrote most of it (e.g. more than 80%) take credit for the entire block, or leave it as 'unclaimed' (i.e., no author tags).
    • Related to the above point, if you claim a code block as your own, more than 80% of the code in that block should have been written by yourself. For example, no more than 20% of it can be code you reused from somewhere.
    • 💡 GitHub has a blame feature and a history feature that can help you determine who wrote a piece of code.
  • Do not try to boost the quantity of your contribution using unethical means such as duplicating the same code in multiple places. In particular, do not copy-paste test cases to create redundant tests. Even repetitive code blocks within test methods should be extracted out as utility methods to reduce code duplication. Individual members are responsible for making sure code attributed to them is correct. If you notice a team member claiming credit for code that he/she did not write or using other questionable tactics, you can email us (after the final submission) to let us know.

  • If you wrote a significant amount of code that was not used in the final product,

    • Create a folder called {project root}/unused
    • Move unused files (or copies of files containing unused code) to that folder
    • use //@@author {yourGithubUsername}-unused to mark unused code in those files (note the suffix unused) e.g.
    //@@author johndoe-unused
    method 1 ...
    method 2 ...
    

    Please put a comment in the code to explain why it was not used.

  • If you reused code from elsewhere, mark such code as //@@author {yourGithubUsername}-reused (note the suffix reused) e.g.

    //@@author johndoe-reused
    method 1 ...
    method 2 ...
    
  • You can use empty @@author tags to mark code as not yours when RepoSense attributes code to you incorrectly.

    • Code generated by the IDE/framework should not be annotated as your own.

    • Code you modified only in minor ways e.g. adding a parameter. These should not be claimed as yours, but you can mention these additional contributions in the Project Portfolio page if you want to claim credit for them.

  • After you are satisfied with the new results (i.e., results produced by running RepoSense locally), push the config.json file you added and/or the annotated code to your repo. We'll use that information the next time we run RepoSense (we run it at least once a week).
  • If you choose to annotate code, please annotate code chunks not smaller than a method. We do not grade code snippets smaller than a method.
  • If you encounter any problem when doing the above or if you have questions, please post in the forum.

We recommend you ensure your code is RepoSense-compatible by v1.3

Product:

  • Do a proper product release as described in the Developer Guide. You can name it something like v1.2.1. Ensure that the jar file works as expected by doing some manual testing. Reason: You are required to do a proper product release for v1.3. Doing a trial at this point will help you iron out any problems in advance. It may take additional effort to get the jar working especially if you use third party libraries or additional assets such as images.
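A quick smoke test of the jar could look like the sketch below; the Gradle task and output location are assumptions based on AB4's default build setup, so check your own build.gradle / Developer Guide if your team has changed it.

    # Build the jar and launch it outside the IDE
    ./gradlew clean shadowJar
    cd build/libs
    java -jar *.jar
    # Try a few commands manually, especially features that use images or other assets.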

Documentation:

  • User Guide: Update where the document does not match the current product.
  • Developer Guide: Similar to the User Guide.


Project: v1.3 [week 11]

Release as a jar file, release updated user guide, peer-test released products, verify code authorship.

v1.3 Summary of Milestone

Milestone | Minimum acceptable performance to consider as 'reached'
Contributed code to v1.3 | code merged
Code is RepoSense-compatible | as stated in mid-v1.3
v1.3 jar file released on GitHub | as stated
v1.3 milestone properly wrapped up on GitHub | as stated
Documentation updated to match v1.3 | at least the User Guide and the README.adoc are updated

v1.3 Project Management

Ensure your code is RepoSense-compatible, as explained in mid-v1.3.

 


v1.3 Product

v1.3 Documentation

The v1.3 user guide should be updated to match the current version of the product. Reason: v1.3 will be subjected to a trial acceptance testing session.

  • README page: Update to look like a real product (rather than a project for learning SE) if you haven't done so already. In particular, update the Ui.png to match the current product.

  • User Guide: This document will be used by acceptance testers. Update to match the current version. In particular,

    • Clearly indicate which features are not implemented yet e.g. tag those features with a Coming in v2.0.
    • For those features already implemented, ensure their descriptions match the exact behavior of the product e.g. replace mockups with actual screenshots
  • Developer Guide: As before, update if necessary.

  • AboutUs page: Update to reflect current state of roles and responsibilities.

Submission: Must be included in the version tagged v1.3.

v1.3 Demo

  • Do a quick demo of the main features using the jar file. Objective: demonstrate that the jar file works.

v1.3 Testing (aka Practical Exam Dry Run)

See info in the panel below:

Relevant: [Admin Project Deliverables → Practical Exam - Dry Run ]

 

What: The v1.3 release is subjected to a round of peer acceptance/system testing, also called the Practical Exam Dry Run, as this round of testing will be similar to the graded Practical Exam that will be done at v1.4.

When, where: uses a 30 minute slot at the start of week 11 lecture

 

Objectives:

  • Evaluate your manual testing skills, product evaluation skills, effort estimation skills
  • Peer-evaluate your product design, implementation effort, documentation quality

When, where: Week 13 lecture

Grading:

  • Your performance in the practical exam will be considered for your final grade (under the QA category and under Implementation category, about 10 marks in total).
  • You will be graded based on your effectiveness as a tester (e.g., the percentage of the bugs you found, the nature of the bugs you found) and how far off your evaluation/estimates are from the evaluator consensus. Explanation: we understand that you have limited expertise in this area; hence, we penalize only if your inputs don't seem to be based on a sincere effort to test/evaluate.
  • The bugs found in your product by others will affect your v1.4 marks. You will be given a chance to reject false-positive bug reports.

Preparation:

  • Ensure that you can access the relevant issue tracker given below:
    -- for PE Dry Run (at v1.3): nus-cs2103-AY1819S1/pe-dry-run
    -- for PE (at v1.4): nus-cs2103-AY1819S1/pe (will open only near the actual PE)

  • Ensure you have access to a computer that is able to run module projects  e.g. has the right Java version.

  • Have a good screen grab tool with annotation features so that you can quickly take a screenshot of a bug, annotate it, and post in the issue tracker.

    • 💡 You can use Ctrl+V to paste a picture from the clipboard into a text box in GitHub issue tracker.
  • Charge your computer before coming to the PE session. The testing venue may not have enough charging points.

During:

  1. Take note of your team to test. It will be given to you by the teaching team (distributed via IVLE gradebook).
  2. Download from IVLE all files submitted by the team (i.e. jar file, User Guide, Developer Guide, and Project Portfolio Pages) into an empty folder.
  3. [~40 minutes] Test the product and report bugs as described below:
Testing instructions for PE and PE Dry Run
  • What to test:

    • PE Dry Run (at v1.3):
      • Test the product based on the User Guide (the UG is most likely accessible using the help command).
      • Do system testing first i.e., does the product work as specified by the documentation? If there is time left, you can do acceptance testing as well i.e., does the product solve the problem it claims to solve?
    • PE (at v1.4):
      • Test based on the Developer Guide (Appendix named Instructions for Manual Testing) and the User Guide. The testing instructions in the Developer Guide can provide you some guidance but if you follow those instructions strictly, you are unlikely to find many bugs. You can deviate from the instructions to probe areas that are more likely to have bugs.
      • Do system testing only i.e., verify actual behavior against documented behavior. Do not do acceptance testing.
  • What not to test:

    • Omit features that are driven by GUI inputs (e.g. buttons, menus, etc.) Reason: Only CLI-driven features can earn credit, as per the given project constraints. Some features might have both GUI-driven and CLI-driven ways to invoke them, in which case test only the CLI-driven way of invoking them.
    • Omit features that existed in AB-4.
  • These are considered bugs:

    • Behavior differs from the User Guide
    • A legitimate user behavior is not handled e.g. incorrect commands, extra parameters
    • Behavior is not specified and differs from normal expectations e.g. error message does not match the error
    • Problems in the User Guide e.g., missing/incorrect info
  • Where to report bugs: Post bugs in the relevant issue tracker listed under 'Preparation' above (not in the team's repo).

  • Bug report format:

    • Post bugs as you find them (i.e., do not wait to post all bugs at the end) because the issue tracker will close exactly at the end of the allocated time.
    • Do not use team ID in bug reports. Reason: to prevent others copying your bug reports
    • Each bug should be a separate issue.
    • Write good quality bug reports; poor quality or incorrect bug reports will not earn credit.
    • Use a descriptive title.
    • Give a good description of the bug with steps to reproduce and screenshots.
    • Assign a severity to the bug report. Bug reports without a severity label are considered severity.Low (lower severity bugs earn lower credit):

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • About posting suggestions:

    • PE Dry Run (at v1.3): You can also post suggestions on how to improve the product. 💡 Be diplomatic when reporting bugs or suggesting improvements. For example, instead of criticising the current behavior, simply suggest alternatives to consider.
    • PE (at v1.4): Do not post suggestions.
  • If the product doesn't work at all: If the product fails catastrophically e.g., cannot even launch, you can test the fallback team allocated to you. But in this case you must inform us immediately after the session so that we can send your bug reports to the correct team.

  4. [~50 minutes] Evaluate the following aspects. Note down your evaluation in a hard copy (as a backup). Submit via TEAMMATES.

    • A. Cohesiveness of product features: Do the features fit together and match the stated target user and the value proposition?

      • unable to judge: You are unable to judge this aspect for some reason.
      • low: One of these
        • target user is too general  i.e. wider than AB4
        • target user and/or value proposition not clear from the user guide
        • features don't seem to fit together for the most part
      • medium: Some features fit together but some don't.
      • high: All features fit together but the features are not very high value to the target user.
      • excellent: The target user is clearly defined (not too general) and almost all new features are of high-value to the target user. i.e. the product is very attractive to the target user.
    • B. Quality of user docs: Evaluate based on the parts of the user guide written by the person, as reproduced in the project portfolio. Evaluate from an end-user perspective.

      • unable to judge: Less than 1 page worth of UG content written by the student.
      • low: Hard to understand, often inaccurate or missing important information.
      • medium: Needs some effort to understand; some information is missing.
      • high: Mostly easy to follow. Only a few areas need improvements.
      • excellent: Easy to follow and accurate. Just enough information, visuals, examples etc. (not too much either). Understandable to the target end user.
    • C. Quality of developer docs: Evaluate based on the developer docs cited/reproduced in the respective project portfolio page. Evaluate from the perspective of a new developer trying to understand how the features are implemented.

      • unable to judge: One of these
        • less than 0.5 pages worth of content.
        • other problems in the document  e.g. looks like included wrong content.
      • low: One of these
        • Very small amount of content (i.e., 0.5 - 1 page).
        • Hardly any use to the reader (i.e., content doesn't make much sense or redundant).
        • Uses ad-hoc diagrams where UML diagrams could have been used instead.
        • Multiple notation errors in UML diagrams.
      • medium: Some diagrams, some descriptions, but does not help the reader that much  e.g. overly complicated diagrams.
      • high: Enough diagrams (at least two kinds of UML diagrams used) and enough descriptions (about 2 pages worth) but explanations are not always easy to follow.
      • excellent: Easy to follow. Just enough information (not too much). Minimum repetition of content/diagrams. Good use of diagrams to complement text descriptions. Easy to understand diagrams with just enough details rather than very complicated diagrams that are hard to understand.
    • D. Depth of feature: Evaluate the feature done by the student for difficulty, depth, and completeness. Note: examples given below assume that AB4 did not have the commands edit, undo, and redo.

      • unable to judge: You are unable to judge this aspect for some reason.
      • low : An easy feature  e.g. make the existing find command case insensitive.
      • medium : Moderately difficult feature, barely acceptable implementation  e.g. an edit command that requires the user to type all fields, even the ones that are not being edited.
      • high: One of the below
        • A moderately difficult feature but fully implemented  e.g. an edit command that allows editing any field.
        • A difficult feature with a reasonable implementation but some aspects are not covered  e.g. an undo/redo command that only allows a single undo/redo.
      • excellent: A difficult feature, all reasonable aspects are fully implemented  e.g. an undo/redo command that allows multiple undo/redo.
    • E. Amount of work: Evaluate the amount of work, on a scale of 0 to 30.

      • Consider this PR (history command) as 5 units of effort, which means this PR (undo/redo command) is about 15 units of effort. Given that 30 units matches twice the effort needed for the undo/redo feature (which was given as an example of an A grade project), we expect most students to have efforts lower than 20.
      • Consider the main feature only. Exclude GUI inputs, but consider GUI outputs of the feature. Count all implementation/testing/documentation work as mentioned in that person's PPP. Also look at the actual code written by the person. We understand that it is not possible to know exactly which part of the code is for the main feature; make a best-guess judgement call based on the available info.
      • Do not give a high value just to be nice. If your estimate is wildly inaccurate, it means you are unable to estimate the effort required to implement a feature in a project that you are supposed to know well at this point. You will lose marks if that is the case.

Processing PE Bug Reports:

There will be a review period for you to respond to the bug reports you received.

Duration: The review period will start around 1 day after the PE (exact time to be announced) and will last until the following Wednesday midnight. However, you are recommended to finish this task ASAP, to minimize cutting into your exam preparation work.

We recommend that bug reviewing be done as a team, as some of the decisions need team consensus.

Instructions for Reviewing Bug Reports

  • First, don't freak out if there are a lot of bug reports. Many can be duplicates and some can be false positives. In any case, we anticipate that all of these products will have some bugs and our penalty for bugs is not harsh. Furthermore, it depends on the severity of the bug. Some bugs may not even be penalized.

  • Do not edit the subject or the description. Do not close bug reports. Your response (if any) should be added as a comment.

  • If the bug is reported multiple times, mark all copies EXCEPT one as duplicates using the duplicate tag (if the duplicates have different severity levels, you should keep the one with the highest severity). In addition, use this technique to indicate which issue they are duplicates of. Duplicates can be omitted from processing steps given below.

  • If a bug seems to be for a different product (i.e. wrongly assigned to your team), let us know (email prof).

  • Decide if it is a real bug and apply ONLY one of these labels.

Response Labels:

  • response.Accepted: You accept it as a bug.
  • response.Rejected: What the tester treated as a bug is in fact the expected behavior. The penalty for rejecting a bug using an unjustifiable explanation is higher than the penalty if the same bug was accepted. You can reject bugs that you inherited from AB4.
  • response.CannotReproduce: You are unable to reproduce the behavior reported in the bug after multiple tries.
  • response.IssueUnclear: The issue description is not clear.
  • If applicable, decide the type of bug. Bugs without a type- label are considered type-FunctionalityBug by default (and are liable to a heavier penalty):

Bug Type Labels:

  • type-FunctionalityBug : the bug is a flaw in how the product works.
  • type-DocumentationBug : the bug is in the documentation.
  • If you disagree with the original severity assigned to the bug, you may change it to the correct level, in which case add a comment justifying the change. All such changes will be double-checked by the teaching team and unreasonable lowering of severity will be penalized extra:

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • Decide who should fix the bug. Use the Assignees field to assign the issue to that person(s). There is no need to actually fix the bug though. It's simply an indication/acceptance of responsibility. If there is no assignee, we will distribute the penalty for that bug (if any) among all team members.

  • Add an explanatory comment explaining your choice of labels and assignees.

Grading: Taking part in the PE dry run is strongly encouraged as it can affect your grade in the following ways.

  • If the product you are allocated to test in the Practical Exam (at v1.4) had a very low bug count, we will consider your performance in PE dry run as well when grading the PE.
  • PE dry run will help you practice for the actual PE.
  • Taking part in the PE dry run will earn you participation points.
  • There is no penalty for bugs reported in your product. Every bug you find is a win-win for you and the team whose product you are testing.

Objectives:

  • To train you to do manual testing, bug reporting, bug triaging, bug fixing, communicating with users/testers/developers, evaluating products etc.
  • To help you improve your product before the final submission.

Preparation:

  • Ensure that you can access the relevant issue tracker given below:
    -- for PE Dry Run (at v1.3): nus-cs2103-AY1819S1/pe-dry-run
    -- for PE (at v1.4): nus-cs2103-AY1819S1/pe (will open only near the actual PE)

  • Ensure you have access to a computer that is able to run module projects  e.g. has the right Java version.

  • Have a good screen grab tool with annotation features so that you can quickly take a screenshot of a bug, annotate it, and post in the issue tracker.

    • 💡 You can use Ctrl+V to paste a picture from the clipboard into a text box in GitHub issue tracker.
  • Charge your computer before coming to the PE session. The testing venue may not have enough charging points.

During the session:

  1. Take note of your team to test. Distributed via IVLE gradebook and via email.
  2. Download the latest jar file from the team's GitHub page. Copy it to an empty folder.
  3. Confirm you are testing the allocated product by comparing the product UI with the UI screenshot sent via email.
Testing instructions for PE and PE Dry Run
  • What to test:

    • PE Dry Run (at v1.3):
      • Test the product based on the User Guide (the UG is most likely accessible using the help command).
      • Do system testing first i.e., does the product work as specified by the documentation? If there is time left, you can do acceptance testing as well i.e., does the product solve the problem it claims to solve?
    • PE (at v1.4):
      • Test based on the Developer Guide (Appendix named Instructions for Manual Testing) and the User Guide. The testing instructions in the Developer Guide can provide you some guidance but if you follow those instructions strictly, you are unlikely to find many bugs. You can deviate from the instructions to probe areas that are more likely to have bugs.
      • Do system testing only i.e., verify actual behavior against documented behavior. Do not do acceptance testing.
  • What not to test:

    • Omit features that are driven by GUI inputs (e.g. buttons, menus, etc.) Reason: Only CLI-driven features can earn credit, as per the given project constraints. Some features might have both GUI-driven and CLI-driven ways to invoke them, in which case test only the CLI-driven way of invoking them.
    • Omit features that existed in AB-4.
  • These are considered bugs:

    • Behavior differs from the User Guide
    • A legitimate user behavior is not handled e.g. incorrect commands, extra parameters
    • Behavior is not specified and differs from normal expectations e.g. error message does not match the error
    • Problems in the User Guide e.g., missing/incorrect info
  • Where to report bugs: Post bugs in the relevant issue tracker listed under 'Preparation' above (not in the team's repo).

  • Bug report format:

    • Post bugs as you find them (i.e., do not wait to post all bugs at the end) because the issue tracker will close exactly at the end of the allocated time.
    • Do not use team ID in bug reports. Reason: to prevent others copying your bug reports
    • Each bug should be a separate issue.
    • Write good quality bug reports; poor quality or incorrect bug reports will not earn credit.
    • Use a descriptive title.
    • Give a good description of the bug with steps to reproduce and screenshots.
    • Assign a severity to the bug report. Bug reports without a severity label are considered severity.Low (lower severity bugs earn lower credit):

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for them, i.e., it makes the product almost unusable for most users.
  • About posting suggestions:

    • PE Dry Run (at v1.3): You can also post suggestions on how to improve the product. 💡 Be diplomatic when reporting bugs or suggesting improvements. For example, instead of criticising the current behavior, simply suggest alternatives to consider.
    • PE (at v1.4): Do not post suggestions.
  • If the product doesn't work at all: If the product fails catastrophically e.g., cannot even launch, you can test the product of the fallback team allocated to you. But in this case you must inform us immediately after the session so that we can send your bug reports to the correct team.

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate your
      • contributions to the project
      • documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=github_username_in_lower_case (replace github_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other team's products,
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/github_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use the Asciidoc's include feature to include sections from the developer guide or the user guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section was written by someone else  e.g. someone else described the feature in the User Guide but you implemented the feature, clearly state that the section was written by someone else  (e.g. Start of Extract [from: User Guide] written by Jane Doe).  Reason: Your writing skills will be evaluated based on the PPP.

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The actual amount of content you can fit is less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content                               Page limit
      Overview + Summary of contributions   0.5-1
      Contributions to the User Guide       1-3
      Contributions to the Developer Guide  3-6
      Total                                 5-10

After the session:

  • We'll transfer the relevant bug reports to your repo over the weekend. Once you have received the bug reports for your product, it is up to you to decide whether you will act on reported issues before the final submission v1.4. For some issues, the correct decision could be to reject or postpone to a version beyond v1.4.
  • You can post in the issue thread to communicate with the tester e.g. to ask for more info, etc. However, the tester is not obliged to respond.
    • 💡 Do not argue with the issue reporter to try to convince that person that your way is correct/better. If at all, you can gently explain the rationale for the current behavior but do not waste time getting involved in long arguments. If you think the suggestion/bug is unreasonable, just thank the reporter for their view and close the issue.



Project: mid-v1.4 [week 12]

Tweak as per peer-testing results, draft Project Portfolio Page, practice product demo.

Project Management:

  • Freeze features around this time. Ensure the current product has all the features you intend to release at v1.4. Adding major changes after this point is risky. The remaining time is better spent fixing problems discovered late or on fine-tuning the product.
  • Ensure the code attributed to you by RepoSense is correct, as reported in the Project Activity Dashboard

Relevant: [Admin Tools → Using RepoSense ]

 

In previous semesters we asked students to annotate all their code using special @@author tags so that we can extract each student's code for grading. This semester, we are trying out a new tool called RepoSense that is expected to reduce the need for such tagging, and also make it easier for you to see (and learn from) code written by others.

Figure: RepoSense Report Features

1. View the current status of code authorship data:

  • The report generated by the tool is available at Project Code Dashboard (BETA). The feature that is most relevant to you is the Code Panel (shown on the right side of the screenshot above). It shows the code attributed to a given author. You are welcome to play around with the other features (they are still under development and will not be used for grading this semester).
  • Click on your name to load the code attributed to you (based on Git blame/log data) onto the code panel on the right.
  • If the code shown roughly matches the code you wrote, all is fine and there is nothing for you to do.

2. If the code does not match:

  • Here are the possible reasons for the code shown not to match the code you wrote:

    • the git username in some of your commits does not match your GitHub username (perhaps you missed our instructions to set your Git username to match GitHub username earlier in the project, or GitHub did not honor your Git username for some reason)
    • the actual authorship does not match the authorship determined by git blame/log e.g., another student touched your code after you wrote it, and Git log attributed the code to that student instead
  • In those cases,

    • Install RepoSense (see the Getting Started section of the RepoSense User Guide)
    • Use the two methods described in the RepoSense User Guide section Configuring a Repo to Provide Additional Data to RepoSense to provide additional data to the authorship analysis to make it more accurate.
    • If you add a config.json file to your repo (as specified by one of the two methods),
      • Please use the template json file given in the module website so that your display name matches the name we expect it to be.
      • If your commits have multiple author names, specify all of them e.g., "authorNames": ["theMyth", "theLegend", "theGary"]
      • Change the line config.json in the .gitignore file of your repo to /config.json so that Git ignores the config.json produced by the app but not the _reposense/config.json.
    • If you add @@author annotations, please follow the guidelines below:

Adding @@author tags to indicate authorship

  • Mark your code with a //@@author {yourGithubUsername}. Note the double @.
    The //@@author tag indicates the beginning of the code you wrote. The code up to the next //@@author tag or the end of the file (whichever comes first) will be considered as written by that author. Here is a sample code file:

    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author sarahkhoo
    method 3 ...
    //@@author johndoe
    method 4 ...
    
  • If you don't know who wrote the code segment below yours, you may put an empty //@@author (i.e. no GitHub username) to indicate the end of the code segment you wrote. The author of code below yours can add the GitHub username to the empty tag later. Here is a sample code with an empty author tag:

    method 0 ...
    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author
    method 3 ...
    method 4 ...
    
  • The author tag syntax varies based on file type e.g. for java, css, fxml. Use the corresponding comment syntax for non-Java files.
    Here is an example code from an xml/fxml file.

    <!-- @@author sereneWong -->
    <textbox>
      <label>...</label>
      <input>...</input>
    </textbox>
    ...
    
  • Do not put the //@@author inside java header comments.
    👎

    /**
      * Returns true if ...
      * @@author johndoe
      */
    

    👍

    //@@author johndoe
    /**
      * Returns true if ...
      */
    

What to and what not to annotate

  • Annotate both functional and test code. There is no need to annotate documentation files.

  • Annotate only code blocks of significant size that can be reviewed on their own  e.g., a class, a sequence of methods, a method.
    Claiming credit for code blocks smaller than a method is discouraged but allowed. If you do, do it sparingly and only claim meaningful blocks of code such as a block of statements, a loop, or an if-else statement.

    • If an enhancement required you to do tiny changes in many places, there is no need to annotate all those tiny changes; you can describe those changes in the Project Portfolio page instead.
    • If a code block was touched by more than one person, either let the person who wrote most of it (e.g. more than 80%) take credit for the entire block, or leave it as 'unclaimed' (i.e., no author tags).
    • Related to the above point, if you claim a code block as your own, more than 80% of the code in that block should have been written by yourself. For example, no more than 20% of it can be code you reused from somewhere.
    • 💡 GitHub has a blame feature and a history feature that can help you determine who wrote a piece of code.
  • Do not try to boost the quantity of your contribution using unethical means such as duplicating the same code in multiple places. In particular, do not copy-paste test cases to create redundant tests. Even repetitive code blocks within test methods should be extracted out as utility methods to reduce code duplication. Individual members are responsible for making sure the code attributed to them is correct. If you notice a team member claiming credit for code that he/she did not write or using other questionable tactics, you can email us (after the final submission) to let us know.

  • If you wrote a significant amount of code that was not used in the final product,

    • Create a folder called {project root}/unused
    • Move unused files (or copies of files containing unused code) to that folder
    • Use //@@author {yourGithubUsername}-unused to mark unused code in those files (note the suffix unused) e.g.
    //@@author johndoe-unused
    method 1 ...
    method 2 ...
    

    Please put a comment in the code to explain why it was not used.

  • If you reused code from elsewhere, mark such code as //@@author {yourGithubUsername}-reused (note the suffix reused) e.g.

    //@@author johndoe-reused
    method 1 ...
    method 2 ...
    
  • You can use empty @@author tags to mark code as not yours when RepoSense attributes it to you incorrectly.

    • Code generated by the IDE/framework should not be annotated as your own.

    • Code you modified only in minor ways e.g. adding a parameter. These should not be claimed as yours, but you can mention such additional contributions in the Project Portfolio Page if you want to claim credit for them.

  • After you are satisfied with the new results (i.e., results produced by running RepoSense locally), push the config.json file you added and/or the annotated code to your repo. We'll use that information the next time we run RepoSense (we run it at least once a week).
  • If you choose to annotate code, please annotate code chunks not smaller than a method. We do not grade code snippets smaller than a method.
  • If you encounter any problem when doing the above or if you have questions, please post in the forum.

We recommend you ensure your code is RepoSense-compatible by v1.3

Product:

  • Consider increasing code coverage by adding more tests if it is lower than the level you would like it to be. Take note of our expectation on test code.
  • After you have sufficient code coverage, fix remaining code quality problems and bring up the quality to your target level.
 
  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests.
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you do it the normal way, you often find that it is hard to test the functional code because the code has low testability.

Relevant: [Admin Project Assessment → Code Quality Tips ]

 
  • Ensure your code has at least some evidence of these (see here for more info; a small illustrative sketch is given after this list)

    • logging
    • exceptions
    • assertions
    • defensive coding
  • Ensure there are no coding standard violations  e.g. all boolean variables/methods should sound like booleans. Checkstyle can prevent only some coding standard violations; others need to be checked manually.

  • Ensure SLAP is applied at a reasonable level. Long methods or deeply-nested code are symptoms of low SLAP and may be counted against your code quality.

  • Reduce code duplication  i.e. if there are multiple blocks of code that vary only in minor ways, try to extract the similarities into one place, especially in test code.

  • In addition, try to apply the code quality guidelines covered in the module as much as you can.
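
  As an illustration (not a requirement) of the kind of evidence expected, here is a minimal Java sketch that shows logging, an exception, an assertion, and a defensive check together; the class and method names are made up for this example.

    import java.util.logging.Logger;

    public class DiscountCalculator {
        private static final Logger logger = Logger.getLogger(DiscountCalculator.class.getName());

        /** Returns the price after applying {@code rate}, which must be within [0, 1]. */
        public static double applyDiscount(double price, double rate) {
            // Defensive coding: reject invalid inputs instead of producing garbage.
            if (price < 0 || rate < 0 || rate > 1) {
                throw new IllegalArgumentException("Invalid price/rate: " + price + ", " + rate);
            }
            logger.fine("Applying discount rate " + rate + " to price " + price);

            double discounted = price * (1 - rate);
            // Assertion: documents (and checks, when assertions are enabled) an invariant we rely on.
            assert discounted <= price : "discounted price should not exceed the original price";
            return discounted;
        }
    }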

 

Code Quality

Introduction

Basic

Can explain the importance of code quality

Always code as if the person who ends up maintaining your code will be a violent psychopath who knows where you live. -- Martin Golding

Production code, i.e., code being used in an actual product with actual users, needs to be of high quality. Given how the world is becoming increasingly dependent on software, poor quality code is something we cannot afford to tolerate.

Guideline: Maximise Readability

Introduction

Can explain the importance of readability

Programs should be written and polished until they acquire publication quality. --Niklaus Wirth

Among various dimensions of code quality, such as run-time efficiency, security, and robustness, one of the most important is understandability. This is because in any non-trivial software project, code needs to be read, understood, and modified by other developers later on. Even if we do not intend to pass the code to someone else, code quality is still important because we all become 'strangers' to our own code someday.

The two code samples given below achieve the same functionality, but one is easier to read.

     

Bad

int subsidy() {
    int subsidy;
    if (!age) {
        if (!sub) {
            if (!notFullTime) {
                subsidy = 500;
            } else {
                subsidy = 250;
            }
        } else {
            subsidy = 250;
        }
    } else {
        subsidy = -1;
    }
    return subsidy;
}

  

Good

int calculateSubsidy() {
    int subsidy;
    if (isSenior) {
        subsidy = REJECT_SENIOR;
    } else if (isAlreadySubsidised) {
        subsidy = SUBSIDISED_SUBSIDY;
    } else if (isPartTime) {
        subsidy = FULLTIME_SUBSIDY * RATIO;
    } else {
        subsidy = FULLTIME_SUBSIDY;
    }
    return subsidy;
}

     

Bad

def calculate_subs():
    if not age:
        if not sub:
            if not not_fulltime:
                subsidy = 500
            else:
                subsidy = 250
        else:
            subsidy = 250
    else:
        subsidy = -1
    return subsidy

  

Good

def calculate_subsidy():
    if is_senior:
        return REJECT_SENIOR
    elif is_already_subsidised:
        return SUBSIDISED_SUBSIDY
    elif is_parttime:
        return FULLTIME_SUBSIDY * RATIO
    else:
        return FULLTIME_SUBSIDY

Basic

Avoid Long Methods

Can improve code quality using technique: avoid long methods

Be wary when a method is longer than the computer screen, and take corrective action when it goes beyond 30 LOC (lines of code). The bigger the haystack, the harder it is to find a needle.

Avoid Deep Nesting

Can improve code quality using technique: avoid deep nesting

If you need more than 3 levels of indentation, you're screwed anyway, and should fix your program. --Linux 1.3.53 CodingStyle

In particular, avoid arrowhead style code.

Example:
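
A minimal Java sketch of arrowhead style code, followed by a flatter equivalent (the condition and method names are made up for illustration):

Bad

if (isLoggedIn) {
    if (hasPermission) {
        if (isInputValid) {
            process(input);
        } else {
            showError("invalid input");
        }
    } else {
        showError("no permission");
    }
} else {
    showError("not logged in");
}

Good

if (!isLoggedIn) {
    showError("not logged in");
} else if (!hasPermission) {
    showError("no permission");
} else if (!isInputValid) {
    showError("invalid input");
} else {
    process(input);
}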

Avoid Complicated Expressions

Can improve code quality using technique: avoid complicated expressions

Avoid complicated expressions, especially those having many negations and nested parentheses. If you must evaluate complicated expressions, have it done in steps (i.e. calculate some intermediate values first and use them to calculate the final value).

Example:

Bad

return ((length < MAX_LENGTH) || (previousSize != length)) && (typeCode == URGENT);

Good


boolean isWithinSizeLimit = length < MAX_LENGTH;
boolean hasSizeChanged = previousSize != length;
boolean isValidCode = isWithinSizeLimit || hasSizeChanged;

boolean isUrgent = typeCode == URGENT;

return isValidCode && isUrgent;

Example:

Bad

return ((length < MAX_LENGTH) or (previous_size != length)) and (type_code == URGENT)

Good

is_within_size_limit = length < MAX_LENGTH
has_size_changed = previous_size != length
is_valid_code = is_within_size_limit or has_size_changed

is_urgent = type_code == URGENT

return is_valid_code and is_urgent

The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. -- Edsger Dijkstra

Avoid Magic Numbers

Can improve code quality using technique: avoid magic numbers

When the code has a number that does not explain the meaning of the number, we call that a magic number (as in “the number appears as if by magic”). Using a named constant makes the code easier to understand because the name tells us more about the meaning of the number.

Example:

     

Bad

return 3.14236;
...
return 9;

  

Good

static final double PI = 3.14236;
static final int MAX_SIZE = 10;
...
return PI;
...
return MAX_SIZE-1;

Note: Python does not have a way to make a variable a constant. However, you can use a normal variable with an ALL_CAPS name to simulate a constant.

     

Bad

return 3.14236
...
return 9

  

Good

PI = 3.14236
MAX_SIZE = 10
...
return PI
...
return MAX_SIZE-1

Similarly, we can have ‘magic’ values of other data types.

Bad

"Error 1432"  // A magic string!

Make the Code Obvious

Can improve code quality using technique: make the code obvious

Make the code as explicit as possible, even if the language syntax allows it to be implicit. Here are some examples:

  • [Java] Use explicit type conversion instead of implicit type conversion.
  • [Java, Python] Use parentheses/braces to show grouping even when they can be skipped.
  • [Java, Python] Use enumerations when a certain variable can take only a small number of finite values. For example, instead of declaring the variable 'state' as an integer and using values 0,1,2 to denote the states 'starting', 'enabled', and 'disabled' respectively, declare 'state' as type SystemState and define an enumeration SystemState that has values 'STARTING', 'ENABLED', and 'DISABLED'.
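
As a sketch of the last point above, compare an integer-coded state with an enumeration (the names are illustrative):

Bad

int state = 0; // 0 = starting, 1 = enabled, 2 = disabled -- the meaning lives only in this comment

Good

enum SystemState { STARTING, ENABLED, DISABLED }

SystemState state = SystemState.STARTING; // the meaning is explicit in the type and the value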

Intermediate

Structure Code Logically

Can improve code quality using technique: structure code logically

Lay out the code so that it adheres to the logical structure. The code should read like a story. Just like we use section breaks, chapters and paragraphs to organize a story, use classes, methods, indentation and line spacing in your code to group related segments of the code. For example, you can use blank lines to group related statements together. Sometimes, the correctness of your code does not depend on the order in which you perform certain intermediary steps. Nevertheless, this order may affect the clarity of the story you are trying to tell. Choose the order that makes the story most readable.
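
For example, blank lines and a logical ordering can make the 'story' of a method easier to follow; a minimal Java sketch (the names are made up):

// Gather the inputs.
String name = readName();
String email = readEmail();

// Validate them before use.
validateName(name);
validateEmail(email);

// Create and store the record.
Person person = new Person(name, email);
storage.save(person);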

Do Not 'Trip Up' Reader

Can improve code quality using technique: do not 'trip up' reader

Avoid things that would make the reader go ‘huh?’, such as,

  • unused parameters in the method signature
  • similar things look different
  • different things that look similar
  • multiple statements in the same line
  • data flow anomalies, such as pre-assigning a value to a variable and then overwriting it without ever using the pre-assigned value (see the sketch below)
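
A minimal sketch of the data flow anomaly mentioned in the last item (names are made up): the pre-assigned value is never used, which makes the reader wonder why it was assigned at all.

int total = 0;          // pre-assigned value ...
total = computeTotal(); // ... overwritten without ever being used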

Practice KISSing

Can improve code quality using technique: practice kissing

As the old adage goes, "keep it simple, stupid” (KISS). Do not try to write ‘clever’ code. For example, do not dismiss the brute-force yet simple solution in favor of a complicated one because of some ‘supposed benefits’ such as 'better reusability' unless you have a strong justification.

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. --Brian W. Kernighan

Programs must be written for people to read, and only incidentally for machines to execute. --Abelson and Sussman

Avoid Premature Optimizations

Can improve code quality using technique: avoid premature optimizations

Optimizing code prematurely has several drawbacks:

  • We may not know which parts are the real performance bottlenecks. This is especially the case when the code undergoes transformations (e.g. compiling, minifying, transpiling, etc.) before it becomes an executable. Ideally, you should use a profiler tool to identify the actual bottlenecks of the code first, and optimize only those parts.
  • Optimizing can complicate the code, affecting correctness and understandability
  • Hand-optimized code can be harder for the compiler to optimize (the simpler the code, the easier for the compiler to optimize it). In many cases a compiler can do a better job of optimizing the runtime code if you don't get in the way by trying to hand-optimize the source code.

A popular saying in the industry is make it work, make it right, make it fast which means in most cases getting the code to perform correctly should take priority over optimizing it. If the code doesn't work correctly, it has no value no matter how fast/efficient it is.

Premature optimization is the root of all evil in programming. --Donald Knuth

Note that there are cases where optimizing takes priority over other things e.g. when writing code for resource-constrained environments. This guideline is simply a caution that you should optimize only when it is really needed.

SLAP Hard

Can improve code quality using technique: SLAP hard

Avoid varying the level of abstraction within a code fragment. Note: The Productive Programmer (by Neal Ford) calls this the SLAP principle i.e. Single Level of Abstraction Per method.

Example:

Bad

readData();
salary = basic*rise+1000;
tax = (taxable?salary*0.07:0);
displayResult();

Good

readData();
processData();
displayResult();
 

Design → Design Fundamentals → Abstraction →

What

Abstraction is a technique for dealing with complexity. It works by establishing a level of complexity we are interested in, and suppressing the more complex details below that level.

The guiding principle of abstraction is that only details that are relevant to the current perspective or the task at hand need to be considered. As most programs are written to solve complex problems involving large amounts of intricate details, it is impossible to deal with all these details at the same time. That is where abstraction can help.

Ignoring lower level data items and thinking in terms of bigger entities is called data abstraction.

Within a certain software component, we might deal with a user data type, while ignoring the details contained in the user data item, such as name and date of birth. These details have been ‘abstracted away’ as they do not affect the task of that software component.

Control abstraction abstracts away details of the actual control flow to focus on tasks at a simplified level.

print(“Hello”) is an abstraction of the actual output mechanism within the computer.

Abstraction can be applied repeatedly to obtain progressively higher levels of abstractions.

An example of different levels of data abstraction: a File is a data item that is at a higher level than an array and an array is at a higher level than a bit.

An example of different levels of control abstraction: execute(Game) is at a higher level than print(Char), which is at a higher level than an Assembly language instruction MOV.

Abstraction is a general concept that is not limited to just data or control abstractions.

Some more general examples of abstraction:

  • An OOP class is an abstraction over related data and behaviors.
  • An architecture is a higher-level abstraction of the design of a software.
  • Models (e.g., UML models) are abstractions of some aspect of reality.

Advanced

Make the Happy Path Prominent

Can improve code quality using technique: make the happy path prominent

The happy path (i.e. the execution path taken when everything goes well) should be clear and prominent in your code. Restructure the code to make the happy path unindented as much as possible. It is the ‘unusual’ cases that should be indented. Someone reading the code should not get distracted by alternative paths taken when error conditions happen. One technique that could help in this regard is the use of guard clauses.

Example:

Bad

if (!isUnusualCase) {  //detecting an unusual condition
    if (!isErrorCase) {
        start();    //main path
        process();
        cleanup();
        exit();
    } else {
        handleError();
    }
} else {
    handleUnusualCase(); //handling that unusual condition
}

In the code above,

  • Unusual condition detection is separated from its handling.
  • The main path is nested deeply.

Good

if (isUnusualCase) { //Guard Clause
    handleUnusualCase();
    return;
}

if (isErrorCase) { //Guard Clause
    handleError();
    return;
}

start();
process();
cleanup();
exit();

In contrast, the above code

  • deals with unusual conditions as soon as they are detected so that the reader doesn't have to remember them for long.
  • keeps the main path un-indented.

Guideline: Follow a Standard

Introduction

Can explain the need for following a standard

One essential way to improve code quality is to follow a consistent style. That is why software engineers follow a strict coding standard (aka style guide).

The aim of a coding standard is to make the entire code base look like it was written by one person. A coding standard is usually specific to a programming language and specifies guidelines such as the location of opening and closing braces, indentation styles and naming styles (e.g. whether to use Hungarian style, Pascal casing, Camel casing, etc.). It is important that the whole team/company uses the same coding standard and that the standard is not generally inconsistent with typical industry practices. If a company's coding standard is very different from what is typically used in the industry, new recruits will take longer to get used to the company's coding style.

💡 IDEs can help to enforce some parts of a coding standard e.g. indentation rules.

What is the recommended approach regarding coding standards?

c

What is the aim of using a coding standard? How does it help?

Basic

Can follow simple mechanical style rules

Learn basic guidelines of the Java coding standard (by OSS-Generic)

Consider the code given below:

import java.util.*;

public class Task {
    public static final String descriptionPrefix = "description: ";
    private String description;
    private boolean important;
    List<String> pastDescription = new ArrayList<>(); // a list of past descriptions

    public Task(String d) {
      this.description = d;
      if (!d.isEmpty())
          this.important = true;
    }

    public String getAsXML() { return "<task>"+description+"</task>"; }

    /**
     * Print the description as a string.
     */
    public void printingDescription(){ System.out.println(this); }

    @Override
    public String toString() { return descriptionPrefix + description; }
}

In what ways does the code violate the basic guidelines (i.e., those marked with one ⭐️) of the OSS-Generic Java Coding Standard given here?

Here are three:

  • descriptionPrefix is a constant and should be named DESCRIPTION_PREFIX
  • method name printingDescription() should be named as printDescription()
  • boolean variable important should be named to sound boolean e.g., isImportant

There are many more.

Intermediate

Can follow intermediate style rules

Go through the provided Java coding standard and learn the intermediate style rules.

According to the given Java coding standard, which one of these is not a good name?

b

Explanation: checkWeight is an action. Naming variables as actions makes the code harder to follow. isWeightValid may be a better name.

Repeat the exercise in the panel below but also find violations of intermediate level guidelines.

Consider the code given below:

import java.util.*;

public class Task {
    public static final String descriptionPrefix = "description: ";
    private String description;
    private boolean important;
    List<String> pastDescription = new ArrayList<>(); // a list of past descriptions

    public Task(String d) {
      this.description = d;
      if (!d.isEmpty())
          this.important = true;
    }

    public String getAsXML() { return "<task>"+description+"</task>"; }

    /**
     * Print the description as a string.
     */
    public void printingDescription(){ System.out.println(this); }

    @Override
    public String toString() { return descriptionPrefix + description; }
}

In what ways does the code violate the basic guidelines (i.e., those marked with one ⭐️) of the OSS-Generic Java Coding Standard given here?

Here are three:

  • descriptionPrefix is a constant and should be named DESCRIPTION_PREFIX
  • method name printingDescription() should be named as printDescription()
  • boolean variable important should be named to sound boolean e.g., isImportant

There are many more.

Here's one you are more likely to miss:

  • The header comment Print the description as a string. should be Prints the description as a string.

There are more.

Guideline: Name Well

Introduction

Can explain the need for good names in code

Proper naming improves code readability. It also reduces bugs caused by ambiguities regarding the intent of a variable or a method.

There are only two hard things in Computer Science: cache invalidation and naming things. -- Phil Karlton

Basic

Use Nouns for Things and Verbs for Actions

Can improve code quality using technique: use nouns for things and verbs for actions

Every system is built from a domain-specific language designed by the programmers to describe that system. Functions are the verbs of that language, and classes are the nouns. ― Robert C. Martin, Clean Code: A Handbook of Agile Software Craftsmanship

Use nouns for classes/variables and verbs for methods/functions.

Examples:

Name for a   Bad          Good
Class        CheckLimit   LimitChecker
Method       result()     calculate()

Distinguish clearly between single-valued and multivalued variables.

Examples:

Good

Person student;
ArrayList<Person> students;

Good

student = Person('Jim')
students = [Person('Jim'), Person('Alice')]

Use Standard Words

Can improve code quality using technique: use standard words

Use correct spelling in names. Avoid 'texting-style' spelling. Avoid foreign language words, slang, and names that are only meaningful within specific contexts/times, e.g., terms from private jokes, or a TV show currently popular in your country.

Intermediate

Use Name to Explain

Can improve code quality using technique: use name to explain

A name is not just for differentiation; it should explain the named entity to the reader accurately and at a sufficient level of detail.

Examples:

Bad                                Good
processInput() (what 'process'?)   removeWhiteSpaceFromInput()
flag                               isValidInput
temp

If the name has multiple words, they should be in a sensible order.

Examples:

Bad             Good
bySizeOrder()   orderBySize()

Imagine going to the doctor's and saying "My eye1 is swollen"! Don’t use numbers or case to distinguish names.

Examples:

Bad              Bad            Good
value1, value2   value, Value   originalValue, finalValue

Not Too Long, Not Too Short

Can improve code quality using technique: not too long, not too short

While it is preferable not to have lengthy names, names that are 'too short' are even worse. If you must abbreviate or use acronyms, do it consistently. Explain their full meaning at an obvious location.
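
For example (a sketch; the abbreviation and names are hypothetical), if you shorten 'description' to 'desc', do it consistently and explain the abbreviation once in an obvious place:

/** In this class, 'desc' is short for 'description'. */
public class TaskFormatter {
    private String taskDesc;   // consistent: always 'desc'
    private String labelDesc;  // not 'labelDescription' here and 'desc' elsewhere
}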

Avoid Misleading Names

Can improve code quality using technique: avoid misleading names

Related things should be named similarly, while unrelated things should NOT.

Example: Consider these variables

  • colorBlack : hex value for color black
  • colorWhite : hex value for color white
  • colorBlue : number of times blue is used
  • hexForRed : hex value for color red

This is misleading because colorBlue is named similarly to colorWhite and colorBlack but has a different purpose, while hexForRed is named differently but has a very similar purpose to the first two variables. The following is better:

  • hexForBlack hexForWhite hexForRed
  • blueColorCount

Avoid misleading or ambiguous names (e.g. those with multiple meanings), similar sounding names, hard-to-pronounce ones (e.g. avoid ambiguities like "is that a lowercase L, capital I or number 1?", or "is that number 0 or letter O?"), almost similar names.

Examples:

Bad                   Good                                           Reason
phase0                phaseZero                                      Is that zero or the letter O?
rwrLgtDirn            rowerLegitDirection                            Hard to pronounce
right, left, wrong    rightDirection, leftDirection, wrongResponse   Is right for 'correct' or the opposite of 'left'?
redBooks, readBooks   redColorBooks, booksRead                       red and read (past tense) sound the same
FiletMignon           egg                                            If the requirement is just a name of a food, egg is a much easier to type/say choice than FiletMignon

Guideline: Avoid Unsafe Shortcuts

Introduction

Can explain the need for avoiding error-prone shortcuts

It is safer to use language constructs in the way they are meant to be used, even if the language allows shortcuts. Some such coding practices are common sources of bugs. Know them and avoid them.

Basic

Use the Default Branch

Can improve code quality using technique: use the default branch

Always include a default branch in case statements.

Furthermore, use it for the intended default action and not just to execute the last option. If there is no default action, you can use the 'default' branch to detect errors (i.e. if execution reached the default branch, throw an exception). This also applies to the final else of an if-else construct. That is, the final else should mean 'everything else', and not the final option. Do not use else when an if condition can be explicitly specified, unless there is absolutely no other possibility.

Bad

if (red) print "red";
else print "blue";

Good

if (red) print "red";
else if (blue) print "blue";
else error("incorrect input");
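
The same idea applies to a switch statement; a minimal Java sketch (the enum values and message are illustrative) where the default branch flags an unexpected value instead of silently acting as the last option:

switch (color) {
case RED:
    System.out.println("red");
    break;
case BLUE:
    System.out.println("blue");
    break;
default:
    // Reaching here means an unexpected value slipped through.
    throw new IllegalStateException("Unexpected color: " + color);
}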

Don't Recycle Variables or Parameters

Can improve code quality using technique: don't recycle variables or parameters

  • Use one variable for one purpose. Do not reuse a variable for a different purpose other than its intended one, just because the data type is the same.
  • Do not reuse formal parameters as local variables inside the method.

Bad

double computeRectangleArea(double length, double width) {
    length = length * width;
    return length;
}

Good

double computeRectangleArea(double length, double width) {
    double area;
    area = length * width;
    return area;
}

Avoid Empty Catch Blocks

Can improve code quality using technique: avoid empty catch blocks

Never write an empty catch statement. At least give a comment to explain why the catch block is left empty.
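
A minimal sketch of the difference (the exception type and comment are illustrative):

Bad

try {
    deleteTempFile();
} catch (IOException e) {
}

Good

try {
    deleteTempFile();
} catch (IOException e) {
    // Failing to delete the temp file is harmless here; the OS will clean it up later.
}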

Delete Dead Code

Can improve code quality using technique: delete dead code

We all feel reluctant to delete code we have painstakingly written, even if we have no use for that code any more ("I spent a lot of time writing that code; what if we need it again?"). Consider all code as baggage you have to carry; get rid of unused code the moment it becomes redundant. If you need that code again, simply recover it from the revision control tool you are using. Deleting code you wrote previously is a sign that you are improving.

Intermediate

Minimise Scope of Variables

Can improve code quality using technique: minimise scope of variables

Minimize global variables. Global variables may be the most convenient way to pass information around, but they do create implicit links between code segments that use the global variable. Avoid them as much as possible.

Define variables in the least possible scope. For example, if the variable is used only within the if block of the conditional statement, it should be declared inside that if block.

The most powerful technique for minimizing the scope of a local variable is to declare it where it is first used. -- Effective Java, by Joshua Bloch
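
A minimal Java sketch (the names are made up): declare the variable where it is first used, inside the narrowest block that needs it.

Bad

String message;                        // declared long before it is needed
...
if (isInvalid) {
    message = "invalid input";
    show(message);
}

Good

if (isInvalid) {
    String message = "invalid input";  // declared in the only block that uses it
    show(message);
}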


Minimise Code Duplication

Can improve code quality using technique: minimise code duplication

Code duplication, especially when you copy-paste-modify code, often indicates a poor quality implementation. While it may not be possible to have zero duplication, always think twice before duplicating code; most often there is a better alternative.

This guideline is closely related to the DRY Principle.
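
A minimal Java sketch of removing copy-paste-modify duplication by extracting the common part into a method (the names are made up):

Bad

printBorder();
System.out.println("Results");
printBorder();
...
printBorder();
System.out.println("Errors");
printBorder();

Good

printTitle("Results");
...
printTitle("Errors");

void printTitle(String title) {
    printBorder();
    System.out.println(title);
    printBorder();
}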

Supplementary → Principles →

DRY Principle

DRY (Don't Repeat Yourself) Principle: Every piece of knowledge must have a single, unambiguous, authoritative representation within a system. -- The Pragmatic Programmer, by Andy Hunt and Dave Thomas

This principle guards against duplication of information.

A functionality implemented twice is a violation of the DRY principle even if the two implementations are different.

The value of a system-wide timeout being defined in multiple places is a violation of DRY.

Guideline: Comment Minimally, but Sufficiently

Introduction

Can explain the need for commenting minimally but sufficiently

Good code is its own best documentation. As you’re about to add a comment, ask yourself, ‘How can I improve the code so that this comment isn’t needed?’ Improve the code and then document it to make it even clearer. --Steve McConnell, author of Code Complete

Some think commenting heavily increases the 'code quality'. This is not so. Avoid writing comments to explain bad code. Improve the code to make it self-explanatory.

Basic

Do Not Repeat the Obvious

Can improve code quality using technique: do not repeat the obvious

If the code is self-explanatory, refrain from repeating the description in a comment just for the sake of 'good documentation'.

Bad

// increment x
x++;

//trim the input
trimInput();

Write to the Reader

Can improve code quality using technique: write to the reader

Do not write comments as if they are private notes to self. Instead, write them well enough to be understood by another programmer. One type of comment that is almost always useful is the header comment that you write for a class or an operation to explain its purpose.

Examples:

Bad Reason: this comment will only make sense to the person who wrote it

// a quick trim function used to fix bug I detected overnight
void trimInput(){
    ....
}

Good

/** Trims the input of leading and trailing spaces */
void trimInput(){
    ....
}

Bad Reason: this comment will only make sense to the person who wrote it

# a quick trim function used to fix bug I detected overnight
def trim_input():
    ...

Good

def trim_input():
    """Trim the input of leading and trailing spaces"""
    ...

Intermediate

Explain WHAT and WHY, not HOW

Can improve code quality using technique: explain what and why, not how

Comments should explain what and why aspect of the code, rather than the how aspect.

What : The specification of what the code is supposed to do. The reader can compare such comments to the implementation to verify if the implementation is correct.

Example: This method is possibly buggy because the implementation does not seem to match the comment. In this case the comment could help the reader to detect the bug.

/** Removes all spaces from the {@code input} */
void compact(String input){
    input.trim();
}

Why : The rationale for the current implementation.

Example: Without this comment, the reader will not know the reason for calling this method.

// Remove spaces to comply with IE23.5 formatting rules
compact(input);

How : The explanation for how the code works. This should already be apparent from the code, if the code is self-explanatory. Adding comments to explain the same thing is redundant.

Example:

Bad Reason: Comment explains how the code works.

// return true if both left end and right end are correct or the size has not incremented
return (left && right) || (input.size() == size);

Good Reason: Code refactored to be self-explanatory. Comment no longer needed.


boolean isSameSize = (input.size() == size);
return (isLeftEndCorrect && isRightEndCorrect) || isSameSize;


Documentation:

  • Update documentation to match the product.

  • Create the first version of your Project Portfolio Page (PPP). Reason: Each member needs to create a PPP to describe your contribution to the project. Creating a PPP takes a significant effort; it is too risky to leave it to the last week of the project.

Relevant: [Admin Project → Deliverables → Project Portfolio Page ]

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate your
      • contributions to the project
      • documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=github_username_in_lower_case (replace github_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other team's products,
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/github_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use the Asciidoc's include feature to include sections from the developer guide or the user guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section was written by someone else  e.g. someone else described the feature in the User Guide but you implemented the feature, clearly state that the section was written by someone else  (e.g. Start of Extract [from: User Guide] written by Jane Doe).  Reason: Your writing skills will be evaluated based on the PPP.

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The actual amount of content you can fit is less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content                               Page limit
      Overview + Summary of contributions   0.5-1
      Contributions to the User Guide       1-3
      Contributions to the Developer Guide  3-6
      Total                                 5-10

Demo:

  • Do a product demo to serve as a rehearsal for the final project demo at v1.4
    • Follow final demo instructions as much as possible.
    • Cover all features, not just the ones added in the recent iteration.
    • Try to make it a 'well prepared' demo i.e., know in advance exactly what you'll do in the demo.
 
  • Duration: Strictly 18 minutes for a 5-person team and 15 minutes for a 4-person team. Exceeding this limit will be penalized. Any set up time will be taken out of your allocated time.

  • Target audience: Assume you are giving a demo to a higher-level manager of your company, to brief him/her on the current capabilities of the product. This is the first time they are seeing the new product you developed but they are familiar with the AddressBook-level4 (AB4) product. The actual audience are the evaluators (the team supervisor and another tutor).

  • Scope:

    • Each person should demo the enhancements they added. However, it's ok for one member to do all the typing.
    • Subject to the constraint mentioned in the previous point, as far as possible, organize the demo to present a cohesive picture of the product as a whole, in a logical order. Remember to explain the target user profile and value proposition early in the demo.
    • It is recommended you showcase how the feature improves the user’s life rather than simply describe each feature.
    • No need to cover design/implementation details as the manager is not interested in those details.
    • Mention features you inherited from AB4 only if they are needed to explain your new features.  Reason: existing features will not earn you marks, and the audience is already familiar with AB4 features.
    • Each person should demo their main feature only. You are free to 'tie-in' other work under the main feature, but anything that cannot be tied-in to the main feature should be omitted from the demo (as those are not graded, showing them to the evaluators will only make the evaluation harder). For similar reasons, do not demo GUI inputs (but you can demo GUI outputs).
    • We recommend each person start by giving an overview of their main feature before going into the details, so that the evaluator is informed of your main feature from the very start.
  • Structure:

    • Demo the product using the same executable you submitted, on your own laptop, using the TV.
    • It can be a sitting down demo: You'll be demonstrating the features using the TV while sitting down. But you may stand around the TV if you prefer that way.
    • It will be an uninterrupted demo: The audience members will not interrupt you during the demo. That means you should finish within the given time.
    • The demo should use a sufficient amount of realistic demo data, e.g., at least 20 contacts. Trying to demo a product using just 1-2 sample data items creates a bad impression.
    • Dress code : The level of formality is up to you, but it is recommended that the whole team dress at the same level.
  • Optimizing the time:

    • Spend as much time as possible on demonstrating the actual product. It is not recommended to use slides (if you do, use them sparingly), videos, or lengthy narrations.
      Avoid skits, re-enactments, dramatizations etc. This is not a sales pitch or an infomercial. While you need to show how a user uses the product to get value, you don’t need to act like an imaginary user. For example, [Instead of this] Jim gets a call from his boss. "Ring ring", "hello", "oh hi Jim, can we postpone the meeting?" "Sure". Jim hangs up and curses the boss under his breath. Now he starts typing ..etc. [Do this] If Jim needs to postpone the meeting, he can type … It’s not that dramatization is bad or that we don’t like it; we simply don’t have enough time for it.
      Note that CS2101 demo requirements may differ. Different context → Different requirements.
    • Rehearse the steps well and ensure you can do a smooth demo. Poor quality demos can affect your grade.
    • Don’t waste time repeating things the target audience already knows. e.g. no need to say things like "We are students from NUS, SoC".
    • Plan the demo to be in sync with the impression you want to create. For example, if you are trying to convince that the product is easy to use, show the easiest way to perform a task before you show the full command with all the bells and whistles.
  • Special circumstances:

    • If your main feature was not merged on time: inform the tutor and get permission to show the unmerged feature using your own version of the code. Obviously, unmerged features earn much less marks than a merged equivalent but something is better than nothing.
    • If you have no user-visible features to show, you can still contribute to the demo by giving an overview of the product (at the start) and/or giving a wrap-up of the product (at the end).
    • If you are unable to come to the demo due to a valid reason, you can ask a team member to demo your feature. Remember to submit the evidence of your excuse e.g., MC to prof. The demo is part of module assessment and absence without a valid reason will cause you to lose marks.


Project: v1.4 [week 13]

Final tweaks to docs/product, release product, demo product, evaluate peer projects.

Summary of submissions:

Item                     Name format                                                       Upload to
Source code              tag as v1.4                                                       GitHub
Jar file                 [team][product name].jar e.g. [W09-1][ContactsPlus].jar           IVLE
User Guide               [TEAM_ID][product Name]UserGuide.pdf
                         e.g. [W09-1][Contacts Plus]UserGuide.pdf                          IVLE
Developer Guide          [TEAM_ID][product Name]DeveloperGuide.pdf
                         e.g. [W09-1][Contacts Plus]DeveloperGuide.pdf                     IVLE
Product Website          README page, Ui.png, AboutUs page                                 github.io
Project Portfolio Page   [TEAM_ID][Your Name]Portfolio.pdf
                         e.g. [W09-1][John Doe]Portfolio.pdf                               IVLE
                         html version of the PPP page on the product website               github.io

Deadline for all v1.4 submissions is Week 13 Monday 23.59 unless stated otherwise.

  • Penalty for late submission: -1 mark for each hour delayed, up to 3 hours. Even a one-second delay is considered late, irrespective of the reason. The penalty for delays beyond 3 hours is determined on a case-by-case basis.
    • For submissions done via IVLE, the submission time is the timestamp shown by IVLE.
    • When determining the late submission penalty, we take the latest submission even if the same exact file was submitted earlier. Do not submit the same file multiple times if you want to avoid unnecessary late submission penalties.
  • The whole team is penalized for problems in team submissions. Only the respective student is penalized for problems in individual submissions.
  • Please follow submission instructions closely. Any non-compliance will be penalized. e.g. wrong file name, team member photos not suitable, etc.
  • For pdf submissions, ensure the file is usable and hyperlinks in the file are correct. Problems in documents are considered bugs too  e.g. broken links, outdated diagrams/instructions etc.
  • Do not update the repo during the 14 days after the deadline. Get our permission first if you need to update the repo during that freeze period. You can continue to evolve your repo after that.

Grading:

Described in [Admin: Project: Assessment]

v1.4 Product

Relevant: [Admin Project → Deliverables → Executable ]

 
  • The product should be delivered as an executable jar file.
  • Ideally, the product delivered at v1.4 should be a releasable product. However, in the interest of lowering your workload, we do not penalize if the product is not releasable, as long as the product is acceptance testable.

Submission: See summary of submissions above

v1.4 Source Code

Relevant: [Admin Project → Deliverables → Source Code ]

 
  • The source code should match the executable, and should include the revision history of the source code, as a Git repo.

Submission: Push the code to GitHub and tag with the version number. Source code (please ensure the code reported by RepoSense as yours is correct; any updates to RepoSense config files or @@author annotations after the deadline will be considered a later submission). Note that the quality of the code attributed to you accounts for a significant component of your final score, graded individually.

v1.4 User Guide

Relevant: [Admin Project → Deliverables → User Guide ]

 
  • The User Guide (UG) of the product should match the proposed v2.0 of the product and be in sync with the current version of the product.
  • Features not implemented yet should be clearly marked as Coming in v2.0
  • Ensure the UG matches the product precisely, as it will be used by peer testers (and any inaccuracy in the content will be considered bugs).

Submission: Convert to pdf (the AB4 dev guide has some instructions on converting project docs to pdf) and upload to IVLE. See summary of submissions above for the file name format.

v1.4 Developer Guide

Relevant: [Admin Project → Deliverables → Developer Guide ]

 
  • The Developer Guide (DG) of the product should match the proposed v2.0 of the product and should be in sync with the current version of the product.
  • The appendix named Instructions for Manual Testing of the Developer Guide should include testing instructions to cover the main enhancement of each team member. There is no need to add testing instructions for existing features if you did not touch them.
    💡 What to include in the appendix Instructions for Manual Testing? This appendix is meant to give some guidance to the tester to chart a path through the features, and provide some important test inputs the tester can copy-paste into the app. There is no need to give a long list of test cases covering all possible variations; it is up to the tester to come up with those. However, if the instructions are inaccurate or deliberately miss/mis-state information to make testing harder (i.e., annoy the tester), the tester can report it as a bug (because flaws in developer docs are considered bugs too).
  • Ensure the DG parts included in PPPs match the product precisely, as PPPs will be used by peer evaluators (and any inaccuracy in the content will be considered bugs).

Submission: Similar to UG

v1.4 Project Portfolio Page (PPP)

Relevant: [Admin Project → Deliverables → Project Portfolio Page ]

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate:
      • your contributions to the project
      • your documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=githbub_username_in_lower_case (replace githbub_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other team's products,
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/githbub_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use AsciiDoc's include feature to include sections from the Developer Guide or the User Guide in your PPP. Follow the example in the sample; a short sketch of the include syntax is also given after the page-limit table below.

    • It is assumed that all contents in the PPP were written primarily by you. If any section was written by someone else (e.g., someone else described the feature in the User Guide but you implemented the feature), clearly state that the section was written by someone else (e.g. Start of Extract [from: User Guide] written by Jane Doe). Reason: your writing skills will be evaluated based on the PPP.

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The amount of content you need is less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content | Limit (pages)
      Overview + Summary of contributions | 0.5-1
      Contributions to the User Guide | 1-3
      Contributions to the Developer Guide | 3-6
      Total | 5-10
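
      💡 A minimal sketch of the include approach mentioned above (the tag name delete, the section heading, and the file paths are illustrative; adapt them to your own docs). First, mark the relevant section of the source document with a tag, e.g. in docs/UserGuide.adoc:

          // tag::delete[]
          === Deleting a person : `delete`
          ... (the UG content you wrote) ...
          // end::delete[]

      Then include that tagged region in your PPP file under docs/team:

          include::../UserGuide.adoc[tag=delete]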

Submission: Similar to UG

v1.4 Product Website

Relevant: [Admin Project → Deliverables → Product Website ]

 
  • Include updated versions of the online UG and DG that match the v1.4 executable
  • README : Ensure the Ui.png matches the current product
  • AboutUs : Ensure the following:
    • Use a suitable profile photo
  • The purpose of the profile photo is for the teaching team to identify you. Therefore, you should choose a recent individual photo showing your face clearly (i.e., not too small) -- somewhat similar to a passport photo. Some examples can be seen in the 'Teaching team' page. Given below are some examples of good and bad profile photos.

  • If you are uncomfortable posting your photo due to security reasons, you can post a lower resolution image so that it is hard for someone to misuse that image for fraudulent purposes. If you are concerned about privacy, you can request permission to omit your photo from the page by writing to prof.

  • Contains a link to each person's Project Portfolio page
  • Team member names match full names used by IVLE

Submission: Push the code to GitHub. Ensure the website is auto-published.

v1.4 Demo

Relevant: [Admin Project → Deliverables → Demo ]

 
  • Duration: Strictly 18 minutes for a 5-person team and 15 minutes for a 4-person team. Exceeding this limit will be penalized. Any set up time will be taken out of your allocated time.

  • Target audience: Assume you are giving a demo to a higher-level manager of your company, to brief him/her on the current capabilities of the product. This is the first time they are seeing the new product you developed but they are familiar with the AddressBook-level4 (AB4) product. The actual audience are the evaluators (the team supervisor and another tutor).

  • Scope:

    • Each person should demo the enhancements they added. However, it's ok for one member to do all the typing.
    • Subject to the constraint mentioned in the previous point, as far as possible, organize the demo to present a cohesive picture of the product as a whole, in a logical order.  Remember to explain the target user profile and value proposition early in the demo.
    • It is recommended you showcase how the feature improves the user’s life rather than simply describe each feature.
    • No need to cover design/implementation details as the manager is not interested in those details.
    • Mention features you inherited from AB4 only if they are needed to explain your new features.  Reason: existing features will not earn you marks, and the audience is already familiar with AB4 features.
    • Each person should demo their main feature only. You are free to 'tie-in' other work under the main feature, but anything that cannot be tied-in to the main feature should be omitted from the demo (as those are not graded, showing them to the evaluators will only make the evaluation harder). For similar reasons, do not demo GUI inputs (but you can demo GUI outputs).
    • We recommend that each person starts by giving an overview of their main feature before going into the details. That way, the evaluator is informed of your main feature from the very start.
  • Structure:

    • Demo the product using the same executable you submitted, on your own laptop, using the TV.
    • It can be a sitting down demo: You'll be demonstrating the features using the TV while sitting down. But you may stand around the TV if you prefer that way.
    • It will be an uninterrupted demo: The audience members will not interrupt you during the demo. That means you should finish within the given time.
    • The demo should use a sufficient amount of realistic demo data, e.g., at least 20 contacts. Trying to demo a product using just 1-2 sample entries creates a bad impression.
    • Dress code : The level of formality is up to you, but it is recommended that the whole team dress at the same level.
  • Optimizing the time:

    • Spend as much time as possible on demonstrating the actual product. Not recommended to use slides (if you do, use them sparingly) or videos or lengthy narrations.
      Avoid skits, re-enactments, dramatizations etc. This is not a sales pitch or an infomercial. While you need to show how a user uses the product to get value, you don't need to act like an imaginary user. For example, [Instead of this] Jim gets a call from his boss. "Ring ring", "hello", "oh hi Jim, can we postpone the meeting?" "Sure". Jim hangs up and curses the boss under his breath. Now he starts typing ... etc. [Do this] If Jim needs to postpone the meeting, he can type … It's not that dramatization is bad or that we don't like it. We simply don't have enough time for it.
      Note that CS2101 demo requirements may differ. Different context → Different requirements.
    • Rehearse the steps well and ensure you can do a smooth demo. Poor quality demos can affect your grade.
    • Don’t waste time repeating things the target audience already knows. e.g. no need to say things like "We are students from NUS, SoC".
    • Plan the demo to be in sync with the impression you want to create. For example, if you are trying to convince that the product is easy to use, show the easiest way to perform a task before you show the full command with all the bells and whistles.
  • Special circumstances:

    • If your main feature was not merged on time: inform the tutor and get permission to show the unmerged feature using your own version of the code. Obviously, unmerged features earn far fewer marks than a merged equivalent, but something is better than nothing.
    • If you have no user-visible features to show, you can still contribute to the demo by giving an overview of the product (at the start) and/or a wrap-up of the product (at the end).
    • If you are unable to come to the demo due to a valid reason, you can ask a team member to demo your feature. Remember to submit the evidence of your excuse e.g., MC to prof. The demo is part of module assessment and absence without a valid reason will cause you to lose marks.

  • Venue: Same as the tutorial venue unless informed otherwise.
  • Schedule: Your demo timing is the same as your tutorial time in week 13.
    • Please arrive on time and remain outside the venue until called in.
    • There is an automatic penalty if you are not ready to start on time.
    • You should bring your own adapter if the display adapters available in your tutorial venue don't work for you.

v1.4 Practical Exam

Relevant: [Admin Project → Deliverables → Practical Exam ]

 

Objectives:

  • Evaluate your manual testing skills, product evaluation skills, effort estimation skills
  • Peer-evaluate your product design, implementation effort, documentation quality

When, where: Week 13 lecture

Grading:

  • Your performance in the practical exam will be considered for your final grade (under the QA category and under Implementation category, about 10 marks in total).
  • You will be graded based on your effectiveness as a tester (e.g., the percentage of the bugs you found, the nature of the bugs you found) and how far off your evaluation/estimates are from the evaluator consensus. Explanation: we understand that you have limited expertise in this area; hence, we penalize only if your inputs don't seem to be based on a sincere effort to test/evaluate.
  • The bugs found in your product by others will affect your v1.4 marks. You will be given a chance to reject false-positive bug reports.

Preparation:

  • Ensure that you can access the relevant issue tracker given below:
    -- for PE Dry Run (at v1.3): nus-cs2103-AY1819S1/pe-dry-run
    -- for PE (at v1.4): nus-cs2103-AY1819S1/pe (will open only near the actual PE)

  • Ensure you have access to a computer that is able to run module projects, e.g., one that has the right Java version (see the example commands at the end of this list).

  • Have a good screen grab tool with annotation features so that you can quickly take a screenshot of a bug, annotate it, and post in the issue tracker.

    • 💡 You can use Ctrl+V to paste a picture from the clipboard into a text box in GitHub issue tracker.
  • Charge your computer before coming to the PE session. The testing venue may not have enough charging points.
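
    💡 Example commands for the preparation steps above (the jar name is just the example used in the submission summary; use the actual file you downloaded, and quote the name because square brackets are special characters in some shells):

        java -version
        java -jar "[W09-1][ContactsPlus].jar"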

During:

  1. Take note of your team to test. It will be given to you by the teaching team (distributed via IVLE gradebook).
  2. Download from IVLE all files submitted by the team (i.e. jar file, User Guide, Developer Guide, and Project Portfolio Pages) into an empty folder.
  3. [~40 minutes] Test the product and report bugs as described below:
Testing instructions for PE and PE Dry Run
  • What to test:

    • PE Dry Run (at v1.3):
      • Test the product based on the User Guide (the UG is most likely accessible using the help command).
      • Do system testing first, i.e., does the product work as specified by the documentation? If there is time left, you can do acceptance testing as well, i.e., does the product solve the problem it claims to solve?
    • PE (at v1.4):
      • Test based on the Developer Guide (Appendix named Instructions for Manual Testing) and the User Guide. The testing instructions in the Developer Guide can provide you some guidance but if you follow those instructions strictly, you are unlikely to find many bugs. You can deviate from the instructions to probe areas that are more likely to have bugs.
      • Do system testing only i.e., verify actual behavior against documented behavior. Do not do acceptance testing.
  • What not to test:

    • Omit features that are driven by GUI inputs (e.g. buttons, menus, etc.). Reason: only CLI-driven features can earn credit, as per the given project constraints. Some features might have both GUI-driven and CLI-driven ways to invoke them, in which case test only the CLI-driven way of invoking them.
    • Omit features that existed in AB4.
  • These are considered bugs:

    • Behavior differs from the User Guide
    • A legitimate user behavior is not handled e.g. incorrect commands, extra parameters
    • Behavior is not specified and differs from normal expectations e.g. error message does not match the error
    • Problems in the User Guide e.g., missing/incorrect info
  • Where to report bugs: Post bugs in the relevant issue tracker given in the Preparation section above (not in the team's repo).

  • Bug report format:

    • Post bugs as you find them (i.e., do not wait to post all bugs at the end) because the issue tracker will close exactly at the end of the allocated time.
    • Do not use team ID in bug reports. Reason: to prevent others copying your bug reports
    • Each bug should be a separate issue.
    • Write good quality bug reports; poor quality or incorrect bug reports will not earn credit.
    • Use a descriptive title.
    • Give a good description of the bug with steps to reproduce and screenshots.
    • Assign a severity to the bug report. Bug reports without a severity label are considered severity.Low (lower severity bugs earn lower credit). The severity labels, and a hypothetical example bug report, are given below:

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
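
  A hypothetical example of a bug report following the format above (the command, behavior, and messages are made up for illustration only):

      Title: Error message does not match the error for an invalid index
      Severity: severity.Medium
      Steps to reproduce:
        1. Run the list command.
        2. Run delete 0.
      Expected: an error message stating that the index must be a positive integer.
      Actual: the app responds with "Unknown command" (see attached screenshot).
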
  • About posting suggestions:

    • PE Dry Run (at v1.3): You can also post suggestions on how to improve the product. 💡 Be diplomatic when reporting bugs or suggesting improvements. For example, instead of criticising the current behavior, simply suggest alternatives to consider.
    • PE (at v1.4): Do not post suggestions.
  • If the product doesn't work at all: If the product fails catastrophically e.g., cannot even launch, you can test the fallback team allocated to you. But in this case you must inform us immediately after the session so that we can send your bug reports to the correct team.

  4. [~50 minutes] Evaluate the following aspects. Note down your evaluation in a hard copy (as a backup). Submit via TEAMMATES.

    • A. Cohesiveness of product features []: Do the features fit together and match the stated target user and the value proposition?

      • unable to judge: You are unable to judge this aspect for some reason.
      • low: One of these
        • target user is too general  i.e. wider than AB4
        • target user and/or value proposition not clear from the user guide
        • features don't seem to fit together for the most part
      • medium: Some features fit together but some don't.
      • high: All features fit together but the features are not very high value to the target user.
      • excellent: The target user is clearly defined (not too general) and almost all new features are of high-value to the target user. i.e. the product is very attractive to the target user.
    • B. Quality of user docs []: Evaluate based on the parts of the user guide written by the person, as reproduced in the project portfolio. Evaluate from an end-user perspective.

      • unable to judge: Less than 1 page worth of UG content written by the student.
      • low: Hard to understand, often inaccurate or missing important information.
      • medium: Needs some effort to understand; some information is missing.
      • high: Mostly easy to follow. Only a few areas need improvements.
      • excellent: Easy to follow and accurate. Just enough information, visuals, examples etc. (not too much either). Understandable to the target end user.
    • C. Quality of developer docs []: Evaluate based on the developer docs cited/reproduced in the respective project portfolio page. Evaluate from the perspective of a new developer trying to understand how the features are implemented.

      • unable to judge: One of these
        • less than 0.5 pages worth of content.
        • other problems in the document  e.g. looks like included wrong content.
      • low: One of these
        • Very small amount of content (i.e., 0.5 - 1 page).
        • Hardly any use to the reader (i.e., content doesn't make much sense or redundant).
        • Uses ad-hoc diagrams where UML diagrams could have been used instead.
        • Multiple notation errors in UML diagrams.
      • medium: Some diagrams, some descriptions, but does not help the reader that much  e.g. overly complicated diagrams.
      • high: Enough diagrams (at least two kinds of UML diagrams used) and enough descriptions (about 2 pages worth), but explanations are not always easy to follow.
      • excellent: Easy to follow. Just enough information (not too much). Minimum repetition of content/diagrams. Good use of diagrams to complement text descriptions. Easy to understand diagrams with just enough details rather than very complicated diagrams that are hard to understand.
    • D. Depth of feature []: Evaluate the feature done by the student for difficulty, depth, and completeness. Note: examples given below assume that AB4 did not have the commands edit, undo, and redo.

      • unable to judge: You are unable to judge this aspect for some reason.
      • low : An easy feature  e.g. make the existing find command case insensitive.
      • medium : Moderately difficult feature, barely acceptable implementation  e.g. an edit command that requires the user to type all fields, even the ones that are not being edited.
      • high: One of the below
        • A moderately difficult feature but fully implemented  e.g. an edit command that allows editing any field.
        • A difficult feature with a reasonable implementation but some aspects are not covered, e.g., an undo/redo command that only allows a single undo/redo.
      • excellent: A difficult feature, all reasonable aspects are fully implemented, e.g., an undo/redo command that allows multiple undo/redo.
    • E. Amount of work []: Evaluate the amount of work, on a scale of 0 to 30.

      • Consider this PR (history command) as 5 units of effort, which means this PR (undo/redo command) is about 15 units of effort. Given that 30 units corresponds to twice the effort needed for the undo/redo feature (which was given as an example of an A-grade project), we expect most students to have efforts lower than 20.
      • Consider the main feature only. Exclude GUI inputs, but consider GUI outputs of the feature. Count all implementation/testing/documentation work as mentioned in that person's PPP. Also look at the actual code written by the person. We understand that it is not possible to know exactly which part of the code is for the main feature; make a best-guess judgement call based on the available info.
      • Do not give a high value just to be nice. If your estimate is wildly inaccurate, it means you are unable to estimate the effort required to implement a feature in a project that you are supposed to know well at this point. You will lose marks if that is the case.

Processing PE Bug Reports:

There will be a review period for you to respond to the bug reports you received.

Duration: The review period will start around 1 day after the PE (exact time to be announced) and will last until the following Wednesday midnight. However, you are recommended to finish this task ASAP, to minimize cutting into your exam preparation work.

Bug reviewing is recommended to be done as a team as some of the decisions need team consensus.

Instructions for Reviewing Bug Reports

  • First, don't freak out if there are a lot of bug reports. Many can be duplicates and some can be false positives. In any case, we anticipate that all of these products will have some bugs and our penalty for bugs is not harsh. Furthermore, it depends on the severity of the bug. Some bugs may not even be penalized.

  • Do not edit the subject or the description. Do not close bug reports. Your response (if any) should be added as a comment.

  • If the bug is reported multiple times, mark all copies EXCEPT one as duplicates using the duplicate tag (if the duplicates have different severity levels, you should keep the one with the highest severity). In addition, use this technique to indicate which issue they are duplicates of. Duplicates can be omitted from processing steps given below.

  • If a bug seems to be for a different product (i.e. wrongly assigned to your team), let us know (email prof).

  • Decide if it is a real bug and apply ONLY one of these labels.

Response Labels:

  • response.Accepted: You accept it as a bug.
  • response.Rejected: What the tester treated as a bug is in fact the expected behavior. The penalty for rejecting a bug using an unjustifiable explanation is higher than the penalty if the same bug had been accepted. You can reject bugs that you inherited from AB4.
  • response.CannotReproduce: You are unable to reproduce the behavior reported in the bug after multiple tries.
  • response.IssueUnclear: The issue description is not clear.
  • If applicable, decide the type of bug. Bugs without a type- label are considered type-FunctionalityBug by default (which is liable to a heavier penalty):

Bug Type Labels:

  • type-FunctionalityBug : the bug is a flaw in how the product works.
  • type-DocumentationBug : the bug is in the documentation.
  • If you disagree with the original severity assigned to the bug, you may change it to the correct level, in which case add a comment justifying the change. All such changes will be double-checked by the teaching team, and unreasonable lowering of severity will be penalized extra:

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • Decide who should fix the bug. Use the Assignees field to assign the issue to that person(s). There is no need to actually fix the bug though. It's simply an indication/acceptance of responsibility. If there is no assignee, we will distribute the penalty for that bug (if any) among all team members.

  • Add an explanatory comment explaining your choice of labels and assignees.

Time/venue: week 13 lecture slot



Project: Deliverables

Here is a list of main deliverables of the project; their details are given in the subsequent sections.

Deliverable: Executable

  • The product should be delivered as an executable jar file.
  • Ideally, the product delivered at v1.4 should be a releasable product. However, in the interest of lowering your workload, we do not penalize if the product is not releasable, as long as the product is acceptance testable.

Deliverable: Source code

  • The source code should match the executable, and should include the revision history of the source code, as a Git repo.

Deliverable: User Guide (UG)

  • The User Guide (UG) of the product should match the proposed v2.0 of the product and be in sync with the current version of the product.
  • Features not implemented yet should be clearly marked as Coming in v2.0
  • Ensure the UG matches the product precisely, as it will be used by peer testers (and any inaccuracy in the content will be considered bugs).

Deliverable: Developer Guide (DG)

  • The Developer Guide (DG) of the product should match the proposed v2.0 of the product and should be in sync with the current version of the product.
  • The appendix named Instructions for Manual Testing of the Developer Guide should include testing instructions to cover the main enhancement of each team member. There is no need to add testing instructions for existing features if you did not touch them.
    💡 What to include in the appendix Instructions for Manual Testing? This appendix is meant to give some guidance to the tester to chart a path through the features, and provide some important test inputs the tester can copy-paste into the app. There is no need to give a long list of test cases covering all possible variations; it is up to the tester to come up with those. However, if the instructions are inaccurate or deliberately miss/mis-state information to make testing harder (i.e., annoy the tester), the tester can report it as a bug (because flaws in developer docs are considered bugs too).
  • Ensure the DG parts included in PPPs match the product precisely, as PPPs will be used by peer evaluators (and any inaccuracy in the content will be considered bugs).

Deliverable: Product Website

  • Include updated versions of the online UG and DG that match the v1.4 executable
  • README : Ensure the Ui.png matches the current product
  • AboutUs : Ensure the following:
    • Use a suitable profile photo
  • The purpose of the profile photo is for the teaching team to identify you. Therefore, you should choose a recent individual photo showing your face clearly (i.e., not too small) -- somewhat similar to a passport photo. Some examples can be seen in the 'Teaching team' page. Given below are some examples of good and bad profile photos.

  • If you are uncomfortable posting your photo due to security reasons, you can post a lower resolution image so that it is hard for someone to misuse that image for fraudulent purposes. If you are concerned about privacy, you can request permission to omit your photo from the page by writing to prof.

  • Contains a link to each person's Project Portfolio page
  • Team member names match full names used by IVLE

Deliverable: Project Portfolio Page (PPP)

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate:
      • your contributions to the project
      • your documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=githbub_username_in_lower_case (replace githbub_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other team's products,
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/githbub_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use AsciiDoc's include feature to include sections from the Developer Guide or the User Guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section was written by someone else (e.g., someone else described the feature in the User Guide but you implemented the feature), clearly state that the section was written by someone else (e.g. Start of Extract [from: User Guide] written by Jane Doe). Reason: your writing skills will be evaluated based on the PPP.

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The amount of content you need is less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content | Limit (pages)
      Overview + Summary of contributions | 0.5-1
      Contributions to the User Guide | 1-3
      Contributions to the Developer Guide | 3-6
      Total | 5-10

Deliverable: Demo

  • Duration: Strictly 18 minutes for a 5-person team and 15 minutes for a 4-person team. Exceeding this limit will be penalized. Any set up time will be taken out of your allocated time.

  • Target audience: Assume you are giving a demo to a higher-level manager of your company, to brief him/her on the current capabilities of the product. This is the first time they are seeing the new product you developed but they are familiar with the AddressBook-level4 (AB4) product. The actual audience are the evaluators (the team supervisor and another tutor).

  • Scope:

    • Each person should demo the enhancements they added. However, it's ok for one member to do all the typing.
    • Subject to the constraint mentioned in the previous point, as far as possible, organize the demo to present a cohesive picture of the product as a whole, in a logical order.  Remember to explain the target user profile and value proposition early in the demo.
    • It is recommended you showcase how the feature improves the user’s life rather than simply describe each feature.
    • No need to cover design/implementation details as the manager is not interested in those details.
    • Mention features you inherited from AB4 only if they are needed to explain your new features.  Reason: existing features will not earn you marks, and the audience is already familiar with AB4 features.
    • Each person should demo their main feature only. You are free to 'tie-in' other work under the main feature, but anything that cannot be tied-in to the main feature should be omitted from the demo (as those are not graded, showing them to the evaluators will only make the evaluation harder). For similar reasons, do not demo GUI inputs (but you can demo GUI outputs).
    • We recommend that each person starts by giving an overview of their main feature before going into the details. That way, the evaluator is informed of your main feature from the very start.
  • Structure:

    • Demo the product using the same executable you submitted, on your own laptop, using the TV.
    • It can be a sitting down demo: You'll be demonstrating the features using the TV while sitting down. But you may stand around the TV if you prefer that way.
    • It will be an uninterrupted demo: The audience members will not interrupt you during the demo. That means you should finish within the given time.
    • The demo should use a sufficient amount of realistic demo data, e.g., at least 20 contacts. Trying to demo a product using just 1-2 sample entries creates a bad impression.
    • Dress code : The level of formality is up to you, but it is recommended that the whole team dress at the same level.
  • Optimizing the time:

    • Spend as much time as possible on demonstrating the actual product. Not recommended to use slides (if you do, use them sparingly) or videos or lengthy narrations.
      Avoid skits, re-enactments, dramatizations etc. This is not a sales pitch or an infomercial. While you need to show how a user uses the product to get value, you don't need to act like an imaginary user. For example, [Instead of this] Jim gets a call from his boss. "Ring ring", "hello", "oh hi Jim, can we postpone the meeting?" "Sure". Jim hangs up and curses the boss under his breath. Now he starts typing ... etc. [Do this] If Jim needs to postpone the meeting, he can type … It's not that dramatization is bad or that we don't like it. We simply don't have enough time for it.
      Note that CS2101 demo requirements may differ. Different context → Different requirements.
    • Rehearse the steps well and ensure you can do a smooth demo. Poor quality demos can affect your grade.
    • Don’t waste time repeating things the target audience already knows. e.g. no need to say things like "We are students from NUS, SoC".
    • Plan the demo to be in sync with the impression you want to create. For example, if you are trying to convince that the product is easy to use, show the easiest way to perform a task before you show the full command with all the bells and whistles.
  • Special circumstances:

    • If your main feature was not merged on time: inform the tutor and get permission to show the unmerged feature using your own version of the code. Obviously, unmerged features earn far fewer marks than a merged equivalent, but something is better than nothing.
    • If you have no user-visible features to show, you can still contribute to the demo by giving an overview of the product (at the start) and/or a wrap-up of the product (at the end).
    • If you are unable to come to the demo due to a valid reason, you can ask a team member to demo your feature. Remember to submit the evidence of your excuse e.g., MC to prof. The demo is part of module assessment and absence without a valid reason will cause you to lose marks.

Deliverable: Practical Exam (Dry Run)

What: The v1.3 is subjected to a round of peer acceptance/system testing, also called the Practical Exam Dry Run as this round of testing will be similar to the graded Practical Exam that will be done at v1.4.

When, where: uses a 30 minute slot at the start of week 11 lecture

 


Grading: Taking part in the PE dry run is strongly encouraged as it can affect your grade in the following ways.

  • If the product you are allocated to test in the Practical Exam (at v1.4) had a very low bug count, we will consider your performance in PE dry run as well when grading the PE.
  • PE dry run will help you practice for the actual PE.
  • Taking part in the PE dry run will earn you participation points.
  • There is no penalty for bugs reported in your product. Every bug you find is a win-win for you and the team whose product you are testing.

Objectives:

  • To train you to do manual testing, bug reporting, bug triaging, bug fixing, communicating with users/testers/developers, evaluating products etc.
  • To help you improve your product before the final submission.

Preparation:

  • Ensure that you can access the relevant issue tracker given below:
    -- for PE Dry Run (at v1.3): nus-cs2103-AY1819S1/pe-dry-run
    -- for PE (at v1.4): nus-cs2103-AY1819S1/pe (will open only near the actual PE)

  • Ensure you have access to a computer that is able to run module projects, e.g., one that has the right Java version.

  • Have a good screen grab tool with annotation features so that you can quickly take a screenshot of a bug, annotate it, and post in the issue tracker.

    • 💡 You can use Ctrl+V to paste a picture from the clipboard into a text box in GitHub issue tracker.
  • Charge your computer before coming to the PE session. The testing venue may not have enough charging points.

During the session:

  1. Take note of your team to test. Distributed via IVLE gradebook and via email.
  2. Download the latest jar file from the team's GitHub page. Copy it to an empty folder.
  3. Confirm you are testing the allocated product by comparing the product UI with the UI screenshot sent via email.
Testing instructions for PE and PE Dry Run
  • What to test:

    • PE Dry Run (at v1.3):
      • Test the product based on the User Guide (the UG is most likely accessible using the help command).
      • Do system testing first, i.e., does the product work as specified by the documentation? If there is time left, you can do acceptance testing as well, i.e., does the product solve the problem it claims to solve?
    • PE (at v1.4):
      • Test based on the Developer Guide (Appendix named Instructions for Manual Testing) and the User Guide. The testing instructions in the Developer Guide can provide you some guidance but if you follow those instructions strictly, you are unlikely to find many bugs. You can deviate from the instructions to probe areas that are more likely to have bugs.
      • Do system testing only i.e., verify actual behavior against documented behavior. Do not do acceptance testing.
  • What not to test:

    • Omit features that are driven by GUI inputs (e.g. buttons, menus, etc.). Reason: only CLI-driven features can earn credit, as per the given project constraints. Some features might have both GUI-driven and CLI-driven ways to invoke them, in which case test only the CLI-driven way of invoking them.
    • Omit features that existed in AB4.
  • These are considered bugs:

    • Behavior differs from the User Guide
    • A legitimate user behavior is not handled e.g. incorrect commands, extra parameters
    • Behavior is not specified and differs from normal expectations e.g. error message does not match the error
    • Problems in the User Guide e.g., missing/incorrect info
  • Where to report bugs: Post bugs in the relevant issue tracker given in the Preparation section above (not in the team's repo).

  • Bug report format:

    • Post bugs as you find them (i.e., do not wait to post all bugs at the end) because the issue tracker will close exactly at the end of the allocated time.
    • Do not use team ID in bug reports. Reason: to prevent others copying your bug reports
    • Each bug should be a separate issue.
    • Write good quality bug reports; poor quality or incorrect bug reports will not earn credit.
    • Use a descriptive title.
    • Give a good description of the bug with steps to reproduce and screenshots.
    • Assign a severity to the bug report. Bug reports without a severity label are considered severity.Low (lower severity bugs earn lower credit):

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • About posting suggestions:

    • PE Dry Run (at v1.3): You can also post suggestions on how to improve the product. 💡 Be diplomatic when reporting bugs or suggesting improvements. For example, instead of criticising the current behavior, simply suggest alternatives to consider.
    • PE (at v1.4): Do not post suggestions.
  • If the product doesn't work at all: If the product fails catastrophically e.g., cannot even launch, you can test the fallback team allocated to you. But in this case you must inform us immediately after the session so that we can send your bug reports to the correct team.

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate your:
      • contributions to the project
      • documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on the Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=github_username_in_lower_case (replace github_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so-called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other teams' products
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/github_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use Asciidoc's include feature to include sections from the developer guide or the user guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section was written by someone else, e.g. someone else described the feature in the User Guide but you implemented the feature, clearly state that the section was written by someone else (e.g. Start of Extract [from: User Guide] written by Jane Doe). Reason: Your writing skills will be evaluated based on the PPP.

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The amount of content you need is less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content | Page limit
      Overview + Summary of contributions | 0.5-1
      Contributions to the User Guide | 1-3
      Contributions to the Developer Guide | 3-6
      Total | 5-10

After the session:

  • We'll transfer the relevant bug reports to your repo over the weekend. Once you have received the bug reports for your product, it is up to you to decide whether you will act on reported issues before the final submission v1.4. For some issues, the correct decision could be to reject or postpone to a version beyond v1.4.
  • You can post in the issue thread to communicate with the tester e.g. to ask for more info, etc. However, the tester is not obliged to respond.
    • 💡 Do not argue with the issue reporter to try to convince that person that your way is correct/better. If at all, you can gently explain the rationale for the current behavior but do not waste time getting involved in long arguments. If you think the suggestion/bug is unreasonable, just thank the reporter for their view and close the issue.

Deliverable: Practical Exam

Objectives:

  • Evaluate your manual testing skills, product evaluation skills, effort estimation skills
  • Peer-evaluate your product design, implementation effort, documentation quality

When, where: Week 13 lecture

Grading:

  • Your performance in the practical exam will be considered for your final grade (under the QA category and the Implementation category, about 10 marks in total).
  • You will be graded based on your effectiveness as a tester (e.g., the percentage of the bugs you found, the nature of the bugs you found) and how far off your evaluation/estimates are from the evaluator consensus. Explanation: we understand that you have limited expertise in this area; hence, we penalize only if your inputs don't seem to be based on a sincere effort to test/evaluate.
  • The bugs found in your product by others will affect your v1.4 marks. You will be given a chance to reject false-positive bug reports.

Preparation:

  • Ensure that you can access the relevant issue tracker given below:
    -- for PE Dry Run (at v1.3): nus-cs2103-AY1819S1/pe-dry-run
    -- for PE (at v1.4): nus-cs2103-AY1819S1/pe (will open only near the actual PE)

  • Ensure you have access to a computer that is able to run module projects  e.g. has the right Java version.

  • Have a good screen grab tool with annotation features so that you can quickly take a screenshot of a bug, annotate it, and post in the issue tracker.

    • 💡 You can use Ctrl+V to paste a picture from the clipboard into a text box in GitHub issue tracker.
  • Charge your computer before coming to the PE session. The testing venue may not have enough charging points.

During:

  1. Take note of your team to test. It will be given to you by the teaching team (distributed via IVLE gradebook).
  2. Download from IVLE all files submitted by the team (i.e. jar file, User Guide, Developer Guide, and Project Portfolio Pages) into an empty folder.
  3. [~40 minutes] Test the product and report bugs as described below:
Testing instructions for PE and PE Dry Run
  • What to test:

    • PE Dry Run (at v1.3):
      • Test the product based on the User Guide (the UG is most likely accessible using the help command).
      • Do system testing first, i.e., does the product work as specified by the documentation? If there is time left, you can do acceptance testing as well, i.e., does the product solve the problem it claims to solve?
    • PE (at v1.4):
      • Test based on the Developer Guide (Appendix named Instructions for Manual Testing) and the User Guide. The testing instructions in the Developer Guide can provide you some guidance but if you follow those instructions strictly, you are unlikely to find many bugs. You can deviate from the instructions to probe areas that are more likely to have bugs.
      • Do system testing only i.e., verify actual behavior against documented behavior. Do not do acceptance testing.
  • What not to test:

    • Omit features that are driven by GUI inputs (e.g. buttons, menus, etc.). Reason: Only CLI-driven features can earn credit, as per the given project constraints. Some features might have both GUI-driven and CLI-driven ways to invoke them, in which case test only the CLI-driven way of invoking them.
    • Omit features that existed in AB-4.
  • These are considered bugs:

    • Behavior differs from the User Guide
    • A legitimate user behavior is not handled e.g. incorrect commands, extra parameters
    • Behavior is not specified and differs from normal expectations e.g. error message does not match the error
    • Problems in the User Guide e.g., missing/incorrect info
  • Where to report bugs: Post bugs in the designated issue trackers (not in the team's repo); see the Preparation section for the relevant tracker links.

  • Bug report format:

    • Post bugs as you find them (i.e., do not wait to post all bugs at the end) because the issue tracker will close exactly at the end of the allocated time.
    • Do not use team ID in bug reports. Reason: to prevent others copying your bug reports
    • Each bug should be a separate issue.
    • Write good quality bug reports; poor quality or incorrect bug reports will not earn credit.
    • Use a descriptive title.
    • Give a good description of the bug with steps to reproduce and screenshots.
    • Assign a severity to the bug report. Bug reports without a severity label are considered severity.Low (lower severity bugs earn lower credit):

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for them, i.e., makes the product almost unusable for most users.
  • About posting suggestions:

    • PE Dry Run (at v1.3): You can also post suggestions on how to improve the product. 💡 Be diplomatic when reporting bugs or suggesting improvements. For example, instead of criticising the current behavior, simply suggest alternatives to consider.
    • PE (at v1.4): Do not post suggestions.
  • If the product doesn't work at all: If the product fails catastrophically e.g., cannot even launch, you can test the fallback team allocated to you. But in this case you must inform us immediately after the session so that we can send your bug reports to the correct team.

  4. [~50 minutes] Evaluate the following aspects. Note down your evaluation on a hard copy (as a backup). Submit via TEAMMATES.

    • A. Cohesiveness of product features: Do the features fit together and match the stated target user and the value proposition?

      • unable to judge: You are unable to judge this aspect for some reason.
      • low: One of these
        • target user is too general  i.e. wider than AB4
        • target user and/or value proposition not clear from the user guide
        • features don't seem to fit together for the most part
      • medium: Some features fit together but some don't.
      • high: All features fit together but the features are not very high value to the target user.
      • excellent: The target user is clearly defined (not too general) and almost all new features are of high-value to the target user. i.e. the product is very attractive to the target user.
    • B. Quality of user docs: Evaluate based on the parts of the user guide written by the person, as reproduced in the project portfolio. Evaluate from an end-user perspective.

      • unable to judge: Less than 1 page worth of UG content written by the student.
      • low: Hard to understand, often inaccurate or missing important information.
      • medium: Needs some effort to understand; some information is missing.
      • high: Mostly easy to follow. Only a few areas need improvements.
      • excellent: Easy to follow and accurate. Just enough information, visuals, examples etc. (not too much either). Understandable to the target end user.
    • C. Quality of developer docs: Evaluate based on the developer docs cited/reproduced in the respective project portfolio page. Evaluate from the perspective of a new developer trying to understand how the features are implemented.

      • unable to judge: One of these
        • less than 0.5 pages worth of content.
        • other problems in the document  e.g. looks like included wrong content.
      • low: One of these
        • Very small amount of content (i.e., 0.5 - 1 page).
        • Hardly any use to the reader (i.e., content doesn't make much sense or redundant).
        • Uses ad-hoc diagrams where UML diagrams could have been used instead.
        • Multiple notation errors in UML diagrams.
      • medium: Some diagrams, some descriptions, but does not help the reader that much  e.g. overly complicated diagrams.
      • high: Enough diagrams (at least two kinds of UML diagrams used) and enough descriptions (about 2 pages worth) but explanations are not always easy to follow.
      • excellent: Easy to follow. Just enough information (not too much). Minimum repetition of content/diagrams. Good use of diagrams to complement text descriptions. Easy to understand diagrams with just enough details rather than very complicated diagrams that are hard to understand.
    • D. Depth of feature: Evaluate the feature done by the student for difficulty, depth, and completeness. Note: examples given below assume that AB4 did not have the commands edit, undo, and redo.

      • unable to judge: You are unable to judge this aspect for some reason.
      • low : An easy feature  e.g. make the existing find command case insensitive.
      • medium : Moderately difficult feature, barely acceptable implementation  e.g. an edit command that requires the user to type all fields, even the ones that are not being edited.
      • high: One of the below
        • A moderately difficult feature but fully implemented  e.g. an edit command that allows editing any field.
        • A difficult feature with a reasonable implementation but some aspects are not covered  e.g. an undo/redo command that only allows a single undo/redo.
      • excellent: A difficult feature, all reasonable aspects are fully implemented  e.g. an undo/redo command that allows multiple undo/redo.
    • E. Amount of work: Evaluate the amount of work, on a scale of 0 to 30.

      • Consider this PR (history command) as 5 units of effort, which means this PR (undo/redo command) is about 15 units of effort. Given that 30 units corresponds to twice the effort needed for the undo/redo feature (which was given as an example of an A grade project), we expect most students to have efforts lower than 20.
      • Consider the main feature only. Exclude GUI inputs, but consider GUI outputs of the feature. Count all implementation/testing/documentation work as mentioned in that person's PPP. Also look at the actual code written by the person. We understand that it is not possible to know exactly which part of the code is for the main feature; make a best-guess judgement call based on the available info.
      • Do not give a high value just to be nice. If your estimate is wildly inaccurate, it means you are unable to estimate the effort required to implement a feature in a project that you are supposed to know well at this point. You will lose marks if that is the case.

Processing PE Bug Reports:

There will be a review period for you to respond to the bug reports you received.

Duration: The review period will start around 1 day after the PE (exact time to be announced) and will last until the following Wednesday midnight. However, you are recommended to finish this task ASAP, to minimize cutting into your exam preparation work.

We recommend doing the bug review as a team, as some of the decisions need team consensus.

Instructions for Reviewing Bug Reports

  • First, don't freak out if there are a lot of bug reports. Many can be duplicates and some can be false positives. In any case, we anticipate that all of these products will have some bugs and our penalty for bugs is not harsh. Furthermore, the penalty depends on the severity of the bug; some bugs may not even be penalized.

  • Do not edit the subject or the description. Do not close bug reports. Your response (if any) should be added as a comment.

  • If the bug is reported multiple times, mark all copies EXCEPT one as duplicates using the duplicate tag (if the duplicates have different severity levels, you should keep the one with the highest severity). In addition, use this technique to indicate which issue they are duplicates of. Duplicates can be omitted from processing steps given below.

  • If a bug seems to be for a different product (i.e. wrongly assigned to your team), let us know (email prof).

  • Decide if it is a real bug and apply ONLY one of these labels.

Response Labels:

  • response.Accepted: You accept it as a bug.
  • response.Rejected: What the tester treated as a bug is in fact the expected behavior. The penalty for rejecting a bug using an unjustifiable explanation is higher than the penalty if the same bug had been accepted. You can reject bugs that you inherited from AB4.
  • response.CannotReproduce: You are unable to reproduce the behavior reported in the bug after multiple tries.
  • response.IssueUnclear: The issue description is not clear.
  • If applicable, decide the type of bug. Bugs without a type- label are considered type-FunctionalityBug by default (these are liable to a heavier penalty):

Bug Type Labels:

  • type-FunctionalityBug : the bug is a flaw in how the product works.
  • type-DocumentationBug : the bug is in the documentation.
  • If you disagree with the original severity assigned to the bug, you may change it to the correct level, in which case add a comment justifying the change. All such changes will be double-checked by the teaching team, and unreasonable lowering of severity will be penalized extra:

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for them, i.e., makes the product almost unusable for most users.
  • Decide who should fix the bug. Use the Assignees field to assign the issue to that person(s). There is no need to actually fix the bug though. It's simply an indication/acceptance of responsibility. If there is no assignee, we will distribute the penalty for that bug (if any) among all team members.

  • Add an explanatory comment explaining your choice of labels and assignees.

Notes for Those Using AB-2 or AB-3 for the Project

There is no explicit penalty for switching to a lower level AB. All projects are evaluated based on the same yardstick irrespective of on which AB it is based. As an AB is given to you as a 'free' head-start, a lower level AB gives you a shorter head-start, which means your final product is likely to be less functional than those from teams using AB-4 unless you progress faster than them. Nevertheless, you should switch to AB2/3 if you feel you can learn more from the project that way, as our goal is to maximize learning, not features.
If your team wants to stay with AB-4 but you want to switch to a lower level AB, let us know so that we can work something out for you.

If you have opted to use AB-2 or AB-3 instead of AB-4 as the basis of your product, please note the following points:

 

Set up project repo, start moving UG and DG to the repo, attempt to do local-impact changes to the code base.

Project Management:

Set up the team org and the team repo as explained below:

Relevant: [Admin Appendix E(extract): Organization setup ]

 

Organization setup

Please follow the organization/repo name format precisely; we use scripts to download your code, and if the names do not match the expected format, our scripts will not be able to detect your work.

After receiving your team ID, one team member should do the following steps:

  • Create a GitHub organization with the following details:
    • Organization name : CS2103-AY1819S1-TEAM_ID. e.g.  CS2103-AY1819S1-W12-1
    • Plan:  Open Source ($0/month)
  • Add members to the organization:
    • Create a team called developers in your organization.
    • Add your team members to the developers team.

Relevant: [Admin Appendix E(extract): Repo setup ]

 

Repo setup

Only one team member:

  1. Fork Address Book Level 4 to your team org.
  2. Rename the forked repo as main. This repo (let's call it the team repo) is to be used as the repo for your project.
  3. Ensure the issue tracker of your team repo is enabled. Reason: our bots will be posting your weekly progress reports on the issue tracker of your team repo.
  4. Ensure your team members have the desired level of access to your team repo.
  5. Enable Travis CI for the team repo.
  6. Set up auto-publishing of docs. When set up correctly, your project website should be available via the URL https://nus-cs2103-ay1819s1-{team-id}.github.io/main e.g., https://cs2103-ay1819s1-w13-1.github.io/main/. This also requires you to enable the GitHub Pages feature of your team repo and configure it to serve the website from the gh-pages branch.
  7. Create a team PR for us to track your project progress: i.e., create a PR from your team repo's master branch to the [nus-cs2103-AY1819S1/addressbook-level4] master branch. PR name: [Team ID] Product Name e.g., [T09-2] Contact List Pro. As you merge code to your team repo's master branch, this PR will auto-update to reflect how much your team's product has progressed. In the PR description, @mention the other team members so that they get notified when the tutor adds comments to the PR.

All team members:

  1. Watch the main repo (created above) i.e., go to the repo and click on the watch button to subscribe to activities of the repo
  2. Fork the main repo to your personal GitHub account.
  3. Clone the fork to your Computer.
  4. Recommended: Set it up as an Intellij project (follow the instructions in the Developer Guide carefully).
  5. Set up the developer environment on your computer. You are recommended to use JDK 9 for AB-4, as some of the libraries used in AB-4 have not been updated to support Java 10 yet. JDK 9 can be downloaded from the Java Archive.

Note that some of our download scripts depend on the following folder paths. Please do not alter those paths in your project.

  • /src/main
  • /src/test
  • /docs

When updating code in the repo, follow the workflow explained below:

Relevant: [Admin Appendix E(extract): Workflow ]

 

Workflow

Before you do any coding for the project,

  • Ensure you have set the Git username correctly (as explained in Appendix E) in all Computers you use for coding.
  • Read our reuse policy (in Admin: Appendix B), in particular, how to give credit when you reuse code from the Internet or classmates:
 

Setting Git Username to Match GitHub Username

We use various tools to analyze your code. For us to be able to identify your commits, you should use your GitHub username as your Git username as well. If there is a mismatch, or if you use multiple usernames for Git, our tools might miss some of your work and, as a result, you might not get credit for some of your work.

In each Computer you use for coding, after installing Git, you should set the Git username as follows.

  1. Open a command window that can run Git commands (e.g., Git bash window)
  2. Run the command git config --global user.name YOUR_GITHUB_USERNAME
    e.g., git config --global user.name JohnDoe

More info about setting Git username is here.

 

Policy on reuse

Reuse is encouraged. However, note that reuse has its own costs (such as the learning curve, additional complexity, usage restrictions, and unknown bugs). Furthermore, you will not be given credit for work done by others. Rather, you will be given credit for using work done by others.

  • You are allowed to reuse work from your classmates, subject to following conditions:
    • The work has been published by us or the authors.
    • You clearly give credit to the original author(s).
  • You are allowed to reuse work from external sources, subject to following conditions:
    • The work comes from a source of 'good standing' (such as an established open source project). This means you cannot reuse code written by an outside 'friend'.
    • You clearly give credit to the original author. Acknowledge use of third party resources clearly e.g. in the welcome message, splash screen (if any) or under the 'about' menu. If you are open about reuse, you are less likely to get into trouble if you unintentionally reused something copyrighted.
    • You do not violate the license under which the work has been released. Please  do not use 3rd-party images/audio in your software unless they have been specifically released to be used freely. Just because you found it on the Internet does not mean it is free for reuse.
    • Always get permission from us before you reuse third-party libraries. Please post your 'request to use 3rd party library' in our forum. That way, the whole class gets to see what libraries are being used by others.

Giving credit for reused work

Given below is how to give credit for things you reuse from elsewhere. These requirements are specific to this module, i.e., not applicable outside the module (outside the module you should follow the rules specified by your employer and the license of the reused work).

If you used a third party library:

  • Mention in the README.adoc (under the Acknowledgements section)
  • mention in the Project Portfolio Page if the library has a significant relevance to the features you implemented

If you reused code snippets found on the Internet  e.g. from StackOverflow answers or
referred code in another software or
referred project code by current/past student:

  • If you read the code to understand the approach and implemented it yourself, mention it as a comment
    Example:
    //Solution below adapted from https://stackoverflow.com/a/16252290
    {Your implementation of the reused solution here ...}
    
  • If you copy-pasted a non-trivial code block (possibly with minor modifications  e.g. renaming, layout changes, changes to comments, etc.), also mark the code block as reused code (using @@author tags)
    Format:
    //@@author {yourGithubUsername}-reused
    //{Info about the source...}
    
    {Reused code (possibly with minor modifications) here ...}
    
    //@@author
    
    Example of reusing a code snippet (with minor modifications):
    persons = getList()
    //@@author johndoe-reused
    //Reused from https://stackoverflow.com/a/34646172 with minor modifications
    Collections.sort(persons, new Comparator<CustomData>() {
        @Override
        public int compare(CustomData lhs, CustomData rhs) {
            return lhs.customInt > rhs.customInt ? -1 : (lhs.customInt < rhs.customInt) ? 1 : 0;
        }
    });
    //@@author
    return persons;
    
 

Adding @@author tags to indicate authorship

  • Mark your code with a //@@author {yourGithubUsername}. Note the double @.
    The //@@author tag indicates the beginning of the code you wrote. The code up to the next //@@author tag or the end of the file (whichever comes first) will be considered as written by that author. Here is a sample code file:

    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author sarahkhoo
    method 3 ...
    //@@author johndoe
    method 4 ...
    
  • If you don't know who wrote the code segment below yours, you may put an empty //@@author (i.e. no GitHub username) to indicate the end of the code segment you wrote. The author of code below yours can add the GitHub username to the empty tag later. Here is a sample code with an empty author tag:

    method 0 ...
    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author
    method 3 ...
    method 4 ...
    
  • The author tag syntax varies based on file type e.g. for java, css, fxml. Use the corresponding comment syntax for non-Java files.
    Here is an example code from an xml/fxml file.

    <!-- @@author sereneWong -->
    <textbox>
      <label>...</label>
      <input>...</input>
    </textbox>
    ...
    
  • Do not put the //@@author inside java header comments.
    👎

    /**
      * Returns true if ...
      * @@author johndoe
      */
    

    👍

    //@@author johndoe
    /**
      * Returns true if ...
      */
    

What to and what not to annotate

  • Annotate both functional and test code. There is no need to annotate documentation files.

  • Annotate only significant-size code blocks that can be reviewed on their own  e.g., a class, a sequence of methods, a method.
    Claiming credit for code blocks smaller than a method is discouraged but allowed. If you do, do it sparingly and only claim meaningful blocks of code such as a block of statements, a loop, or an if-else statement.

    • If an enhancement required you to do tiny changes in many places, there is no need to annotate all those tiny changes; you can describe those changes in the Project Portfolio page instead.
    • If a code block was touched by more than one person, either let the person who wrote most of it (e.g. more than 80%) take credit for the entire block, or leave it as 'unclaimed' (i.e., no author tags).
    • Related to the above point, if you claim a code block as your own, more than 80% of the code in that block should have been written by yourself. For example, no more than 20% of it can be code you reused from somewhere.
    • 💡 GitHub has a blame feature and a history feature that can help you determine who wrote a piece of code.
  • Do not try to boost the quantity of your contribution using unethical means such as duplicating the same code in multiple places. In particular, do not copy-paste test cases to create redundant tests. Even repetitive code blocks within test methods should be extracted out as utility methods to reduce code duplication. Individual members are responsible for making sure the code attributed to them is correct. If you notice a team member claiming credit for code that he/she did not write or using other questionable tactics, you can email us (after the final submission) to let us know.

  • If you wrote a significant amount of code that was not used in the final product,

    • Create a folder called {project root}/unused
    • Move unused files (or copies of files containing unused code) to that folder
    • Use //@@author {yourGithubUsername}-unused to mark unused code in those files (note the suffix unused) e.g.
    //@@author johndoe-unused
    method 1 ...
    method 2 ...
    

    Please put a comment in the code to explain why it was not used.

  • If you reused code from elsewhere, mark such code as //@@author {yourGithubUsername}-reused (note the suffix reused) e.g.

    //@@author johndoe-reused
    method 1 ...
    method 2 ...
    
  • You can use empty @@author tags to mark code as not yours when RepoSense attributes the code to you incorrectly.

    • Code generated by the IDE/framework should not be annotated as your own.

    • Code you modified in minor ways e.g. adding a parameter. These should not be claimed as yours but you can mention these additional contributions in the Project Portfolio page if you want to claim credit for them.

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate your:
      • contributions to the project
      • documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on the Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=github_username_in_lower_case (replace github_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so-called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g. responses you posted in our forum, bugs you reported in other teams' products
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/github_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use Asciidoc's include feature to include sections from the developer guide or the user guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section was written by someone else, e.g. someone else described the feature in the User Guide but you implemented the feature, clearly state that the section was written by someone else (e.g. Start of Extract [from: User Guide] written by Jane Doe). Reason: Your writing skills will be evaluated based on the PPP.

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below are after converting to PDF format. The amount of content you need is less than what these numbers suggest because the HTML → PDF conversion adds a lot of spacing around content.

      Content | Page limit
      Overview + Summary of contributions | 0.5-1
      Contributions to the User Guide | 1-3
      Contributions to the Developer Guide | 3-6
      Total | 5-10

Follow the forking workflow in your project up to v1.1. In particular,

  • Get team members to review PRs. A workflow without PR reviews is a risky workflow.
  • Do not merge PRs failing CI. After setting up Travis, the CI status of a PR is reported at the bottom of the PR page. The screenshot below shows the status of a PR that is passing all CI checks.

    If there is a failure, you can click on the Details link in the corresponding line to find out more about the failure. Once you figure out the cause of the failure, push a fix to the PR.
  • After setting up Netlify, you can use Netlify PR Preview to preview changes to documentation files, if the PR contains updates to documentation. To see the preview, click on the Details link in front of the Netlify status reported (refer screenshot above).

After completing v1.1, you can adjust process rigor to suit your team's pace, as explained below.

  • Reduce automated tests: Automated tests have benefits, but they can be a pain to write/maintain; GUI tests are especially hard to maintain because their behavior can sometimes depend on things such as the OS, resolution, etc.
    It is OK to get rid of some of the troublesome tests and rely more on manual testing instead. The fewer automated tests you have, the higher the risk of regressions; but it may be an acceptable trade-off under the circumstances if tests are slowing you down too much.
    There is no direct penalty for removing GUI tests. Also note our expectation on test code.

  • Reduce automated checks: You can also reduce the rigor of checkstyle checks to expedite PR processing.

  • Switch to a lighter workflow: While the forking workflow is the safest, it is also rather heavy. You can switch to a simpler workflow if the forking workflow is slowing you down. Refer to the textbook to find out more about alternative workflows: branching workflow, centralized workflow. However, we still recommend that you use PR reviews, at least for PRs affecting others' features.

You can also increase the rigor/safety of your workflow in the following ways:

  • Use GitHub's Protected Branches feature to protect your master branch against rogue PRs.
 
  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests.
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you do it the normal way, you often find that it is hard to test the functional code because the code has low testability.
 

Project Management → Revision Control →

Forking Flow

In the forking workflow, the 'official' version of the software is kept in a remote repo designated as the 'main repo'. All team members fork the main repo and create pull requests from their forks to the main repo.

To illustrate how the workflow goes, let’s assume Jean wants to fix a bug in the code. Here are the steps:

  1. Jean creates a separate branch in her local repo and fixes the bug in that branch.
  2. Jean pushes the branch to her fork.
  3. Jean creates a pull request from that branch in her fork to the main repo.
  4. Other members review Jean’s pull request.
  5. If reviewers suggested any changes, Jean updates the PR accordingly.
  6. When reviewers are satisfied with the PR, one of the members (usually the team lead or a designated 'maintainer' of the main repo) merges the PR, which brings Jean’s code to the main repo.
  7. Other members, realizing there is new code in the upstream repo, sync their forks with the new upstream repo (i.e. the main repo). This is done by pulling the new code to their own local repo and pushing the updated code to their own fork.

Documentation:

Recommended procedure for updating docs:

  1. Divide among yourselves who will update which parts of the document(s).
  2. Update the team repo by following the workflow mentioned above.

Update the following pages in your project repo:

  • About Us page: This page is used for module admin purposes. Please follow the format closely or else our scripts will not be able to give credit for your work.
    • Replace info of SE-EDU developers with info of your team, including a suitable photo as described here.
    • Including the name/photo of the supervisor/lecturer is optional.
    • The photo of a team member should be docs/images/github_username_in_lower_case.png e.g. docs/images/damithc.png. If your photo is in jpg format, name the file as .png anyway.
    • Indicate the different roles played and responsibilities held by each team member. You can reassign these roles and responsibilities (as explained in Admin Project Scope) later in the project, if necessary.
 
  • The purpose of the profile photo is for the teaching team to identify you. Therefore, you should choose a recent individual photo showing your face clearly (i.e., not too small) -- somewhat similar to a passport photo. Some examples can be seen in the 'Teaching team' page. Given below are some examples of good and bad profile photos.

  • If you are uncomfortable posting your photo due to security reasons, you can post a lower resolution image so that it is hard for someone to misuse that image for fraudulent purposes. If you are concerned about privacy, you can request permission to omit your photo from the page by writing to prof.

 

Roles indicate aspects you are in charge of and responsible for. E.g., if you are in charge of documentation, you are the person who should allocate which parts of the documentation are to be done by whom, ensure the document is in the right format, ensure consistency, etc.

This is a non-exhaustive list; you may define additional roles.

  • Team lead: Responsible for overall project coordination.
  • Documentation (short for ‘in charge of documentation’): Responsible for the quality of various project documents.
  • Testing: Ensures the testing of the project is done properly and on time.
  • Code quality: Looks after code quality, ensures adherence to coding standards, etc.
  • Deliverables and deadlines: Ensures project deliverables are done on time and in the right format.
  • Integration: In charge of versioning of the code, maintaining the code repository, integrating various parts of the software to create a whole.
  • Scheduling and tracking: In charge of defining, assigning, and tracking project tasks.
  • [Tool ABC] expert: e.g. Intellij expert, Git expert, etc. Helps other team members with matters related to the specific tool.
  • In charge of [Component XYZ]: e.g. In charge of Model, UI, Storage, etc. If you are in charge of a component, you are expected to know that component well, and review changes done to that component in v1.3-v1.4.

Please make sure each of the important roles is assigned to one person in the team. It is OK to have a 'backup' for each role, but for each aspect there should be one person who is unequivocally the person responsible for it.

  • Contact Us Page: Update to match your product.

  • README.adoc page: Update it to match your project.

    • Add a UI mockup of your intended final product.
      Note that the image of the UI should be docs/images/Ui.png so that it can be downloaded by our scripts. Limit the file to one screenshot/mockup only and ensure the new image has roughly the same height x width proportions as the original one. Reason: when we compile these images from all teams into one page (example), yours should not look out of place.

    • The original README.adoc file (which doubles as the landing page of your project website) is written to read like the introduction to an SE learning/teaching resource. You should restructure this page to look like the home page of a real product (not a school project) targeting real users, e.g. remove references to addressbook-level3, Learning Outcomes, etc.; mention target users; add a marketing blurb; etc. On a related note, also remove the Learning Outcomes link and related pages.

    • Update the link of the Travis build status badge (Build Status) so that it reflects the build status of your team repo.
      For the other badges,

      • either set up the respective tool for your project (AB-4 Developer Guide has instructions on how to set up AppVeyor and Coveralls) and update the badges accordingly,
      • or remove the badge.
    • Acknowledge the original source of the code i.e. AddressBook-Level4 project created by SE-EDU initiative at https://github.com/se-edu/

  • User Guide: Start moving the content from your User Guide (draft created in previous weeks) into the User Guide page in your repository. If a feature is not implemented, mark it as 'Coming in v2.0' (example).

  • Developer Guide: Similar to the User Guide, start moving the content from your Developer Guide (draft created in previous weeks) into the Developer Guide page in your team repository.

Product:

  • Each member can attempt to do a local-impact change to the code base.

    Objective: To familiarize yourself with at least one component of the product.

    Description: Divide the components among yourselves. Each member can do some small enhancements to their component(s) to learn the code of that component. Some suggested enhancements are given in the AddressBook-Level4 developer guide.

    Submission: Create PRs from your own fork to your team repo. Get it merged by following your team's workflow.



Project: Assessment

Note that project grading is not competitive (not bell curved). CS2103T projects will be assessed separately from CS2103 projects. This is to account for the perceived difference in workload. Given below is the marking scheme.

Total: 50 marks ( 40 individual marks + 10 team marks)

Evaluates: How well do your features fit together to form a cohesive product (not how many features or how big the features are)?

Based on: user guide and the product demo. The quality of the demo will be factored in as well.

💡 Features that fit well with the other features will earn more marks.

Evaluates:

A. Code quality/quantity:

How good your implementation is, in terms of the quality and the quantity of the code you have written yourself.

Based on: an inspection of the parts of the code you claim as written by you.

  • Ensure your code has at least some evidence of these (see here for more info)

    • logging
    • exceptions
    • assertions
    • defensive coding
  • Ensure there are no coding standard violations  e.g. all boolean variables/methods should sound like booleans (see the short example after this list). Checkstyle can prevent only some coding standard violations; others need to be checked manually.

  • Ensure SLAP is applied at a reasonable level. Long methods or deeply-nested code are symptoms of low SLAP and may be counted against your code quality.

  • Reduce code duplication  i.e. if there are multiple blocks of code that vary only in minor ways, try to extract out the similarities into one place, especially in test code.

  • In addition, try to apply as many of the code quality guidelines covered in the module as you can.
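
    As a rough illustration of the boolean naming point above (a sketch only; the variable names are hypothetical):

    boolean open;            // unclear: reads like a noun or a verb
    boolean isOpen;          // reads like a boolean
    boolean hasPendingJobs;  // reads like a boolean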

 

Code Quality

Introduction

Basic

Can explain the importance of code quality

Always code as if the person who ends up maintaining your code will be a violent psychopath who knows where you live. -- Martin Golding

Production code needs to be of high quality. Given how the world is becoming increasingly dependent on software, poor quality code is something we cannot afford to tolerate.

(Production code: code that is being used in an actual product with actual users.)

Guideline: Maximise Readability

Introduction

Can explain the importance of readability

Programs should be written and polished until they acquire publication quality. --Niklaus Wirth

Among various dimensions of code quality, such as run-time efficiency, security, and robustness, one of the most important is understandability. This is because in any non-trivial software project, code needs to be read, understood, and modified by other developers later on. Even if we do not intend to pass the code to someone else, code quality is still important because we all become 'strangers' to our own code someday.

The two code samples given below achieve the same functionality, but one is easier to read.

     

Bad

int subsidy() {
    int subsidy;
    if (!age) {
        if (!sub) {
            if (!notFullTime) {
                subsidy = 500;
            } else {
                subsidy = 250;
            }
        } else {
            subsidy = 250;
        }
    } else {
        subsidy = -1;
    }
    return subsidy;
}

  

Good

int calculateSubsidy() {
    int subsidy;
    if (isSenior) {
        subsidy = REJECT_SENIOR;
    } else if (isAlreadySubsidised) {
        subsidy = SUBSIDISED_SUBSIDY;
    } else if (isPartTime) {
        subsidy = FULLTIME_SUBSIDY * RATIO;
    } else {
        subsidy = FULLTIME_SUBSIDY;
    }
    return subsidy;
}

     

Bad

def calculate_subs():
    if not age:
        if not sub:
            if not not_fulltime:
                subsidy = 500
            else:
                subsidy = 250
        else:
            subsidy = 250
    else:
        subsidy = -1
    return subsidy

  

Good

def calculate_subsidy():
    if is_senior:
        return REJECT_SENIOR
    elif is_already_subsidised:
        return SUBSIDISED_SUBSIDY
    elif is_parttime:
        return FULLTIME_SUBSIDY * RATIO
    else:
        return FULLTIME_SUBSIDY

Basic

Avoid Long Methods

Can improve code quality using technique: avoid long methods

Be wary when a method is longer than the computer screen, and take corrective action when it goes beyond 30 LOC (lines of code). The bigger the haystack, the harder it is to find a needle.

Avoid Deep Nesting

Can improve code quality using technique: avoid deep nesting

If you need more than 3 levels of indentation, you're screwed anyway, and should fix your program. --Linux 1.3.53 CodingStyle

In particular, avoid arrowhead style code.

Example:
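
An illustrative sketch of arrowhead-style code and a flattened alternative (the isValid/isAuthorised/hasQuota checks and the handleError/process helpers are hypothetical):

Bad

if (isValid(request)) {
    if (isAuthorised(user)) {
        if (hasQuota(user)) {
            process(request);
        } else {
            handleError("quota exceeded");
        }
    } else {
        handleError("not authorised");
    }
} else {
    handleError("invalid request");
}

Good

if (!isValid(request)) {
    handleError("invalid request");
    return;
}
if (!isAuthorised(user)) {
    handleError("not authorised");
    return;
}
if (!hasQuota(user)) {
    handleError("quota exceeded");
    return;
}
process(request);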

Avoid Complicated Expressions

Can improve code quality using technique: avoid complicated expressions

Avoid complicated expressions, especially those having many negations and nested parentheses. If you must evaluate complicated expressions, have it done in steps (i.e. calculate some intermediate values first and use them to calculate the final value).

Example:

Bad

return ((length < MAX_LENGTH) || (previousSize != length)) && (typeCode == URGENT);

Good


boolean isWithinSizeLimit = length < MAX_LENGTH;
boolean isSameSize = previousSize != length;
boolean isValidCode = isWithinSizeLimit || isSameSize;

boolean isUrgent = typeCode == URGENT;

return isValidCode && isUrgent;

Example:

Bad

return ((length < MAX_LENGTH) or (previous_size != length)) and (type_code == URGENT)

Good

is_within_size_limit = length < MAX_LENGTH
is_same_size = previous_size != length
is_valid_code = is_within_size_limit or is_same_size

is_urgent = type_code == URGENT

return is_valid_code and is_urgent

The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. -- Edsger Dijkstra

Avoid Magic Numbers

Can improve code quality using technique: avoid magic numbers

When the code has a number that does not explain the meaning of the number, we call that a magic number (as in “the number appears as if by magic”). Using a named constant makes the code easier to understand because the name tells us more about the meaning of the number.

Example:

     

Bad

return 3.14236;
...
return 9;

  

Good

static final double PI = 3.14236;
static final int MAX_SIZE = 10;
...
return PI;
...
return MAX_SIZE-1;

Note: Python does not have a way to make a variable a constant. However, you can use a normal variable with an ALL_CAPS name to simulate a constant.

     

Bad

return 3.14236
...
return 9

  

Good

PI = 3.14236
MAX_SIZE = 10
...
return PI
...
return MAX_SIZE-1

Similarly, we can have ‘magic’ values of other data types.

Bad

"Error 1432"  // A magic string!

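A possible improvement (a sketch only; the constant name and its assumed meaning are hypothetical):

Good

static final String ERROR_INVALID_INPUT = "Error 1432";  // assuming, for illustration, that 1432 denotes an invalid input
...
return ERROR_INVALID_INPUT;
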
Make the Code Obvious

Can improve code quality using technique: make the code obvious

Make the code as explicit as possible, even if the language syntax allows it to be implicit. Here are some examples:

  • [Java] Use explicit type conversion instead of implicit type conversion.
  • [Java, Python] Use parentheses/braces to show grouping even when they can be skipped.
  • [Java, Python] Use enumerations when a certain variable can take only a small number of finite values. For example, instead of declaring the variable 'state' as an integer and using values 0,1,2 to denote the states 'starting', 'enabled', and 'disabled' respectively, declare 'state' as type SystemState and define an enumeration SystemState that has values 'STARTING', 'ENABLED', and 'DISABLED'.
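
A minimal sketch of the last point (the SystemState example follows the description above; the snippet is illustrative only):

Bad

int state = 0;  // 0 = starting, 1 = enabled, 2 = disabled

Good

enum SystemState { STARTING, ENABLED, DISABLED }

SystemState state = SystemState.STARTING;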

Intermediate

Structure Code Logically

Can improve code quality using technique: structure code logically

Lay out the code so that it adheres to the logical structure. The code should read like a story. Just like we use section breaks, chapters and paragraphs to organize a story, use classes, methods, indentation and line spacing in your code to group related segments of the code. For example, you can use blank lines to group related statements together. Sometimes, the correctness of your code does not depend on the order in which you perform certain intermediary steps. Nevertheless, this order may affect the clarity of the story you are trying to tell. Choose the order that makes the story most readable.
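
For example (an illustrative sketch; the method names are hypothetical), blank lines and comments can group related statements so that the code reads like a story:

// gather the input
String name = readName();
int age = readAge();

// compute the result
int subsidy = calculateSubsidy(age);

// present the output
display(name, subsidy);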

Do Not 'Trip Up' Reader

Can improve code quality using technique: do not 'trip up' reader

Avoid things that would make the reader go ‘huh?’, such as,

  • unused parameters in the method signature
  • similar things look different
  • different things that look similar
  • multiple statements in the same line
  • data flow anomalies such as, pre-assigning values to variables and modifying it without any use of the pre-assigned value
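
For example (an illustrative sketch; printTotal and the sum helper are hypothetical):

Bad

void printTotal(int[] items, boolean unusedFlag) {  // unused parameter
    int total = 0; total = sum(items);  // pre-assigned value never used; two statements on one line
    System.out.println(total);
}

Good

void printTotal(int[] items) {
    int total = sum(items);
    System.out.println(total);
}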

Practice KISSing

Can improve code quality using technique: practice kissing

As the old adage goes, "keep it simple, stupid” (KISS). Do not try to write ‘clever’ code. For example, do not dismiss the brute-force yet simple solution in favor of a complicated one because of some ‘supposed benefits’ such as 'better reusability' unless you have a strong justification.

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. --Brian W. Kernighan

Programs must be written for people to read, and only incidentally for machines to execute. --Abelson and Sussman

Avoid Premature Optimizations

Can improve code quality using technique: avoid premature optimizations

Optimizing code prematurely has several drawbacks:

  • We may not know which parts are the real performance bottlenecks. This is especially the case when the code undergoes transformations (e.g. compiling, minifying, transpiling, etc.) before it becomes an executable. Ideally, you should use a profiler tool to identify the actual bottlenecks of the code first, and optimize only those parts.
  • Optimizing can complicate the code, affecting correctness and understandability
  • Hand-optimized code can be harder for the compiler to optimize (the simpler the code, the easier for the compiler to optimize it). In many cases a compiler can do a better job of optimizing the runtime code if you don't get in the way by trying to hand-optimize the source code.

A popular saying in the industry is make it work, make it right, make it fast, which means in most cases getting the code to perform correctly should take priority over optimizing it. If the code doesn't work correctly, it has no value no matter how fast/efficient it is.

Premature optimization is the root of all evil in programming. --Donald Knuth

Note that there are cases where optimizing takes priority over other things, e.g. when writing code for resource-constrained environments. This guideline is simply a caution that you should optimize only when it is really needed.

SLAP Hard

Can improve code quality using technique: SLAP hard

Avoid varying the level of abstraction within a code fragment. Note: The Productive Programmer (by Neal Ford) calls this the SLAP principle i.e. Single Level of Abstraction Per method.

Example:

Bad

readData();
salary = basic*rise+1000;
tax = (taxable?salary*0.07:0);
displayResult();

Good

readData();
processData();
displayResult();
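
Continuing the sketch one level down (illustrative only; the salary/tax calculations are taken from the 'Bad' snippet above), each extracted method also stays at a single level of abstraction:

void processData() {
    salary = calculateSalary();
    tax = calculateTax();
}

int calculateSalary() {
    return basic * rise + 1000;
}

double calculateTax() {
    return taxable ? salary * 0.07 : 0;
}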
 

Design → Design Fundamentals → Abstraction →

What

Abstraction is a technique for dealing with complexity. It works by establishing a level of complexity we are interested in, and suppressing the more complex details below that level.

The guiding principle of abstraction is that only details that are relevant to the current perspective or the task at hand need to be considered. As most programs are written to solve complex problems involving large amounts of intricate details, it is impossible to deal with all these details at the same time. That is where abstraction can help.

Ignoring lower level data items and thinking in terms of bigger entities is called data abstraction.

Within a certain software component, we might deal with a user data type, while ignoring the details contained in the user data item such as name and date of birth. These details have been ‘abstracted away’ as they do not affect the task of that software component.
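
A minimal sketch of that idea (hypothetical classes): the MailingList component works with User as a single higher-level entity; the details inside User do not affect its task.

import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

class User {
    private String name;
    private LocalDate dateOfBirth;
    // ... other details that MailingList does not need to know about
}

class MailingList {
    private final List<User> subscribers = new ArrayList<>();

    // works with 'a user' as a whole; name, date of birth, etc. are abstracted away
    void subscribe(User user) {
        subscribers.add(user);
    }
}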

Control abstraction abstracts away details of the actual control flow to focus on tasks at a simplified level.

print(“Hello”) is an abstraction of the actual output mechanism within the computer.

Abstraction can be applied repeatedly to obtain progressively higher levels of abstractions.

An example of different levels of data abstraction: a File is a data item that is at a higher level than an array and an array is at a higher level than a bit.

An example of different levels of control abstraction: execute(Game) is at a higher level than print(Char) which is at a higher level than an Assembly language instruction MOV.

Abstraction is a general concept that is not limited to just data or control abstractions.

Some more general examples of abstraction:

  • An OOP class is an abstraction over related data and behaviors.
  • An architecture is a higher-level abstraction of the design of a software.
  • Models (e.g., UML models) are abstractions of some aspect of reality.

Advanced

Make the Happy Path Prominent

Can improve code quality using technique: make the happy path prominent

The happy path (i.e. the execution path taken when everything goes well) should be clear and prominent in your code. Restructure the code to make the happy path unindented as much as possible. It is the ‘unusual’ cases that should be indented. Someone reading the code should not get distracted by alternative paths taken when error conditions happen. One technique that could help in this regard is the use of guard clauses.

Example:

Bad

if (!isUnusualCase) {  //detecting an unusual condition
    if (!isErrorCase) {
        start();    //main path
        process();
        cleanup();
        exit();
    } else {
        handleError();
    }
} else {
    handleUnusualCase(); //handling that unusual condition
}

In the code above,

  • Unusual condition detection is separated from its handling.
  • The main path is deeply nested.

Good

if (isUnusualCase) { //Guard Clause
    handleUnusualCase();
    return;
}

if (isErrorCase) { //Guard Clause
    handleError();
    return;
}

start();
process();
cleanup();
exit();

In contrast, the above code

  • deals with unusual conditions as soon as they are detected so that the reader doesn't have to remember them for long.
  • keeps the main path un-indented.

Guideline: Follow a Standard

Introduction

Can explain the need for following a standard

One essential way to improve code quality is to follow a consistent style. That is why software engineers follow a strict coding standard (aka style guide).

The aim of a coding standard is to make the entire code base look like it was written by one person. A coding standard is usually specific to a programming language and specifies guidelines such as the location of opening and closing braces, indentation styles and naming styles (e.g. whether to use Hungarian style, Pascal casing, Camel casing, etc.). It is important that the whole team/company uses the same coding standard and that the standard does not deviate widely from typical industry practices. If a company's coding standard is very different from what is typically used in the industry, new recruits will take longer to get used to the company's coding style.

💡 IDEs can help to enforce some parts of a coding standard e.g. indentation rules.

What is the recommended approach regarding coding standards?

c

What is the aim of using a coding standard? How does it help?

Basic

Can follow simple mechanical style rules

Learn basic guidelines of the Java coding standard (by OSS-Generic)

Consider the code given below:

import java.util.*;

public class Task {
    public static final String descriptionPrefix = "description: ";
    private String description;
    private boolean important;
    List<String> pastDescription = new ArrayList<>(); // a list of past descriptions

    public Task(String d) {
      this.description = d;
      if (!d.isEmpty())
          this.important = true;
    }

    public String getAsXML() { return "<task>"+description+"</task>"; }

    /**
     * Print the description as a string.
     */
    public void printingDescription(){ System.out.println(this); }

    @Override
    public String toString() { return descriptionPrefix + description; }
}

In what ways does the code violate the basic guidelines (i.e., those marked with one ⭐️) of the OSS-Generic Java Coding Standard given here?

Here are three:

  • descriptionPrefix is a constant and should be named DESCRIPTION_PREFIX
  • method name printingDescription() should be named as printDescription()
  • boolean variable important should be named to sound boolean e.g., isImportant

There are many more.

Intermediate

Can follow intermediate style rules

Go through the provided Java coding standard and learn the intermediate style rules.

According to the given Java coding standard, which one of these is not a good name?

b

Explanation: checkWeight is an action. Naming variables as actions makes the code harder to follow. isWeightValid may be a better name.

Repeat the exercise in the panel below but also find violations of intermediate level guidelines.

Consider the code given below:

import java.util.*;

public class Task {
    public static final String descriptionPrefix = "description: ";
    private String description;
    private boolean important;
    List<String> pastDescription = new ArrayList<>(); // a list of past descriptions

    public Task(String d) {
      this.description = d;
      if (!d.isEmpty())
          this.important = true;
    }

    public String getAsXML() { return "<task>"+description+"</task>"; }

    /**
     * Print the description as a string.
     */
    public void printingDescription(){ System.out.println(this); }

    @Override
    public String toString() { return descriptionPrefix + description; }
}

In what ways does the code violate the basic guidelines (i.e., those marked with one ⭐️) of the OSS-Generic Java Coding Standard given here?

Here are three:

  • descriptionPrefix is a constant and should be named DESCRIPTION_PREFIX
  • method name printingDescription() should be named as printDescription()
  • boolean variable important should be named to sound boolean e.g., isImportant

There are many more.

Here's one you are more likely to miss:

  • The Javadoc comment Print the description as a string. should read Prints the description as a string. (i.e., phrased in the third person).

There are more.

Guideline: Name Well

Introduction

Can explain the need for good names in code

Proper naming improves the readability. It also reduces bugs caused by ambiguities regarding the intent of a variable or a method.

There are only two hard things in Computer Science: cache invalidation and naming things. -- Phil Karlton

Basic

Use Nouns for Things and Verbs for Actions

Can improve code quality using technique: use nouns for things and verbs for actions

Every system is built from a domain-specific language designed by the programmers to describe that system. Functions are the verbs of that language, and classes are the nouns. ― Robert C. Martin, Clean Code: A Handbook of Agile Software Craftsmanship

Use nouns for classes/variables and verbs for methods/functions.

Examples:

Name for a | Bad | Good
Class | CheckLimit | LimitChecker
Method | result() | calculate()

Distinguish clearly between single-valued and multivalued variables.

Examples:

Good

Person student;
ArrayList<Person> students;

Good

student = Person('Jim')
students = [Person('Jim'), Person('Alice')]

Use Standard Words

Can improve code quality using technique: use standard words

Use correct spelling in names. Avoid 'texting-style' spelling. Avoid foreign language words, slang, and names that are only meaningful within specific contexts/times, e.g. terms from private jokes or a TV show currently popular in your country.

Intermediate

Use Name to Explain

Can improve code quality using technique: use name to explain

A name is not just for differentiation; it should explain the named entity to the reader accurately and at a sufficient level of detail.

Examples:

Bad | Good
processInput() (what 'process'?) | removeWhiteSpaceFromInput()
flag | isValidInput
temp |

If the name has multiple words, they should be in a sensible order.

Examples:

Bad | Good
bySizeOrder() | orderBySize()

Imagine going to the doctor's and saying "My eye1 is swollen"! Don’t use numbers or case to distinguish names.

Examples:

Bad | Bad | Good
value1, value2 | value, Value | originalValue, finalValue

Not Too Long, Not Too Short

Can improve code quality using technique: not too long, not too short

While it is preferable not to have lengthy names, names that are 'too short' are even worse. If you must abbreviate or use acronyms, do it consistently. Explain their full meaning at an obvious location.

Avoid Misleading Names

Can improve code quality using technique: avoid misleading names

Related things should be named similarly, while unrelated things should NOT.

Example: Consider these variables

  • colorBlack : hex value for color black
  • colorWhite : hex value for color white
  • colorBlue : number of times blue is used
  • hexForRed : hex value for color red

This is misleading because colorBlue is named similarly to colorWhite and colorBlack but has a different purpose, while hexForRed is named differently but has a very similar purpose to the first two variables. The following is better:

  • hexForBlack hexForWhite hexForRed
  • blueColorCount

Avoid misleading or ambiguous names (e.g. those with multiple meanings), similar sounding names, hard-to-pronounce ones (e.g. avoid ambiguities like "is that a lowercase L, capital I or number 1?", or "is that number 0 or letter O?"), almost similar names.

Examples:

Bad | Good | Reason
phase0 | phaseZero | Is that a zero or the letter O?
rwrLgtDirn | rowerLegitDirection | Hard to pronounce
right, left, wrong | rightDirection, leftDirection, wrongResponse | Does right mean 'correct' or the opposite of 'left'?
redBooks, readBooks | redColorBooks, booksRead | red and read (past tense) sound the same
FiletMignon | egg | If the requirement is just a name of a food, egg is a much easier choice to type/say than FiletMignon

Guideline: Avoid Unsafe Shortcuts

Introduction

Can explain the need for avoiding error-prone shortcuts

It is safer to use language constructs in the way they are meant to be used, even if the language allows shortcuts. Some such coding practices are common sources of bugs. Know them and avoid them.

Basic

Use the Default Branch

Can improve code quality using technique: use the default branch

Always include a default branch in case statements.

Furthermore, use it for the intended default action and not just to execute the last option. If there is no default action, you can use the 'default' branch to detect errors (i.e. if execution reached the default branch, throw an exception). This also applies to the final else of an if-else construct. That is, the final else should mean 'everything else', and not the final option. Do not use else when an if condition can be explicitly specified, unless there is absolutely no other possibility.

Bad

if (red) print "red";
else print "blue";

Good

if (red) print "red";
else if (blue) print "blue";
else error("incorrect input");
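
The same idea applies to switch statements. A minimal sketch (assuming a hypothetical Color enum): the default branch detects unexpected values instead of silently handling the 'last' option.

switch (color) {
case RED:
    System.out.println("red");
    break;
case BLUE:
    System.out.println("blue");
    break;
default:
    // reaching here means an unexpected value; detect it instead of guessing
    throw new IllegalStateException("unknown color: " + color);
}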

Don't Recycle Variables or Parameters

Can improve code quality using technique: don't recycle variables or parameters

  • Use one variable for one purpose. Do not reuse a variable for a different purpose other than its intended one, just because the data type is the same.
  • Do not reuse formal parameters as local variables inside the method.

Bad

double computeRectangleArea(double length, double width) {
    length = length * width;
    return length;
}

Good

double computeRectangleArea(double length, double width) {
    double area;
    area = length * width;
    return area;
}

Avoid Empty Catch Blocks

Can improve code quality using technique: avoid empty catch blocks

Never write an empty catch statement. At least give a comment to explain why the catch block is left empty.
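
A minimal sketch (hypothetical config-loading method): even when ignoring an exception is intentional, say so in a comment.

Bad

try {
    loadOptionalConfig();
} catch (IOException e) {
}

Good

try {
    loadOptionalConfig();
} catch (IOException e) {
    // Safe to ignore: the config file is optional and defaults are already in place.
}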

Delete Dead Code

Can improve code quality using technique: delete dead code

We all feel reluctant to delete code we have painstakingly written, even if we have no use for that code any more ("I spent a lot of time writing that code; what if we need it again?"). Consider all code as baggage you have to carry; get rid of unused code the moment it becomes redundant. If you need that code again, simply recover it from the revision control tool you are using. Deleting code you wrote previously is a sign that you are improving.

Intermediate

Minimise Scope of Variables

Can improve code quality using technique: minimise scope of variables

Minimize global variables. Global variables may be the most convenient way to pass information around, but they do create implicit links between code segments that use the global variable. Avoid them as much as possible.

Define variables in the least possible scope. For example, if the variable is used only within the if block of the conditional statement, it should be declared inside that if block.

The most powerful technique for minimizing the scope of a local variable is to declare it where it is first used. -- Effective Java, by Joshua Bloch
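
A minimal sketch (hypothetical helper methods): message is confined to the if block that needs it, and errorCount is declared where it is first used.

if (hasErrors()) {
    String message = buildErrorReport(); // needed only inside this block
    log(message);
}

int errorCount = countErrors(); // declared at first use, not at the top of the method
notifyUser(errorCount);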

Minimise Code Duplication

Can improve code quality using technique: minimise code duplication

Code duplication, especially when you copy-paste-modify code, often indicates a poor quality implementation. While it may not be possible to have zero duplication, always think twice before duplicating code; most often there is a better alternative.

This guideline is closely related to the DRY Principle.

Supplementary → Principles →

DRY Principle

DRY (Don't Repeat Yourself) Principle: Every piece of knowledge must have a single, unambiguous, authoritative representation within a system. -- The Pragmatic Programmer, by Andy Hunt and Dave Thomas

This principle guards against duplication of information.

A functionality implemented twice is a violation of the DRY principle, even if the two implementations are different.

The value of a system-wide timeout being defined in multiple places is a violation of DRY.
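
A minimal sketch of that timeout example (hypothetical constant and object names): keeping the value in one authoritative place means a change needs to be made only once.

public static final int SYSTEM_TIMEOUT_SECONDS = 30;

connection.setTimeout(SYSTEM_TIMEOUT_SECONDS);  // every usage refers to the single definition
httpRequest.setTimeout(SYSTEM_TIMEOUT_SECONDS); // instead of repeating the literal 30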

Guideline: Comment Minimally, but Sufficiently

Introduction

Can explain the need for commenting minimally but sufficiently

Good code is its own best documentation. As you’re about to add a comment, ask yourself, ‘How can I improve the code so that this comment isn’t needed?’ Improve the code and then document it to make it even clearer. --Steve McConnell, Author of Code Complete

Some think commenting heavily increases the 'code quality'. This is not so. Avoid writing comments to explain bad code. Improve the code to make it self-explanatory.

Basic

Do Not Repeat the Obvious

Can improve code quality using technique: do not repeat the obvious

If the code is self-explanatory, refrain from repeating the description in a comment just for the sake of 'good documentation'.

Bad

// increment x
x++;

//trim the input
trimInput();

Write to the Reader

Can improve code quality using technique: write to the reader

Do not write comments as if they are private notes to self. Instead, write them well enough to be understood by another programmer. One type of comments that is almost always useful is the header comment that you write for a class or an operation to explain its purpose.

Examples:

Bad Reason: this comment will only make sense to the person who wrote it

// a quick trim function used to fix bug I detected overnight
void trimInput(){
    ....
}

Good

/** Trims the input of leading and trailing spaces */
void trimInput(){
    ....
}

Bad Reason: this comment will only make sense to the person who wrote it

# a quick trim function used to fix bug I detected overnight
def trim_input():
    ...

Good

def trim_input():
    """Trim the input of leading and trailing spaces"""
    ...

Intermediate

Explain WHAT and WHY, not HOW

Can improve code quality using technique: explain what and why, not how

Comments should explain what and why aspect of the code, rather than the how aspect.

What : The specification of what the code is supposed to do. The reader can compare such comments to the implementation to verify if the implementation is correct.

Example: This method is possibly buggy because the implementation does not seem to match the comment. In this case the comment could help the reader to detect the bug.

/** Removes all spaces from the {@code input} */
void compact(String input){
    input.trim();
}

Why : The rationale for the current implementation.

Example: Without this comment, the reader will not know the reason for calling this method.

// Remove spaces to comply with IE23.5 formatting rules
compact(input);

How : The explanation for how the code works. This should already be apparent from the code, if the code is self-explanatory. Adding comments to explain the same thing is redundant.

Example:

Bad Reason: Comment explains how the code works.

// return true if both left end and right end are correct or the size has not incremented
return (left && right) || (input.size() == size);

Good Reason: Code refactored to be self-explanatory. Comment no longer needed.


boolean isSameSize = (input.size() == size);
return (isLeftEndCorrect && isRightEndCorrect) || isSameSize;


B. Depth and completeness of your feature

Evaluates: How good is your Quality Assurance?

Based on: (1) your test code, (2) our own manual testing, (3) your performance in the v1.4 Practical Exam, and (4) bugs found during the PE.

Relevant: [Admin Deliverables → Practical Exam ]

 

Objectives:

  • Evaluate your manual testing skills, product evaluation skills, effort estimation skills
  • Peer-evaluate your product design, implementation effort, documentation quality

When, where: Week 13 lecture

Grading:

  • Your performance in the practical exam will be considered for your final grade (under the QA category and under Implementation category, about 10 marks in total).
  • You will be graded based on your effectiveness as a tester (e.g., the percentage of the bugs you found, the nature of the bugs you found) and how far off your evaluation/estimates are from the evaluator consensus. Explanation: we understand that you have limited expertise in this area; hence, we penalize only if your inputs don't seem to be based on a sincere effort to test/evaluate.
  • The bugs found in your product by others will affect your v1.4 marks. You will be given a chance to reject false-positive bug reports.

Preparation:

  • Ensure that you can access the relevant issue tracker given below:
    -- for PE Dry Run (at v1.3): nus-cs2103-AY1819S1/pe-dry-run
    -- for PE (at v1.4): nus-cs2103-AY1819S1/pe (will open only near the actual PE)

  • Ensure you have access to a computer that is able to run module projects  e.g. has the right Java version.

  • Have a good screen grab tool with annotation features so that you can quickly take a screenshot of a bug, annotate it, and post in the issue tracker.

    • 💡 You can use Ctrl+V to paste a picture from the clipboard into a text box in GitHub issue tracker.
  • Charge your computer before coming to the PE session. The testing venue may not have enough charging points.

During:

  1. Take note of your team to test. It will be given to you by the teaching team (distributed via IVLE gradebook).
  2. Download from IVLE all files submitted by the team (i.e. jar file, User Guide, Developer Guide, and Project Portfolio Pages) into an empty folder.
  3. [~40 minutes] Test the product and report bugs as described below:
Testing instructions for PE and PE Dry Run
  • What to test:

    • PE Dry Run (at v1.3):
      • Test the product based on the User Guide (the UG is most likely accessible using the help command).
      • Do system testing first i.e., does the product work as specified by the documentation? If there is time left, you can do acceptance testing as well i.e., does the product solve the problem it claims to solve?
    • PE (at v1.4):
      • Test based on the Developer Guide (Appendix named Instructions for Manual Testing) and the User Guide. The testing instructions in the Developer Guide can provide you some guidance but if you follow those instructions strictly, you are unlikely to find many bugs. You can deviate from the instructions to probe areas that are more likely to have bugs.
      • Do system testing only i.e., verify actual behavior against documented behavior. Do not do acceptance testing.
  • What not to test:

    • Omit features that are driven by GUI inputs (e.g. buttons, menus, etc.). Reason: Only CLI-driven features can earn credit, as per the given project constraints. Some features might have both GUI-driven and CLI-driven ways to invoke them, in which case test only the CLI-driven way of invoking them.
    • Omit features that existed in AB4.
  • These are considered bugs:

    • Behavior differs from the User Guide
    • A legitimate user behavior is not handled e.g. incorrect commands, extra parameters
    • Behavior is not specified and differs from normal expectations e.g. error message does not match the error
    • Problems in the User Guide e.g., missing/incorrect info
  • Where to report bugs: Post bugs in the following issue trackers (not in the team's repo):

  • Bug report format:

    • Post bugs as you find them (i.e., do not wait to post all bugs at the end) because the issue tracker will close exactly at the end of the allocated time.
    • Do not use team ID in bug reports. Reason: to prevent others copying your bug reports
    • Each bug should be a separate issue.
    • Write good quality bug reports; poor quality or incorrect bug reports will not earn credit.
    • Use a descriptive title.
    • Give a good description of the bug with steps to reproduce and screenshots.
    • Assign a severity to the bug report. Bug reports without a severity label are considered severity.Low (lower severity bugs earn lower credit):

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • About posting suggestions:

    • PE Dry Run (at v1.3): You can also post suggestions on how to improve the product. 💡 Be diplomatic when reporting bugs or suggesting improvements. For example, instead of criticising the current behavior, simply suggest alternatives to consider.
    • PE (at v1.4): Do not post suggestions.
  • If the product doesn't work at all: If the product fails catastrophically e.g., cannot even launch, you can test the product of the fallback team allocated to you. But in this case you must inform us immediately after the session so that we can send your bug reports to the correct team.

  4. [~50 minutes] Evaluate the following aspects. Note down your evaluation in a hard copy (as a backup). Submit via TEAMMATES.

    • A. Cohesiveness of product features []: Do the features fit together and match the stated target user and the value proposition?

      • unable to judge: You are unable to judge this aspect for some reason.
      • low: One of these
        • target user is too general  i.e. wider than AB4
        • target user and/or value proposition not clear from the user guide
        • features don't seem to fit together for the most part
      • medium: Some features fit together but some don't.
      • high: All features fit together but the features are not very high value to the target user.
      • excellent: The target user is clearly defined (not too general) and almost all new features are of high-value to the target user. i.e. the product is very attractive to the target user.
    • B. Quality of user docs []: Evaluate based on the parts of the user guide written by the person, as reproduced in the project portfolio. Evaluate from an end-user perspective.

      • unable to judge: Less than 1 page worth of UG content written by the student.
      • low: Hard to understand, often inaccurate or missing important information.
      • medium: Needs some effort to understand; some information is missing.
      • high: Mostly easy to follow. Only a few areas need improvements.
      • excellent: Easy to follow and accurate. Just enough information, visuals, examples etc. (not too much either). Understandable to the target end user.
    • C. Quality of developer docs []: Evaluate based on the developer docs cited/reproduced in the respective project portfolio page. Evaluate from the perspective of a new developer trying to understand how the features are implemented.

      • unable to judge: One of these
        • less than 0.5 pages worth of content.
        • other problems in the document  e.g. it looks like the wrong content was included.
      • low: One of these
        • Very small amount of content (i.e., 0.5 - 1 page).
        • Hardly any use to the reader (i.e., content doesn't make much sense or redundant).
        • Uses ad-hoc diagrams where UML diagrams could have been used instead.
        • Multiple notation errors in UML diagrams.
      • medium: Some diagrams, some descriptions, but does not help the reader that much  e.g. overly complicated diagrams.
      • high: Enough diagrams (at least two kinds of UML diagrams used) and enough descriptions (about 2 pages worth) but explanations are not always easy to follow.
      • excellent: Easy to follow. Just enough information (not too much). Minimum repetition of content/diagrams. Good use of diagrams to complement text descriptions. Easy to understand diagrams with just enough details rather than very complicated diagrams that are hard to understand.
    • D. Depth of feature []: Evaluate the feature done by the student for difficulty, depth, and completeness. Note: examples given below assume that AB4 did not have the commands edit, undo, and redo.

      • unable to judge: You are unable to judge this aspect for some reason.
      • low : An easy feature  e.g. make the existing find command case insensitive.
      • medium : Moderately difficult feature, barely acceptable implementation  e.g. an edit command that requires the user to type all fields, even the ones that are not being edited.
      • high: One of the below
        • A moderately difficult feature but fully implemented  e.g. an edit command that allows editing any field.
        • A difficult feature with a reasonable implementation but some aspects are not covered  e.g. undo/redo command that only allows a single undo/redo.
      • excellent: A difficult feature, all reasonable aspects are fully implemented  e.g. undo/redo command that allows multiple undo/redo.
    • E. Amount of work []: Evaluate the amount of work, on a scale of 0 to 30.

      • Consider this PR (history command) as 5 units of effort, which means this PR (undo/redo command) is about 15 units of effort. Given that 30 units matches an effort twice that needed for the undo/redo feature (which was given as an example of an A grade project), we expect most students to have efforts lower than 20.
      • Consider the main feature only. Exclude GUI inputs, but consider GUI outputs of the feature. Count all implementation/testing/documentation work as mentioned in that person's PPP. Also look at the actual code written by the person. We understand that it is not possible to know exactly which part of the code is for the main feature; make a best-guess judgement call based on the available info.
      • Do not give a high value just to be nice. If your estimate is wildly inaccurate, it means you are unable to estimate the effort required to implement a feature in a project that you are supposed to know well at this point. You will lose marks if that is the case.

Processing PE Bug Reports:

There will be a review period for you to respond to the bug reports you received.

Duration: The review period will start around 1 day after the PE (exact time to be announced) and will last until the following Wednesday midnight. However, you are recommended to finish this task ASAP, to minimize cutting into your exam preparation work.

Bug reviewing is recommended to be done as a team as some of the decisions need team consensus.

Instructions for Reviewing Bug Reports

  • First, don't freak out if there are a lot of bug reports. Many can be duplicates and some can be false positives. In any case, we anticipate that all of these products will have some bugs, and our penalty for bugs is not harsh. Furthermore, the penalty depends on the severity of the bug; some bugs may not even be penalized.

  • Do not edit the subject or the description. Do not close bug reports. Your response (if any) should be added as a comment.

  • If the bug is reported multiple times, mark all copies EXCEPT one as duplicates using the duplicate tag (if the duplicates have different severity levels, you should keep the one with the highest severity). In addition, use this technique to indicate which issue they are duplicates of. Duplicates can be omitted from processing steps given below.

  • If a bug seems to be for a different product (i.e. wrongly assigned to your team), let us know (email prof).

  • Decide if it is a real bug and apply ONLY one of these labels.

Response Labels:

  • response.Accepted: You accept it as a bug.
  • response.Rejected: What the tester treated as a bug is in fact the expected behavior. The penalty for rejecting a bug using an unjustifiable explanation is higher than the penalty if the same bug was accepted. You can reject bugs that you inherited from AB4.
  • response.CannotReproduce: You are unable to reproduce the behavior reported in the bug after multiple tries.
  • response.IssueUnclear: The issue description is not clear.
  • If applicable, decide the type of the bug. Bugs without a type- label are considered type-FunctionalityBug by default (which is liable to a heavier penalty):

Bug Type Labels:

  • type-FunctionalityBug : the bug is a flaw in how the product works.
  • type-DocumentationBug : the bug is in the documentation.
  • If you disagree with the original severity assigned to the bug, you may change it to the correct level, in which case add a comment justifying the change. All such changes will be double-checked by the teaching team and unreasonable lowering of severity will be penalized extra:

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • Decide who should fix the bug. Use the Assignees field to assign the issue to that person(s). There is no need to actually fix the bug though. It's simply an indication/acceptance of responsibility. If there is no assignee, we will distribute the penalty for that bug (if any) among all team members.

  • Add an explanatory comment explaining your choice of labels and assignees.

  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests.
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you do it the normal way, you often find that it is hard to test the functional code because the code has low testability.

Evaluates: How good are the sections you wrote for the user guide and the developer guide?

Based on: the relevant sections of your project portfolio. Criteria considered:

  • Explanation should be clear and written to match the audience.
  • Good use of visuals to complement text.
  • Use of correct UML notations (where applicable)

A. Process:

Evaluates: How well you did in project management related aspects of the project, as an individual and as a team

Based on: Supervisor observations of project milestones and GitHub data.

Milestones need to be reached by the midnight before the tutorial for them to be counted as achieved. To get a good grade for this aspect, achieve at least 60% of the recommended milestone progress.

Other criteria:

  • Good use of GitHub milestones
  • Good use of GitHub release mechanism
  • Good version control, based on the repo
  • Reasonable attempt to use the forking workflow
  • Good task definition, assignment and tracking, based on the issue tracker
  • Good use of buffers (opposite: everything at the last minute)
  • Project done iteratively and incrementally (opposite: doing most of the work in one big burst)

B. Team-based tasks:

Evaluates: how much you contributed to common team-based tasks

Based on: peer evaluations and tutor observations

Relevant: [Admin Project Scope → Examples of team tasks ]

 

Here is a non-exhaustive list of team-tasks:

  1. Necessary general code enhancements e.g.,
    1. Work related to renaming the product
    2. Work related to changing the product icon
    3. Morphing the product into a different product
  2. Setting up GitHub, Travis, AppVeyor, etc.
  3. Maintaining the issue tracker
  4. Release management
  5. Updating user/developer docs that are not specific to a feature  e.g. documenting the target user profile
  6. Incorporating more useful tools/libraries/frameworks into the product or the project workflow (e.g. automate more aspects of the project workflow using a GitHub plugin)



Project: Supervision

Your tutor will serve as your project supervisor too.

The supervisor's main job is to observe, facilitate self/peer learning, evaluate, and give feedback.

Tutorial time is the main avenue for meeting your supervisor. In addition, you can meet the supervisor before/after the tutorial, or any other time, as many times as you need, subject to availability in his/her schedule.

Note that it is not the supervisor’s job to chase you down and give help. It is up to you to get as much feedback from the supervisor as you need. You are free to request more feedback from the supervisor as necessary. Similarly, it is not the job of the supervisor to lead your project to success.



Peer Evaluations

We use the TEAMMATES online peer evaluation system to conduct several rounds of peer-evaluations. All peer evaluations will be taken into account when determining your participation marks. The system also allows you to give anonymous feedback to your teammates.

Extra Requirements: [considered for participation marks]

  • Submitting peer evaluations is compulsory. If you routinely miss submitting peer evaluations, you can lose participation marks.
  • 💡 TEAMMATES normally allows students to access it without using Google login. In this module, we encourage (but not require) you to login to TEAMMATES using your Google account and complete your profile with a suitable profile photo. Reason: CS2103/T is a big class. This profile helps us to remember you better, even after the module is over.
 
  • The purpose of the profile photo is for the teaching team to identify you. Therefore, you should choose a recent individual photo showing your face clearly (i.e., not too small) -- somewhat similar to a passport photo. Some examples can be seen in the 'Teaching team' page. Given below are some examples of good and bad profile photos.

  • If you are uncomfortable posting your photo due to security reasons, you can post a lower resolution image so that it is hard for someone to misuse that image for fraudulent purposes. If you are concerned about privacy, you can request permission to omit your photo from the page by writing to prof.

Peer evaluation criteria: professional conduct

  • Professional Communication :
    • Communicates sufficiently and professionally. e.g. Does not use offensive language or excessive slang in project communications.
    • Responds to communication from team members in a timely manner (e.g. within 24 hours).
  • Punctuality: Does not cause others to waste time or slow down project progress by frequent tardiness.
  • Dependability: Promises what can be done, and delivers what was promised.
  • Effort: Puts in sufficient effort to, and tries their best to keep up with the module/project pace. Seeks help from others when necessary.
  • Quality: Does not deliver work products that seem to be below the student's competence level i.e. tries their best to make the work product as high quality as possible within their competency level.
  • Meticulousness:
    • Rarely overlooks submission requirements.
    • Rarely misses compulsory module activities such as pre-module survey.
  • Teamwork: How willing are you to act as part of a team, contribute to team-level tasks, adhere to team decisions, etc.

Peer evaluation criteria: competency

  • Technical Competency: Able to gain competency in all the required tools and techniques.
  • Mentoring skills: Helps others when possible. Able to mentor others well.
  • Communication skills: Able to communicate (written and spoken) well. Takes initiative in discussions.

Giving constructive feedback to others is a valuable skill for software engineers. It is also an intended learning outcome of this module. Half-hearted/trivial feedback will not earn participation marks.

Here are some things to keep in mind:

  • Assume you are giving feedback to a colleague, not a friend. Keep the tone of your feedback reasonably professional. Do not use offensive language or slang.
  • The feedback should be honest and consistent. Giving positive qualitative feedback (e.g. Thanks for all the hard work!) and negative ratings (e.g. Equal share - 40%) to the same team member is not being honest.
  • State your expectations early. All too often students give positive/neutral feedback early (hoping that the team member will improve later) and trash the team member in the final evaluation (because he/she did not improve as expected). However, this could be confusing to the recipient. It is better to give negative feedback early so that the team member gets a clear signal that he/she needs to improve.


Tools


Learning Management System: This module website is the main source of information for the module. In addition, we use IVLE for some things (e.g., announcements, file submissions, grade book, ...) and LumiNUS for lecture webcasts (reason: IVLE no longer supports webcasts).

Collaboration platform: You are required to use GitHub as the hosting and collaboration platform of your project (i.e., to hold the Code repository, Issue Tracker, etc.). See Appendix E for more info on how to setup and use GitHub for your project.

Communication: Keeping a record of communications among your team can help you, and us, in many ways. We encourage you to do at least some of the project communication in written medium (e.g., GitHub Issue Tracker) to practice how to communicate technical things in written form.

  • Instead of the IVLE forum, we encourage you to post your questions/suggestions in this forum: github/nus-cs2103-AY1819S1/forum.
  • Alternatively, you can post in our slack channel https://nus-cs2103-ay1819s1.slack.com. We encourage you all to join the slack channel (you'll need to use an email address ending in @nus.edu.sg, @comp.nus.edu.sg, @u.nus.edu.sg or @u.nus.edu to join this channel).
  • Note that slack is useful for quick chats while issue tracker is useful for longer-running conversations.
  • You are encouraged to use channels with a wider audience (common channel in slack, GitHub issue tracker) for module-related communication as much as possible, rather than private channels such as private slack/FB messages or direct emails. Rationale: more classmates can benefit from the discussions.

IDE: You are recommended to use Intellij IDEA for module-related programming work. You may use the community edition (free) or the ultimate edition (free for students). While the use of Intellij is not compulsory, note that module materials are optimized for Intellij. Use other IDEs at your own risk.

Revision control: You are required to use Git. Other revision control software are not allowed.
The recommended GUI client for Git is SourceTree (which comes bundled with Git), but you may use any other, or none.

Analyzing code authorship: We use a custom-built tool called RepoSense for extracting code written by each person.

In previous semesters we asked students to annotate all their code using special @@author tags so that we can extract each student's code for grading. This semester, we are trying out a new tool called RepoSense that is expected to reduce the need for such tagging, and also make it easier for you to see (and learn from) code written by others.

Figure: RepoSense Report Features

1. View the current status of code authorship data:

  • The report generated by the tool is available at Project Code Dashboard (BETA). The feature that is most relevant to you is the Code Panel (shown on the right side of the screenshot above). It shows the code attributed to a given author. You are welcome to play around with the other features (they are still under development and will not be used for grading this semester).
  • Click on your name to load the code attributed to you (based on Git blame/log data) onto the code panel on the right.
  • If the code shown roughly matches the code you wrote, all is fine and there is nothing for you to do.

2. If the code does not match:

  • Here are the possible reasons for the code shown not to match the code you wrote:

    • the git username in some of your commits does not match your GitHub username (perhaps you missed our instructions to set your Git username to match GitHub username earlier in the project, or GitHub did not honor your Git username for some reason)
    • the actual authorship does not match the authorship determined by git blame/log e.g., another student touched your code after you wrote it, and Git log attributed the code to that student instead
  • In those cases,

    • Install RepoSense (see the Getting Started section of the RepoSense User Guide)
    • Use the two methods described in the RepoSense User Guide section Configuring a Repo to Provide Additional Data to RepoSense to provide additional data to the authorship analysis to make it more accurate.
    • If you add a config.json file to your repo (as specified by one of the two methods),
      • Please use the template json file given in the module website so that your display name matches the name we expect it to be.
      • If your commits have multiple author names, specify all of them e.g., "authorNames": ["theMyth", "theLegend", "theGary"]
      • Update the line config.json in the .gitignore file of your repo as /config.json so that it ignores the config.json produced by the app but not the _reposense/config.json.
    • If you add @@author annotations, please follow the guidelines below:

Adding @@author tags to indicate authorship

  • Mark your code with a //@@author {yourGithubUsername}. Note the double @.
    The //@@author tag indicates the beginning of the code you wrote. The code up to the next //@@author tag or the end of the file (whichever comes first) will be considered as written by that author. Here is a sample code file:

    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author sarahkhoo
    method 3 ...
    //@@author johndoe
    method 4 ...
    
  • If you don't know who wrote the code segment below yours, you may put an empty //@@author (i.e. no GitHub username) to indicate the end of the code segment you wrote. The author of code below yours can add the GitHub username to the empty tag later. Here is a sample code with an empty author tag:

    method 0 ...
    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author
    method 3 ...
    method 4 ...
    
  • The author tag syntax varies based on file type e.g. for java, css, fxml. Use the corresponding comment syntax for non-Java files.
    Here is an example code from an xml/fxml file.

    <!-- @@author sereneWong -->
    <textbox>
      <label>...</label>
      <input>...</input>
    </textbox>
    ...
    
  • Do not put the //@@author inside java header comments.
    👎

    /**
      * Returns true if ...
      * @@author johndoe
      */
    

    👍

    //@@author johndoe
    /**
      * Returns true if ...
      */
    

What to and what not to annotate

  • Annotate both functional and test code. There is no need to annotate documentation files.

  • Annotate only code blocks of significant size that can be reviewed on their own  e.g., a class, a sequence of methods, a method.
    Claiming credit for code blocks smaller than a method is discouraged but allowed. If you do, do it sparingly and only claim meaningful blocks of code such as a block of statements, a loop, or an if-else statement.

    • If an enhancement required you to do tiny changes in many places, there is no need to annotate all those tiny changes; you can describe those changes in the Project Portfolio page instead.
    • If a code block was touched by more than one person, either let the person who wrote most of it (e.g. more than 80%) take credit for the entire block, or leave it as 'unclaimed' (i.e., no author tags).
    • Related to the above point, if you claim a code block as your own, more than 80% of the code in that block should have been written by yourself. For example, no more than 20% of it can be code you reused from somewhere.
    • 💡 GitHub has a blame feature and a history feature that can help you determine who wrote a piece of code.
  • Do not try to boost the quantity of your contribution using unethical means such as duplicating the same code in multiple places. In particular, do not copy-paste test cases to create redundant tests. Even repetitive code blocks within test methods should be extracted out as utility methods to reduce code duplication. Individual members are responsible for making sure code attributed to them is correct. If you notice a team member claiming credit for code that he/she did not write or using other questionable tactics, you can email us (after the final submission) to let us know.

  • If you wrote a significant amount of code that was not used in the final product,

    • Create a folder called {project root}/unused
    • Move unused files (or copies of files containing unused code) to that folder
    • use //@@author {yourGithubUsername}-unused to mark unused code in those files (note the suffix unused) e.g.
    //@@author johndoe-unused
    method 1 ...
    method 2 ...
    

    Please put a comment in the code to explain why it was not used.

  • If you reused code from elsewhere, mark such code as //@@author {yourGithubUsername}-reused (note the suffix reused) e.g.

    //@@author johndoe-reused
    method 1 ...
    method 2 ...
    
  • You can use empty @@author tags to mark code as not yours when RepoSense attributes the code to you incorrectly.

    • Code generated by the IDE/framework should not be annotated as your own.

    • Code you modified in minor ways (e.g. adding a parameter) should not be claimed as yours, but you can mention these additional contributions in the Project Portfolio page if you want to claim credit for them.

  • After you are satisfied with the new results (i.e., results produced by running RepoSense locally), push the config.json file you added and/or the annotated code to your repo. We'll use that information the next time we run RepoSense (we run it at least once a week).
  • If you choose to annotate code, please annotate code chunks not smaller than a method. We do not grade code snippets smaller than a method.
  • If you encounter any problem when doing the above or if you have questions, please post in the forum.

We recommend you ensure your code is RepoSense-compatible by v1.3



Grade Breakdown

Relevant: [Admin Participation Marks ]

 

10 marks allocated for participation can be earned in the following ways (there are 30+ available marks to choose from):

  • Good peer ratings
    • Criteria for professional conduct (1 mark for each criterion, max 7)
    • Competency criteria (2 marks for each, max 6)
  • Quizzes
    • In-lecture quizzes (1 each, max 10 marks)
    • Post-lecture quizzes (0.5 each, max 5 marks)
  • Module admin tasks done on time and as instructed
    • Peer evaluations (1 mark each, max 3)
    • Pre-module survey (1 mark)
  • Enhanced AB1-AB3 (2 marks each, max 6 marks)

Relevant: [Admin Peer Evaluations → Criteria ]

 

Peer evaluation criteria: professional conduct

  • Professional Communication :
    • Communicates sufficiently and professionally. e.g. Does not use offensive language or excessive slang in project communications.
    • Responds to communication from team members in a timely manner (e.g. within 24 hours).
  • Punctuality: Does not cause others to waste time or slow down project progress by frequent tardiness.
  • Dependability: Promises what can be done, and delivers what was promised.
  • Effort: Puts in sufficient effort to, and tries their best to keep up with the module/project pace. Seeks help from others when necessary.
  • Quality: Does not deliver work products that seem to be below the student's competence level i.e. tries their best to make the work product as high quality as possible within their competency level.
  • Meticulousness:
    • Rarely overlooks submission requirements.
    • Rarely misses compulsory module activities such as pre-module survey.
  • Teamwork: How willing are you to act as part of a team, contribute to team-level tasks, adhere to team decisions, etc.

Peer evaluation criteria: competency

  • Technical Competency: Able to gain competency in all the required tools and techniques.
  • Mentoring skills: Helps others when possible. Able to mentor others well.
  • Communication skills: Able to communicate (written and spoken) well. Takes initiative in discussions.

Relevant: [Admin Exams ]

 

There is no midterm.

The final exam has two parts:

  • Part 1: MCQ questions (1 hour, 20 marks)
  • Part 2: Essay questions (1 hour, 20 marks)

Both papers will be given to you at the start but you need to answer Part 1 first (i.e. the MCQ paper). It will be collected 1 hour after the exam start time (even if you arrived late for the exam). You are free to start Part 2 early if you finish Part 1 early.

Final Exam: Part 1 (MCQ)

Each MCQ question gives you a statement to evaluate.

An example statement

Testing is a Q&A activity

Unless stated otherwise, the meaning of answer options are
A: Agree. If the question has multiple statements, agree with all of them.
B: Disagree. If the question has multiple statements, disagree with at least one of them
C, D, E: Not used

Number of questions: 100

Note that you have slightly more than ½ minute for each question, which means you need to go through the questions fairly quickly.

Given the fast pace required by the paper, to be fair to all students, you will not be allowed to clarify doubts about questions (in Part 1) by talking to invigilators.

  • If a question is not clear, you can circle the question number in the question paper and write your doubt in the question paper, near that question.
  • If your doubt is justified (e.g. there is a typo in the question) or if many students found the question to be unclear, the examiner may decide to omit that question from grading.

Questions in Part 1 are confidential. You are not allowed to reveal Part 1 content to anyone after the exam. All pages of the assessment paper are to be returned at the end of the exam.

You will be given OCR forms (i.e., bubble sheets) to indicate your answers for Part 1. As each OCR form can accommodate only 50 answers, you will be given 2 OCR forms. Indicate your student number in both OCR forms.

To save space, we use the following notation in MCQ questions. [x | y | z] means ‘x and z, but not y’

SE is [boring | useful | fun] means SE is not boring AND SE is useful AND SE is fun.

Consider the following statement:

  • IDEs can help with [writing | debugging | testing] code.

The correct response for it is Disagree because IDEs can help with all three of the given options, not just writing and testing.

Some questions will use underlines or highlighting to draw your attention to a specific part of the question. That is because those parts are highly relevant to the answer and we don’t want you to miss the relevance of that part.

Consider the statement below:

Technique ABC can be used to generate more test cases.

The word can is underlined because the decision you need to make is whether the ABC can or cannot be used to generate more test cases; the decision is not whether ABC can be used to generate more or better test cases.

Markers such as the one given below appear at the left margin of the paper to indicate where a question corresponds to a new column in the OCR form, e.g. questions 11, 21, 31, etc. (a column has 10 questions). Such markers can help you detect if you missed a question in the previous 10 questions. You can safely ignore those markers if you are not interested in making use of that additional hint.


Some questions have tags e.g., the question below has a tag JAVA. These tags provide additional context about the question. In the example below, the tag indicates that the code given in the question is Java code.


The exam paper is open-book: you may bring any printed or written materials to the exam in hard copy format. However, given the fast pace required by Part 1, you will not have time left to refer notes during that part of the exam.

💡 Mark the OCR form as you go, rather than planning to transfer your answers to the OCR form near the end.  Reason: Given there are 100 questions, it will be hard to estimate how much time you need to mass-transfer all answers to OCR forms.

💡 Write the answer in the exam paper as well when marking it in the OCR form.  Reason: It will reduce the chance of missing a question. Furthermore, in case you missed a question, it will help you correct the OCR form quickly.

💡 We have tried to avoid deliberately misleading/tricky questions. If a question seems to take a very long time to figure out, you are probably over-thinking it.

You will be given a practice exam paper to familiarize yourself with this slightly unusual exam format.

Final Exam: Part 2 (Essay)

Unlike in part 1, you can ask invigilators for clarifications if you found a question to be unclear in part 2.

Yes, you may use pencils when answering part 2.

Relevant: [Admin Project Assessment ]

 

Note that project grading is not competitive (not bell curved). CS2103T projects will be assessed separately from CS2103 projects. This is to account for the perceived difference in workload. Given below is the marking scheme.

Total: 50 marks ( 40 individual marks + 10 team marks)

Evaluates: How well do your features fit together to form a cohesive product (not how many features or how big the features are)?

Based on: user guide and the product demo. The quality of the demo will be factored in as well.

💡 Features that fit well with the other features will earn more marks.

Evaluates:

A. Code quality/quantity:

How good your implementation is, in terms of the quality and the quantity of the code you have written yourself.

Based on: an inspection of the parts of the code you claim as written by you.

  • Ensure your code has at least some evidence of these (see here for more info; a small illustrative sketch is given after this list)

    • logging
    • exceptions
    • assertions
    • defensive coding
  • Ensure there are no coding standard violations  e.g. all boolean variables/methods should sound like booleans. Checkstyle can prevent only some coding standard violations; others need to be checked manually.

  • Ensure SLAP is applied at a reasonable level. Long methods or deeply-nested code are symptoms of low SLAP and may be counted against your code quality.

  • Reduce code duplication  i.e. if there are multiple blocks of code that vary only in minor ways, try to extract the similarities into one place; this applies especially to test code.

  • In addition, try to apply as many of the code quality guidelines covered in the module as you can.
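As a rough illustration of what such evidence might look like, here is a small sketch; the class, method, and values below are made up for illustration and are not taken from AB4:

import java.util.logging.Logger;

public class DiscountCalculator {
    private static final Logger logger = Logger.getLogger(DiscountCalculator.class.getName());

    /** Returns the price after applying the given discount rate. */
    public double applyDiscount(double price, double rate) {
        // Defensive coding + exceptions: reject clearly invalid inputs instead of failing silently.
        if (price < 0) {
            throw new IllegalArgumentException("price should not be negative: " + price);
        }

        // Assertion: document and check an internal assumption about the rate.
        assert rate >= 0 && rate <= 1 : "rate should be a fraction between 0 and 1";

        // Logging: record what is being done, to help with debugging later.
        logger.fine("Applying discount rate " + rate + " to price " + price);

        return price * (1 - rate);
    }
}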

 

Code Quality

Introduction

Basic

Can explain the importance of code quality

Always code as if the person who ends up maintaining your code will be a violent psychopath who knows where you live. -- Martin Golding

Production code needs to be of high quality. Given how the world is becoming increasingly dependent on software, poor quality code is something we cannot afford to tolerate.

Code being used in an actual product with actual users

Guideline: Maximise Readability

Introduction

Can explain the importance of readability

Programs should be written and polished until they acquire publication quality. --Niklaus Wirth

Among various dimensions of code quality, such as run-time efficiency, security, and robustness, one of the most important is understandability. This is because in any non-trivial software project, code needs to be read, understood, and modified by other developers later on. Even if we do not intend to pass the code to someone else, code quality is still important because we all become 'strangers' to our own code someday.

The two code samples given below achieve the same functionality, but one is easier to read.

     

Bad

int subsidy() {
    int subsidy;
    if (!age) {
        if (!sub) {
            if (!notFullTime) {
                subsidy = 500;
            } else {
                subsidy = 250;
            }
        } else {
            subsidy = 250;
        }
    } else {
        subsidy = -1;
    }
    return subsidy;
}

  

Good

int calculateSubsidy() {
    int subsidy;
    if (isSenior) {
        subsidy = REJECT_SENIOR;
    } else if (isAlreadySubsidised) {
        subsidy = SUBSIDISED_SUBSIDY;
    } else if (isPartTime) {
        subsidy = FULLTIME_SUBSIDY * RATIO;
    } else {
        subsidy = FULLTIME_SUBSIDY;
    }
    return subsidy;
}

     

Bad

def calculate_subs():
    if not age:
        if not sub:
            if not not_fulltime:
                subsidy = 500
            else:
                subsidy = 250
        else:
            subsidy = 250
    else:
        subsidy = -1
    return subsidy

  

Good

def calculate_subsidy():
    if is_senior:
        return REJECT_SENIOR
    elif is_already_subsidised:
        return SUBSIDISED_SUBSIDY
    elif is_parttime:
        return FULLTIME_SUBSIDY * RATIO
    else:
        return FULLTIME_SUBSIDY

Basic

Avoid Long Methods

Can improve code quality using technique: avoid long methods

Be wary when a method is longer than the computer screen, and take corrective action when it goes beyond 30 LOC (lines of code). The bigger the haystack, the harder it is to find a needle.

Avoid Deep Nesting

Can improve code quality using technique: avoid deep nesting

If you need more than 3 levels of indentation, you're screwed anyway, and should fix your program. --Linux 1.3.53 CodingStyle

In particular, avoid arrowhead style code.

Example:
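The example on the original page is given as an image; the sketch below (with made-up names such as Order, isPaid(), and ship()) illustrates the same idea:

Bad

void processOrder(Order order) {
    if (order != null) {
        if (order.isPaid()) {
            if (!order.isShipped()) {
                ship(order);  // the actual work is buried three levels deep
            }
        }
    }
}

Good

void processOrder(Order order) {
    // Combining the conditions (or using guard clauses) keeps the nesting shallow.
    if (order == null || !order.isPaid() || order.isShipped()) {
        return;
    }
    ship(order);
}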

Avoid Complicated Expressions

Can improve code quality using technique: avoid complicated expressions

Avoid complicated expressions, especially those having many negations and nested parentheses. If you must evaluate complicated expressions, have it done in steps (i.e. calculate some intermediate values first and use them to calculate the final value).

Example:

Bad

return ((length < MAX_LENGTH) || (previousSize != length)) && (typeCode == URGENT);

Good


boolean isWithinSizeLimit = length < MAX_LENGTH;
boolean isSameSize = previousSize != length;
boolean isValidCode = isWithinSizeLimit || isSameSize;

boolean isUrgent = typeCode == URGENT;

return isValidCode && isUrgent;

Example:

Bad

return ((length < MAX_LENGTH) or (previous_size != length)) and (type_code == URGENT)

Good

is_within_size_limit = length < MAX_LENGTH
is_same_size = previous_size != length
is_valid_code = is_within_size_limit or is_same_size

is_urgent = type_code == URGENT

return is_valid_code and is_urgent

The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. -- Edsger Dijkstra

Avoid Magic Numbers

Can improve code quality using technique: avoid magic numbers

When the code has a number that does not explain the meaning of the number, we call that a magic number (as in “the number appears as if by magic”). Using a named constant makes the code easier to understand because the name tells us more about the meaning of the number.

Example:

     

Bad

return 3.14236;
...
return 9;

  

Good

static final double PI = 3.14236;
static final int MAX_SIZE = 10;
...
return PI;
...
return MAX_SIZE-1;

Note: Python does not have a way to make a variable a constant. However, you can use a normal variable with an ALL_CAPS name to simulate a constant.

     

Bad

return 3.14236
...
return 9

  

Good

PI = 3.14236
MAX_SIZE = 10
...
return PI
...
return MAX_SIZE-1

Similarly, we can have ‘magic’ values of other data types.

Bad

"Error 1432"  // A magic string!

Make the Code Obvious

Can improve code quality using technique: make the code obvious

Make the code as explicit as possible, even if the language syntax allows things to be left implicit. Here are some examples:

  • [Java] Use explicit type conversion instead of implicit type conversion.
  • [Java, Python] Use parentheses/braces to show grouping even when they can be skipped.
  • [Java, Python] Use enumerations when a certain variable can take only a small number of finite values. For example, instead of declaring the variable 'state' as an integer and using values 0,1,2 to denote the states 'starting', 'enabled', and 'disabled' respectively, declare 'state' as type SystemState and define an enumeration SystemState that has values 'STARTING', 'ENABLED', and 'DISABLED'.
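For instance, the 'state' variable described in the last point could be written as follows (a minimal Java sketch):

enum SystemState { STARTING, ENABLED, DISABLED }

SystemState state = SystemState.STARTING;  // the meaning is obvious, unlike state = 0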

Intermediate

Structure Code Logically

Can improve code quality using technique: structure code logically

Lay out the code so that it adheres to the logical structure. The code should read like a story. Just like we use section breaks, chapters and paragraphs to organize a story, use classes, methods, indentation and line spacing in your code to group related segments of the code. For example, you can use blank lines to group related statements together. Sometimes, the correctness of your code does not depend on the order in which you perform certain intermediary steps. Nevertheless, this order may affect the clarity of the story you are trying to tell. Choose the order that makes the story most readable.
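For example, blank lines (and occasional comments) can mark out the 'paragraphs' of a method; a sketch with made-up method names:

// Gather inputs
String name = readName();
String address = readAddress();

// Compute the bill
double subtotal = computeSubtotal();
double total = subtotal + computeTax(subtotal);

// Show the result
display(name, address, total);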

Do Not 'Trip Up' Reader

Can improve code quality using technique: do not 'trip up' reader

Avoid things that would make the reader go ‘huh?’, such as,

  • unused parameters in the method signature
  • similar things look different
  • different things that look similar
  • multiple statements in the same line
  • data flow anomalies such as pre-assigning values to variables and then modifying them without ever using the pre-assigned values

Practice KISSing

Can improve code quality using technique: practice kissing

As the old adage goes, “keep it simple, stupid” (KISS). Do not try to write ‘clever’ code. For example, do not dismiss the brute-force yet simple solution in favor of a complicated one because of some ‘supposed benefits’ such as 'better reusability' unless you have a strong justification.

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. --Brian W. Kernighan

Programs must be written for people to read, and only incidentally for machines to execute. --Abelson and Sussman

Avoid Premature Optimizations

Can improve code quality using technique: avoid premature optimizations

Optimizing code prematurely has several drawbacks:

  • We may not know which parts are the real performance bottlenecks. This is especially the case when the code undergoes transformations (e.g. compiling, minifying, transpiling, etc.) before it becomes an executable. Ideally, you should use a profiler tool to identify the actual bottlenecks of the code first, and optimize only those parts.
  • Optimizing can complicate the code, affecting correctness and understandability
  • Hand-optimized code can be harder for the compiler to optimize (the simpler the code, the easier for the compiler to optimize it). In many cases a compiler can do a better job of optimizing the runtime code if you don't get in the way by trying to hand-optimize the source code.

A popular saying in the industry is make it work, make it right, make it fast, which means in most cases getting the code to perform correctly should take priority over optimizing it. If the code doesn't work correctly, it has no value no matter how fast/efficient it is.

Premature optimization is the root of all evil in programming. --Donald Knuth

Note that there are cases where optimizing takes priority over other things, e.g. when writing code for resource-constrained environments. This guideline is simply a caution that you should optimize only when it is really needed.

SLAP Hard

Can improve code quality using technique: SLAP hard

Avoid varying the level of abstraction within a code fragment. Note: The Productive Programmer (by Neal Ford) calls this the SLAP principle i.e. Single Level of Abstraction Per method.

Example:

Bad

readData();
salary = basic*rise+1000;
tax = (taxable?salary*0.07:0);
displayResult();

Good

readData();
processData();
displayResult();
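Here, the intermediate-level arithmetic has been pushed down into processData(), which might look like the following (a sketch reusing the variable names from the 'Bad' version above):

void processData() {
    salary = basic * rise + 1000;
    tax = taxable ? salary * 0.07 : 0;
}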
 

Design → Design Fundamentals → Abstraction →

What

Abstraction is a technique for dealing with complexity. It works by establishing a level of complexity we are interested in, and suppressing the more complex details below that level.

The guiding principle of abstraction is that only details that are relevant to the current perspective or the task at hand needs to be considered. As most programs are written to solve complex problems involving large amounts of intricate details, it is impossible to deal with all these details at the same time. That is where abstraction can help.

Ignoring lower level data items and thinking in terms of bigger entities is called data abstraction.

Within a certain software component, we might deal with a user data type while ignoring the details contained in the user data item, such as name and date of birth. These details have been ‘abstracted away’ as they do not affect the task of that software component.

Control abstraction abstracts away details of the actual control flow to focus on tasks at a simplified level.

print(“Hello”) is an abstraction of the actual output mechanism within the computer.

Abstraction can be applied repeatedly to obtain progressively higher levels of abstractions.

An example of different levels of data abstraction: a File is a data item that is at a higher level than an array and an array is at a higher level than a bit.

An example of different levels of control abstraction: execute(Game) is at a higher level than print(Char), which is at a higher level than an Assembly language instruction MOV.

Abstraction is a general concept that is not limited to just data or control abstractions.

Some more general examples of abstraction:

  • An OOP class is an abstraction over related data and behaviors.
  • An architecture is a higher-level abstraction of the design of a software.
  • Models (e.g., UML models) are abstractions of some aspect of reality.

Advanced

Make the Happy Path Prominent

Can improve code quality using technique: make the happy path prominent

The happy path (i.e. the execution path taken when everything goes well) should be clear and prominent in your code. Restructure the code to make the happy path unindented as much as possible. It is the ‘unusual’ cases that should be indented. Someone reading the code should not get distracted by alternative paths taken when error conditions happen. One technique that could help in this regard is the use of guard clauses.

Example:

Bad

if (!isUnusualCase) {  //detecting an unusual condition
    if (!isErrorCase) {
        start();    //main path
        process();
        cleanup();
        exit();
    } else {
        handleError();
    }
} else {
    handleUnusualCase(); //handling that unusual condition
}

In the code above,

  • Unusual condition detection is separated from the handling of those conditions.
  • The main path is deeply nested.

Good

if (isUnusualCase) { //Guard Clause
    handleUnusualCase();
    return;
}

if (isErrorCase) { //Guard Clause
    handleError();
    return;
}

start();
process();
cleanup();
exit();

In contrast, the above code

  • deals with unusual conditions as soon as they are detected so that the reader doesn't have to remember them for long.
  • keeps the main path un-indented.

Guideline: Follow a Standard

Introduction

Can explain the need for following a standard

One essential way to improve code quality is to follow a consistent style. That is why software engineers follow a strict coding standard (aka style guide).

The aim of a coding standard is to make the entire code base look like it was written by one person. A coding standard is usually specific to a programming language and specifies guidelines such as the location of opening and closing braces, indentation styles and naming styles (e.g. whether to use Hungarian style, Pascal casing, Camel casing, etc.). It is important that the whole team/company uses the same coding standard and that the standard is not generally inconsistent with typical industry practices. If a company's coding standard is very different from what is typically used in the industry, new recruits will take longer to get used to the company's coding style.

💡 IDEs can help to enforce some parts of a coding standard e.g. indentation rules.

What is the recommended approach regarding coding standards?

c

What is the aim of using a coding standard? How does it help?

Basic

Can follow simple mechanical style rules

Learn basic guidelines of the Java coding standard (by OSS-Generic)

Consider the code given below:

import java.util.*;

public class Task {
    public static final String descriptionPrefix = "description: ";
    private String description;
    private boolean important;
    List<String> pastDescription = new ArrayList<>(); // a list of past descriptions

    public Task(String d) {
      this.description = d;
      if (!d.isEmpty())
          this.important = true;
    }

    public String getAsXML() { return "<task>"+description+"</task>"; }

    /**
     * Print the description as a string.
     */
    public void printingDescription(){ System.out.println(this); }

    @Override
    public String toString() { return descriptionPrefix + description; }
}

In what ways does the code violate the basic guidelines (i.e., those marked with one ⭐️) of the OSS-Generic Java Coding Standard given here?

Here are three:

  • descriptionPrefix is a constant and should be named DESCRIPTION_PREFIX
  • method name printingDescription() should be named as printDescription()
  • boolean variable important should be named to sound boolean e.g., isImportant

There are many more.

Intermediate

Can follow intermediate style rules

Go through the provided Java coding standard and learn the intermediate style rules.

According to the given Java coding standard, which one of these is not a good name?

b

Explanation: checkWeight is an action. Naming variables as actions makes the code harder to follow. isWeightValid may be a better name.

Repeat the exercise in the panel below but also find violations of intermediate level guidelines.

Consider the code given below:

import java.util.*;

public class Task {
    public static final String descriptionPrefix = "description: ";
    private String description;
    private boolean important;
    List<String> pastDescription = new ArrayList<>(); // a list of past descriptions

    public Task(String d) {
      this.description = d;
      if (!d.isEmpty())
          this.important = true;
    }

    public String getAsXML() { return "<task>"+description+"</task>"; }

    /**
     * Print the description as a string.
     */
    public void printingDescription(){ System.out.println(this); }

    @Override
    public String toString() { return descriptionPrefix + description; }
}

In what ways does the code violate the basic guidelines (i.e., those marked with one ⭐️) of the OSS-Generic Java Coding Standard given here?

Here are three:

  • descriptionPrefix is a constant and should be named DESCRIPTION_PREFIX
  • method name printingDescription() should be named as printDescription()
  • boolean variable important should be named to sound boolean e.g., isImportant

There are many more.

Here's one you are more likely to miss:

  • The header comment Print the description as a string. should be Prints the description as a string.

There are more.

Guideline: Name Well

Introduction

Can explain the need for good names in code

Proper naming improves the readability of code. It also reduces bugs caused by ambiguities regarding the intent of a variable or a method.

There are only two hard things in Computer Science: cache invalidation and naming things. -- Phil Karlton

Basic

Use Nouns for Things and Verbs for Actions

Can improve code quality using technique: use nouns for things and verbs for actions

Every system is built from a domain-specific language designed by the programmers to describe that system. Functions are the verbs of that language, and classes are the nouns. ― Robert C. Martin, Clean Code: A Handbook of Agile Software Craftsmanship

Use nouns for classes/variables and verbs for methods/functions.

Examples:

Name for a | Bad | Good
Class | CheckLimit | LimitChecker
Method | result() | calculate()

Distinguish clearly between single-valued and multivalued variables.

Examples:

Good

Person student;
ArrayList<Person> students;

Good

student = Person('Jim')
students = [Person('Jim'), Person('Alice')]

Use Standard Words

Can improve code quality using technique: use standard words

Use correct spelling in names. Avoid 'texting-style' spelling. Avoid foreign language words, slang, and names that are only meaningful within specific contexts/times e.g. terms from private jokes, a TV show currently popular in your country

Intermediate

Use Name to Explain

Can improve code quality using technique: use name to explain

A name is not just for differentiation; it should explain the named entity to the reader accurately and at a sufficient level of detail.

Examples:

Bad | Good
processInput() (what 'process'?) | removeWhiteSpaceFromInput()
flag | isValidInput
temp |

If the name has multiple words, they should be in a sensible order.

Examples:

Bad | Good
bySizeOrder() | orderBySize()

Imagine going to the doctor's and saying "My eye1 is swollen"! Don’t use numbers or case to distinguish names.

Examples:

Bad | Bad | Good
value1, value2 | value, Value | originalValue, finalValue

Not Too Long, Not Too Short

Can improve code quality using technique: not too long, not too short

While it is preferable not to have lengthy names, names that are 'too short' are even worse. If you must abbreviate or use acronyms, do it consistently. Explain their full meaning at an obvious location.

Avoid Misleading Names

Can improve code quality using technique: avoid misleading names

Related things should be named similarly, while unrelated things should NOT.

Example: Consider these variables

  • colorBlack : hex value for color black
  • colorWhite : hex value for color white
  • colorBlue : number of times blue is used
  • hexForRed : hex value for color red

This is misleading because colorBlue is named similarly to colorWhite and colorBlack but has a different purpose, while hexForRed is named differently but has a purpose very similar to that of the first two variables. The following is better:

  • hexForBlack, hexForWhite, hexForRed
  • blueColorCount

Avoid misleading or ambiguous names (e.g. those with multiple meanings), similar-sounding names, hard-to-pronounce ones (e.g. avoid ambiguities like "is that a lowercase L, capital I or the number 1?", or "is that the number 0 or the letter O?"), and almost-identical names.

Examples:

Bad | Good | Reason
phase0 | phaseZero | Is that zero or the letter O?
rwrLgtDirn | rowerLegitDirection | Hard to pronounce
right, left, wrong | rightDirection, leftDirection, wrongResponse | Is right 'correct' or the opposite of 'left'?
redBooks, readBooks | redColorBooks, booksRead | red and read (past tense) sound the same
FiletMignon | egg | If the requirement is just a name of a food, egg is a much easier to type/say choice than FiletMignon

Guideline: Avoid Unsafe Shortcuts

Introduction

Can explain the need for avoiding error-prone shortcuts

It is safer to use language constructs in the way they are meant to be used, even if the language allows shortcuts. Some such coding practices are common sources of bugs. Know them and avoid them.

Basic

Use the Default Branch

Can improve code quality using technique: use the default branch

Always include a default branch in case statements.

Furthermore, use it for the intended default action and not just to execute the last option. If there is no default action, you can use the 'default' branch to detect errors (i.e. if execution reached the default branch, throw an exception). This also applies to the final else of an if-else construct. That is, the final else should mean 'everything else', and not the final option. Do not use else when an if condition can be explicitly specified, unless there is absolutely no other possibility.

Bad

if (red) print "red";
else print "blue";

Good

if (red) print "red";
else if (blue) print "blue";
else error("incorrect input");
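The same applies to switch statements; a sketch (assuming a color variable and a print method, which are not part of the example above) where the default branch detects unexpected values instead of merely handling the last option:

switch (color) {
case RED:
    print("red");
    break;
case BLUE:
    print("blue");
    break;
default:
    // Reaching here means an unexpected value slipped through.
    throw new IllegalStateException("unknown color: " + color);
}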

Don't Recycle Variables or Parameters

Can improve code quality using technique: don't recycle variables or parameters

  • Use one variable for one purpose. Do not reuse a variable for a different purpose other than its intended one, just because the data type is the same.
  • Do not reuse formal parameters as local variables inside the method.

Bad

double computeRectangleArea(double length, double width) {
    length = length * width;
    return length;
}

Good

double computeRectangleArea(double length, double width) {
    double area;
    area = length * width;
    return area;
}

Avoid Empty Catch Blocks

Can improve code quality using technique: avoid empty catch blocks

Never write an empty catch statement. At least give a comment to explain why the catch block is left empty.
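For example (a sketch; audioPlayer is a made-up object), prefer the second form if you genuinely intend to ignore the exception:

Bad

try {
    audioPlayer.stop();
} catch (Exception e) {
    // empty: any problem is silently swallowed
}

Good

try {
    audioPlayer.stop();
} catch (Exception e) {
    // Ignored deliberately: stopping an already-stopped player is harmless here.
}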

Delete Dead Code

Can improve code quality using technique: delete dead code

We all feel reluctant to delete code we have painstakingly written, even if we have no use for that code any more ("I spent a lot of time writing that code; what if we need it again?"). Consider all code as baggage you have to carry; get rid of unused code the moment it becomes redundant. If you need that code again, simply recover it from the revision control tool you are using. Deleting code you wrote previously is a sign that you are improving.

Intermediate

Minimise Scope of Variables

Can improve code quality using technique: minimise scope of variables

Minimize global variables. Global variables may be the most convenient way to pass information around, but they do create implicit links between code segments that use the global variable. Avoid them as much as possible.

Define variables in the least possible scope. For example, if the variable is used only within the if block of the conditional statement, it should be declared inside that if block.
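A sketch (with made-up names) of declaring a variable in the smallest scope that needs it:

Bad

int itemCount;  // declared well before it is needed, in a wider scope than necessary
// ... many lines later ...
if (isValid) {
    itemCount = items.size();
    process(itemCount);
}

Good

if (isValid) {
    int itemCount = items.size();  // declared where first used, visible only inside the if block
    process(itemCount);
}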

The most powerful technique for minimizing the scope of a local variable is to declare it where it is first used. -- Effective Java, by Joshua Bloch

Minimise Code Duplication

Can improve code quality using technique: minimise code duplication

Code duplication, especially when you copy-paste-modify code, often indicates a poor quality implementation. While it may not be possible to have zero duplication, always think twice before duplicating code; most often there is a better alternative.

This guideline is closely related to the DRY Principle.

Supplementary → Principles →

DRY Principle

DRY (Don't Repeat Yourself) Principle: Every piece of knowledge must have a single, unambiguous, authoritative representation within a system. -- The Pragmatic Programmer, by Andy Hunt and Dave Thomas

This principle guards against duplication of information.

A functionality implemented twice is a violation of the DRY principle, even if the two implementations are different.

The value of a system-wide timeout being defined in multiple places is a violation of DRY.
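As a minimal sketch (the surrounding calls are made up for illustration), the remedy is to keep one authoritative definition and refer to it everywhere:

// One authoritative definition of the timeout value ...
public static final int SYSTEM_TIMEOUT_SECONDS = 30;

// ... referenced wherever it is needed, instead of repeating the literal 30.
connection.setTimeout(SYSTEM_TIMEOUT_SECONDS);
scheduler.setMaxWaitSeconds(SYSTEM_TIMEOUT_SECONDS);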

Guideline: Comment Minimally, but Sufficiently

Introduction

Can explain the need for commenting minimally but sufficiently

Good code is its own best documentation. As you’re about to add a comment, ask yourself, ‘How can I improve the code so that this comment isn’t needed?’ Improve the code and then document it to make it even clearer. --Steve McConnell, Author of Code Complete

Some think commenting heavily increases the 'code quality'. This is not so. Avoid writing comments to explain bad code. Improve the code to make it self-explanatory.

Basic

Do Not Repeat the Obvious

Can improve code quality using technique: do not repeat the obvious

If the code is self-explanatory, refrain from repeating the description in a comment just for the sake of 'good documentation'.

Bad

// increment x
x++;

//trim the input
trimInput();

Write to the Reader

Can improve code quality using technique: write to the reader

Do not write comments as if they are private notes to self. Instead, write them well enough to be understood by another programmer. One type of comments that is almost always useful is the header comment that you write for a class or an operation to explain its purpose.

Examples:

Bad Reason: this comment will only make sense to the person who wrote it

// a quick trim function used to fix bug I detected overnight
void trimInput(){
    ....
}

Good

/** Trims the input of leading and trailing spaces */
void trimInput(){
    ....
}

Bad Reason: this comment will only make sense to the person who wrote it

# a quick trim function used to fix bug I detected overnight
def trim_input():
    ...

Good

def trim_input():
    """Trim the input of leading and trailing spaces"""
    ...

Intermediate

Explain WHAT and WHY, not HOW

Can improve code quality using technique: explain what and why, not how

Comments should explain the what and why aspects of the code, rather than the how aspect.

What : The specification of what the code is supposed to do. The reader can compare such comments to the implementation to verify whether the implementation is correct.

Example: This method is possibly buggy because the implementation does not seem to match the comment. In this case the comment could help the reader to detect the bug.

/** Removes all spaces from the {@code input} */
void compact(String input){
    input.trim();
}

Why : The rationale for the current implementation.

Example: Without this comment, the reader will not know the reason for calling this method.

// Remove spaces to comply with IE23.5 formatting rules
compact(input);

How : The explanation for how the code works. This should already be apparent from the code, if the code is self-explanatory. Adding comments to explain the same thing is redundant.

Example:

Bad Reason: Comment explains how the code works.

// return true if both left end and right end are correct or the size has not incremented
return (left && right) || (input.size() == size);

Good Reason: Code refactored to be self-explanatory. Comment no longer needed.


boolean isSameSize = (input.size() == size) ;
return (isLeftEndCorrect && isRightEndCorrect) || isSameSize;


B. Depth and completeness of your feature

Evaluates: How good is your Quality Assurance?

Based on: (1) your test code, (2) our own manual testing, (3) your performance in the v1.4 Practical Exam, and (4) bugs found during the PE.

Relevant: [Admin Deliverables → Practical Exam ]

 

Objectives:

  • Evaluate your manual testing skills, product evaluation skills, effort estimation skills
  • Peer-evaluate your product design, implementation effort, documentation quality

When, where: Week 13 lecture

Grading:

  • Your performance in the practical exam will be considered for your final grade (under the QA category and the Implementation category, about 10 marks in total).
  • You will be graded based on your effectiveness as a tester (e.g., the percentage of the bugs you found, the nature of the bugs you found) and how far off your evaluation/estimates are from the evaluator consensus. Explanation: we understand that you have limited expertise in this area; hence, we penalize only if your inputs don't seem to be based on a sincere effort to test/evaluate.
  • The bugs found in your product by others will affect your v1.4 marks. You will be given a chance to reject false-positive bug reports.

Preparation:

  • Ensure that you can access the relevant issue tracker given below:
    -- for PE Dry Run (at v1.3): nus-cs2103-AY1819S1/pe-dry-run
    -- for PE (at v1.4): nus-cs2103-AY1819S1/pe (will open only near the actual PE)

  • Ensure you have access to a computer that is able to run module projects  e.g. has the right Java version.

  • Have a good screen grab tool with annotation features so that you can quickly take a screenshot of a bug, annotate it, and post in the issue tracker.

    • 💡 You can use Ctrl+V to paste a picture from the clipboard into a text box in GitHub issue tracker.
  • Charge your computer before coming to the PE session. The testing venue may not have enough charging points.

During:

  1. Take note of the team allocated to you for testing. It will be given to you by the teaching team (distributed via IVLE gradebook).
  2. Download from IVLE all files submitted by the team (i.e. jar file, User Guide, Developer Guide, and Project Portfolio Pages) into an empty folder.
  3. [~40 minutes] Test the product and report bugs as described below:
Testing instructions for PE and PE Dry Run
  • What to test:

    • PE Dry Run (at v1.3):
      • Test the product based on the User Guide (the UG is most likely accessible using the help command).
      • Do system testing first i.e., does the product work as specified by the documentation?. If there is time left, you can do acceptance testing as well i.e., does the product solve the problem it claims to solve?.
    • PE (at v1.4):
      • Test based on the Developer Guide (Appendix named Instructions for Manual Testing) and the User Guide. The testing instructions in the Developer Guide can provide you some guidance but if you follow those instructions strictly, you are unlikely to find many bugs. You can deviate from the instructions to probe areas that are more likely to have bugs.
      • Do system testing only i.e., verify actual behavior against documented behavior. Do not do acceptance testing.
  • What not to test:

    • Omit features that are driven by GUI inputs (e.g. buttons, menus, etc.) Reason: Only CLI-driven features can earn credit, as per the given project constraints. Some features might have both a GUI-driven and a CLI-driven way to invoke them, in which case test only the CLI-driven way of invoking them.
    • Omit features that existed in AB4.
  • These are considered bugs:

    • Behavior differs from the User Guide
    • A legitimate user behavior is not handled e.g. incorrect commands, extra parameters
    • Behavior is not specified and differs from normal expectations e.g. error message does not match the error
    • Problems in the User Guide e.g., missing/incorrect info
  • Where to report bugs: Post bugs in the following issue trackers (not in the team's repo):

  • Bug report format:

    • Post bugs as you find them (i.e., do not wait to post all bugs at the end) because the issue tracker will close exactly at the end of the allocated time.
    • Do not use team ID in bug reports. Reason: to prevent others copying your bug reports
    • Each bug should be a separate issue.
    • Write good quality bug reports; poor quality or incorrect bug reports will not earn credit.
    • Use a descriptive title.
    • Give a good description of the bug with steps to reproduce and screenshots.
    • Assign a severity to the bug report. Bug reports without a severity label are considered severity.Low (lower severity bugs earn lower credit):

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users, i.e., it makes the product almost unusable for most users.
  • About posting suggestions:

    • PE Dry Run (at v1.3): You can also post suggestions on how to improve the product. 💡 Be diplomatic when reporting bugs or suggesting improvements. For example, instead of criticising the current behavior, simply suggest alternatives to consider.
    • PE (at v1.4): Do not post suggestions.
  • If the product doesn't work at all: If the product fails catastrophically e.g., cannot even launch, you can test the fallback team allocated to you. But in this case you must inform us immediately after the session so that we can send your bug reports to the correct team.

  4. [~50 minutes] Evaluate the following aspects. Note down your evaluation in a hard copy (as a backup). Submit via TEAMMATES.

    • A. Cohesiveness of product features []: Do the features fit together and match the stated target user and the value proposition?

      • unable to judge: You are unable to judge this aspect for some reason.
      • low: One of these
        • target user is too general  i.e. wider than AB4
        • target user and/or value proposition not clear from the user guide
        • features don't seem to fit together for the most part
      • medium: Some features fit together but some don't.
      • high: All features fit together but the features are not very high value to the target user.
      • excellent: The target user is clearly defined (not too general) and almost all new features are of high-value to the target user. i.e. the product is very attractive to the target user.
    • B. Quality of user docs []: Evaluate based on the parts of the user guide written by the person, as reproduced in the project portfolio. Evaluate from an end-user perspective.

      • unable to judge: Less than 1 page worth of UG content written by the student.
      • low: Hard to understand, often inaccurate or missing important information.
      • medium: Needs some effort to understand; some information is missing.
      • high: Mostly easy to follow. Only a few areas need improvements.
      • excellent: Easy to follow and accurate. Just enough information, visuals, examples etc. (not too much either). Understandable to the target end user.
    • C. Quality of developer docs []: Evaluate based on the developer docs cited/reproduced in the respective project portfolio page. Evaluate from the perspective of a new developer trying to understand how the features are implemented.

      • unable to judge: One of these
        • less than 0.5 pages worth of content.
        • other problems in the document  e.g. looks like included wrong content.
      • low: One of these
        • Very small amount of content (i.e., 0.5 - 1 page).
        • Hardly any use to the reader (i.e., content doesn't make much sense or redundant).
        • Uses ad-hoc diagrams where UML diagrams could have been used instead.
        • Multiple notation errors in UML diagrams.
      • medium: Some diagrams, some descriptions, but does not help the reader that much  e.g. overly complicated diagrams.
      • high: Enough diagrams (at least two kinds of UML diagrams used) and enough descriptions (about 2 pages worth) but explanations are not always easy to follow.
      • excellent: Easy to follow. Just enough information (not too much). Minimum repetition of content/diagrams. Good use of diagrams to complement text descriptions. Easy to understand diagrams with just enough details rather than very complicated diagrams that are hard to understand.
    • D. Depth of feature []: Evaluate the feature done by the student for difficulty, depth, and completeness. Note: examples given below assume that AB4 did not have the commands edit, undo, and redo.

      • unable to judge: You are unable to judge this aspect for some reason.
      • low : An easy feature  e.g. make the existing find command case insensitive.
      • medium : Moderately difficult feature, barely acceptable implementation  e.g. an edit command that requires the user to type all fields, even the ones that are not being edited.
      • high: One of the below
        • A moderately difficult feature but fully implemented  e.g. an edit command that allows editing any field.
        • A difficult feature with a reasonable implementation but some aspects are not covered  e.g. an undo/redo command that only allows a single undo/redo.
      • excellent: A difficult feature, all reasonable aspects are fully implemented  e.g. an undo/redo command that allows multiple undo/redo.
    • E. Amount of work []: Evaluate the amount of work, on a scale of 0 to 30.

      • Consider this PR (history command) as 5 units of effort, which means this PR (undo/redo command) is about 15 units of effort. Given that 30 units corresponds to twice the effort needed for the undo/redo feature (which was given as an example of an A grade project), we expect most students to have efforts lower than 20.
      • Consider the main feature only. Exclude GUI inputs, but consider GUI outputs of the feature. Count all implementation/testing/documentation work as mentioned in that person's PPP. Also look at the actual code written by the person. We understand that it is not possible to know exactly which part of the code is for the main feature; make a best-guess judgement call based on the available info.
      • Do not give a high value just to be nice. If your estimate is wildly inaccurate, it means you are unable to estimate the effort required to implement a feature in a project that you are supposed to know well at this point. You will lose marks if that is the case.

Processing PE Bug Reports:

There will be a review period for you to respond to the bug reports you received.

Duration: The review period will start around 1 day after the PE (exact time to be announced) and will last until the following Wednesday midnight. However, you are recommended to finish this task ASAP, to minimize cutting into your exam preparation work.

Bug reviewing is recommended to be done as a team as some of the decisions need team consensus.

Instructions for Reviewing Bug Reports

  • First, don't freak out if there are a lot of bug reports. Many can be duplicates and some can be false positives. In any case, we anticipate that all of these products will have some bugs and our penalty for bugs is not harsh. Furthermore, the penalty depends on the severity of the bug; some bugs may not even be penalized.

  • Do not edit the subject or the description. Do not close bug reports. Your response (if any) should be added as a comment.

  • If the bug is reported multiple times, mark all copies EXCEPT one as duplicates using the duplicate tag (if the duplicates have different severity levels, you should keep the one with the highest severity). In addition, use this technique to indicate which issue they are duplicates of. Duplicates can be omitted from processing steps given below.

  • If a bug seems to be for a different product (i.e. wrongly assigned to your team), let us know (email prof).

  • Decide if it is a real bug and apply ONLY one of these labels.

Response Labels:

  • response.Accepted: You accept it as a bug.
  • response.Rejected: What tester treated as a bug is in fact the expected behavior. The penalty for rejecting a bug using an unjustifiable explanation is higher than the penalty if the same bug was accepted. You can reject bugs that you inherited from AB4.
  • response.CannotReproduce: You are unable to reproduce the behavior reported in the bug after multiple tries.
  • response.IssueUnclear: The issue description is not clear.
  • If applicable, decide the type of bug. Bugs without a type- label are considered type-FunctionalityBug by default (which are liable to a heavier penalty):

Bug Type Labels:

  • type-FunctionalityBug : the bug is a flaw in how the product works.
  • type-DocumentationBug : the bug is in the documentation.
  • If you disagree with the original severity assigned to the bug, you may change it to the correct level, in which case add a comment justifying the change. All such changes will be double-checked by the teaching team and unreasonable lowering of severity will be penalized extra.

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users, i.e., it makes the product almost unusable for most users.
  • Decide who should fix the bug. Use the Assignees field to assign the issue to that person(s). There is no need to actually fix the bug though. It's simply an indication/acceptance of responsibility. If there is no assignee, we will distribute the penalty for that bug (if any) among all team members.

  • Add an explanatory comment explaining your choice of labels and assignees.

  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests.
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you do it the normal way, you often find that it is hard to test the functional code because the code has low testability.

Evaluates: How good are the sections you wrote for the user guide and the developer guide?

Based on: the relevant sections of your project portfolio. Criteria considered:

  • Explanation should be clear and written to match the audience.
  • Good use of visuals to complement text.
  • Use of correct UML notations (where applicable)

A. Process:

Evaluates: How well you did in project management related aspects of the project, as an individual and as a team

Based on: Supervisor observations of project milestones and GitHub data.

Milestones need to be reached by the midnight before the tutorial for them to be counted as achieved. To get a good grade for this aspect, achieve at least 60% of the recommended milestone progress.

Other criteria:

  • Good use of GitHub milestones
  • Good use of GitHub release mechanism
  • Good version control, based on the repo
  • Reasonable attempt to use the forking workflow
  • Good task definition, assignment and tracking, based on the issue tracker
  • Good use of buffers (opposite: everything at the last minute)
  • Project done iteratively and incrementally (opposite: doing most of the work in one big burst)

B. Team-based tasks:

Evaluates: how much you contributed to common team-based tasks

Based on: peer evaluations and tutor observations

Relevant: [Admin Project Scope → Examples of team tasks ]

 

Here is a non-exhaustive list of team-tasks:

  1. Necessary general code enhancements e.g.,
    1. Work related to renaming the product
    2. Work related to changing the product icon
    3. Morphing the product into a different product
  2. Setting up the GitHub repo, Travis, AppVeyor, etc.
  3. Maintaining the issue tracker
  4. Release management
  5. Updating user/developer docs that are not specific to a feature  e.g. documenting the target user profile
  6. Incorporating more useful tools/libraries/frameworks into the product or the project workflow (e.g. automate more aspects of the project workflow using a GitHub plugin)



Exams

There is no midterm.

The final exam has two parts:

  • Part 1: MCQ questions (1 hour, 20 marks)
  • Part 2: Essay questions (1 hour, 20 marks)

Both papers will be given to you at the start but you need to answer Part 1 first (i.e. the MCQ paper). It will be collected 1 hour after the exam start time (even if you arrived late for the exam). You are free to start Part 2 early if you finish Part 1 early.

Final Exam: Part 1 (MCQ)

Each MCQ question gives you a statement to evaluate.

An example statement

Testing is a Q&A activity

Unless stated otherwise, the meanings of the answer options are:
A: Agree. If the question has multiple statements, agree with all of them.
B: Disagree. If the question has multiple statements, disagree with at least one of them
C, D, E: Not used

Number of questions: 100

Note that you have slightly more than ½ minute for each question, which means you need to go through the questions fairly quickly.

Given the fast pace required by the paper, to be fair to all students, you will not be allowed to clarify doubts about questions (in Part 1) by talking to invigilators.

  • If a question is not clear, you can circle the question number in the question paper and write your doubt in the question paper, near that question.
  • If your doubt is justified (e.g. there is a typo in the question) or if many students found the question to be unclear, the examiner may decide to omit that question from grading.

Questions in Part 1 are confidential. You are not allowed to reveal Part 1 content to anyone after the exam. All pages of the assessment paper are to be returned at the end of the exam.

You will be given OCR forms (i.e., bubble sheets) to indicate your answers for Part 1. As each OCR form can accommodate only 50 answers, you will be given 2 OCR forms. Indicate your student number in both OCR forms.

To save space, we use the following notation in MCQ questions: [x | y | z] means ‘x and z, but not y’.

SE is [boring | useful | fun] means SE is not boring AND SE is useful AND SE is fun.

Consider the following statement:

  • IDEs can help with [writing | debugging | testing] code.

The correct response for it is Disagree because IDEs can help with all three of the given options, not just writing and testing.

Some questions will use underlines or highlighting to draw your attention to a specific part of the question. That is because those parts are highly relevant to the answer and we don’t want you to miss the relevance of that part.

Consider the statement below:

Technique ABC can be used to generate more test cases.

The word can is underlined because the decision you need to make is whether ABC can or cannot be used to generate more test cases; the decision is not whether ABC can be used to generate more or better test cases.

Markers such as the one given below appear at the left margin of the paper to indicate where a question corresponds to a new column in the OCR form, e.g. questions 11, 21, 31, etc. (a column has 10 questions). Such markers can help you detect if you missed a question among the previous 10 questions. You can safely ignore those markers if you are not interested in making use of that additional hint.


Some questions have tags e.g., the question below has a tag JAVA. These tags provide additional context about the question. In the example below, the tag indicates that the code given in the question is Java code.


The exam paper is open-book: you may bring any printed or written materials to the exam in hard copy format. However, given the fast pace required by Part 1, you will not have time left to refer to notes during that part of the exam.

💡 Mark the OCR form as you go, rather than planning to transfer your answers to the OCR form near the end.  Reason: Given there are 100 questions, it will be hard to estimate how much time you need to mass-transfer all answers to OCR forms.

💡 Write the answer in the exam paper as well when marking it in the OCR form.  Reason: It will reduce the chance of missing a question. Furthermore, in case you missed a question, it will help you correct the OCR form quickly.

💡 We have tried to avoid deliberately misleading/tricky questions. If a question seems to take a very long time to figure out, you are probably over-thinking it.

You will be given a practice exam paper to familiarize yourself with this slightly unusual exam format.

Final Exam: Part 2 (Essay)

Unlike in Part 1, you can ask invigilators for clarifications if you find a question to be unclear in Part 2.

Yes, you may use pencils when answering part 2.



Participation Marks

10 marks allocated for participation can be earned in the following ways (there are 30+ available marks to choose from):

  • Good peer ratings
    • Criteria for professional conduct (1 mark for each criterion, max 7)
    • Competency criteria (2 marks for each, max 6)
  • Quizzes
    • In-lecture quizzes (1 each, max 10 marks)
    • Post-lecture quizzes (0.5 each, max 5 marks)
  • Module admin tasks done on time and as instructed
    • Peer evaluations (1 mark each, max 3)
    • Pre-module survey (1 mark)
  • Enhanced AB1-AB3 (2 marks each, max 6 marks)

Relevant: [Admin Peer Evaluations → Criteria ]

 

Peer evaluation criteria: professional conduct

  • Professional Communication :
    • Communicates sufficiently and professionally. e.g. Does not use offensive language or excessive slang in project communications.
    • Responds to communication from team members in a timely manner (e.g. within 24 hours).
  • Punctuality: Does not cause others to waste time or slow down project progress by frequent tardiness.
  • Dependability: Promises what can be done, and delivers what was promised.
  • Effort: Puts in sufficient effort to, and tries their best to keep up with the module/project pace. Seeks help from others when necessary.
  • Quality: Does not deliver work products that seem to be below the student's competence level, i.e. tries their best to make the work product as high quality as possible within their competency level.
  • Meticulousness:
    • Rarely overlooks submission requirements.
    • Rarely misses compulsory module activities such as pre-module survey.
  • Teamwork: How willing are you to act as part of a team, contribute to team-level tasks, adhere to team decisions, etc.

Peer evaluation criteria: competency

  • Technical Competency: Able to gain competency in all the required tools and techniques.
  • Mentoring skills: Helps others when possible. Able to mentor others well.
  • Communication skills: Able to communicate (written and spoken) well. Takes initiative in discussions.



Appendices


A: Module Principles

These are some of the main principles underlying the module structure.

The product is you, NOT what you build.

The software product you build is a side effect only. You are the product of this module. This means,

  • We may not take the most efficient route to building the software product. We take the route that allows you to learn the most.
  • Building a software product that is unique, creative, and shiny is not our priority (although we try to do a bit of that too). Learning to take pride in, and discovering the joy of, high quality software engineering work is our priority.

Following from that, we evaluate you not just on how much you've done, but also on how well you've done those things. Here are some of the aspects we focus on:

We appreciate: Ability to deal with low-level details
But we value more: Ability to abstract over details, generalize, see the big picture

We appreciate: A drive to learn latest and greatest technologies
But we value more: Ability to make the best of given tools

We appreciate: Ability to find problems that interest you and solve them
But we value more: Ability to solve the given problem to the best of your ability

We appreciate: Ability to burn the midnight oil to meet a deadline
But we value more: Ability to schedule work so that the need for 'last minute heroics' is minimal

We appreciate: Preference to do things you like or things you are good at
But we value more: Ability to buckle down and deliver on important things that you don't necessarily like or aren't good at

We appreciate: Ability to deliver desired end results
But we value more: Ability to deliver in a way that shows how well you delivered (i.e. visibility of your work)

We learn together, NOT compete against each other.

You are not in a competition. Our grading is not forced on a bell curve.

Learn from each other. That is why we open-source your submissions.

Teach each other, even those in other teams. Those who do it well can become tutors next time.

Continuously engage, NOT last minute heroics.

We want to train you to do software engineering in a steady and repeatable manner that does not require 'last minute heroics'.

In this module, last minute heroics will not earn you a good project grade, and last minute mugging will not earn you a good exam grade.

Where you reach at the end matters, NOT what you knew at the beginning.

When you start the module, some others in the class may appear to know a lot more than you. Don't let that worry you. The final grade depends on what you know at the end, not what you knew to begin with. All marks allocated to intermediate deliverables are within the reach of everyone in the class irrespective of their prior knowledge.



B: Module Policies

Policy on following instructions

When working with others, especially in a large class such as CS2103/T,  it is very important that you adhere to standards, policies, and instructions imposed on everyone. Not doing so creates unnecessary headaches for everyone and puts your work attitude in a negative light. That is why we penalize repeated violations of instructions. On the other hand we do understand that humans are liable to make mistakes. That is why we only penalize repeated or frequent mistakes.

Policy on grading smaller/larger teams

As most of the work is graded individually, team sizes of 3, 4, or 5 are not expected to affect your grade. While managing larger teams is harder, larger teams also have more collective know-how, and the two factors can cancel each other out.

Policy on project work distribution

As most of the work is graded individually, it is OK to do less or more than equal share in your project team.

Related: [Admin: Project: Scope]

Policy on absence due to valid reasons (e.g. MC, LOA, University events)

There is no need to inform us. If you miss a lecture/tutorial for a valid reason, just do your best to catch up. We'll be happy to help if required. An occasional absence or two is not expected to affect your participation marks. Only if you fail to earn full marks for participation will we consider giving you an alternative avenue to earn the marks missed due to the absences.

Policy on email response time

Normally, the prof will respond within 24 hours if it was an email sent to the prof or a forum post directed at the prof. If you don't get a response within that time, please feel free to remind the prof. It is likely that the prof did not notice your post or the email got stuck somewhere.

Similarly we expect you to check email regularly and respond to emails written to you personally (not mass email) promptly.

Not responding to a personal email is a major breach of professional etiquette (and general civility). Imagine how pissed off you would be if you met the prof along the corridor, said 'Hi prof, good morning' and the prof walked away without saying anything back. Not responding to a personal email is just as bad. Always take a few seconds to at least acknowledge such emails.  It doesn't take long to type "Noted. Thanks" and hit 'send'.

The promptness of a reply is even more important when the email is requesting something that you cannot provide. Imagine you wrote to the prof requesting a reference letter and the prof did not respond at all because he/she did not want to give you one; you'd be quite frustrated because you wouldn't know whether to look for another prof or wait longer for a response. Saying 'No' is fine and, in fact, a necessary part of professional life; saying nothing is not acceptable. If you don't reply, the sender will not even know whether you received the email.

Policy on tech help

Do not expect your tutor to code or debug for you. We strongly discourage tutors from giving technical help directly to their own teams because we want to train you in troubleshooting tech problems yourselves. Allowing direct tech help from tutors transfers the troubleshooting responsibility to tutors.

It is OK to ask for help from classmates, even for assignments and even from other teams, as long as you don't copy others' work and submit it as your own. It doesn't matter who is helping you as long as you are learning from it.

We encourage you to give tech help to each other, but do it in a way that the other person learns from it.

Related: [Admin: Appendix D: Getting Help]

Policy on publishing submissions

The source code is publicly available and may be reused by others without any restrictions.
Is publishing submissions unfair to the submitting team? We don't think so. If you were the first to think of something your peers are willing to adopt later, that means you are already ahead of them, and they are unlikely to earn more marks by adopting your ideas.

Policy on plagiarism

We encourage sharing, but you should share with everyone in the class, not just a selected group. That is why:

  • You are not allowed to share individual assignments with classmates directly.
  • You are not allowed to share project-related things with other teams directly.

You can even reuse each other's work subject to the 'reuse policy' given below.

If you submit code (or adopt ideas) taken from elsewhere, you need to comply with our reuse policy.

Detection:

  • Detecting plagiarism in code is quite easy. You are not fooling anyone by reordering code or renaming methods/variables.
  • As all your work is publicly visible on GitHub, sooner or later somebody will notice the plagiarism.

Penalties:

  • For submissions not affecting marks: We make a record of cases of plagiarism but we do not take further action. Such plagiarism does not disadvantage other students. Therefore, we prefer to spend all available resources on helping honest students to do better rather than to chase after dishonest students. If you think you gain something by plagiarizing, go ahead and do it. It's your choice and it's your loss.
  • For the final project/exam: Any case of claiming others' work as yours will be reported to the university for disciplinary action.

Policy on reuse

Reuse is encouraged. However, note that reuse has its own costs (such as the learning curve, additional complexity, usage restrictions, and unknown bugs). Furthermore, you will not be given credit for work done by others. Rather, you will be given credit for using work done by others.

  • You are allowed to reuse work from your classmates, subject to the following conditions:
    • The work has been published by us or the authors.
    • You clearly give credit to the original author(s).
  • You are allowed to reuse work from external sources, subject to the following conditions:
    • The work comes from a source of 'good standing' (such as an established open source project). This means you cannot reuse code written by an outside 'friend'.
    • You clearly give credit to the original author. Acknowledge use of third party resources clearly e.g. in the welcome message, splash screen (if any) or under the 'about' menu. If you are open about reuse, you are less likely to get into trouble if you unintentionally reused something copyrighted.
    • You do not violate the license under which the work has been released. Please  do not use 3rd-party images/audio in your software unless they have been specifically released to be used freely. Just because you found it in the Internet does not mean it is free for reuse.
    • Always get permission from us before you reuse third-party libraries. Please post your 'request to use 3rd party library' in our forum. That way, the whole class gets to see what libraries are being used by others.

Giving credit for reused work

Given below is how to give credit for things you reuse from elsewhere. These requirements are specific to this module, i.e., they are not applicable outside the module (outside the module, you should follow the rules specified by your employer and the license of the reused work).

If you used a third party library:

  • Mention it in the README.adoc (under the Acknowledgements section).
  • Mention it in the Project Portfolio Page if the library has significant relevance to the features you implemented.

If you reused code snippets found on the Internet (e.g., from StackOverflow answers), or
referred to code in another software, or
referred to project code by a current/past student:

  • If you read the code to understand the approach and implemented it yourself, mention it as a comment
    Example:
    //Solution below adapted from https://stackoverflow.com/a/16252290
    {Your implementation of the reused solution here ...}
    
  • If you copy-pasted a non-trivial code block (possibly with minor modifications e.g., renaming, layout changes, changes to comments, etc.), also mark the code block as reused code (using @@author tags)
    Format:
    //@@author {yourGithubUsername}-reused
    //{Info about the source...}
    
    {Reused code (possibly with minor modifications) here ...}
    
    //@@author
    
    Example of reusing a code snippet (with minor modifications):
    persons = getList();
    //@@author johndoe-reused
    //Reused from https://stackoverflow.com/a/34646172 with minor modifications
    Collections.sort(persons, new Comparator<CustomData>() {
        @Override
        public int compare(CustomData lhs, CustomData rhs) {
            return lhs.customInt > rhs.customInt ? -1 : (lhs.customInt < rhs.customInt) ? 1 : 0;
        }
    });
    //@@author
    return persons;
    
 

Adding @@author tags to indicate authorship

  • Mark your code with a //@@author {yourGithubUsername} tag. Note the double @.
    The //@@author tag indicates the beginning of the code you wrote. The code up to the next //@@author tag or the end of the file (whichever comes first) will be considered as written by that author. Here is a sample code file:

    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author sarahkhoo
    method 3 ...
    //@@author johndoe
    method 4 ...
    
  • If you don't know who wrote the code segment below yours, you may put an empty //@@author (i.e., no GitHub username) to indicate the end of the code segment you wrote. The author of the code below yours can add the GitHub username to the empty tag later. Here is a sample code file with an empty author tag:

    method 0 ...
    //@@author johndoe
    method 1 ...
    method 2 ...
    //@@author
    method 3 ...
    method 4 ...
    
  • The author tag syntax varies based on the file type e.g., java, css, fxml. Use the corresponding comment syntax for non-Java files; an fxml example and a css example are given below.
    Here is example code from an xml/fxml file:

    <!-- @@author sereneWong -->
    <textbox>
      <label>...</label>
      <input>...</input>
    </textbox>
    ...
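
    A css example would be similar; a hypothetical sketch (the selector and property are illustrative, using css block comments since css has no line comments):

    /* @@author sereneWong */
    .cell_big_label {
      -fx-font-size: 16px;
    }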
    
  • Do not put the //@@author inside java header comments.
    👎

    /**
      * Returns true if ...
      * @@author johndoe
      */
    

    👍

    //@@author johndoe
    /**
      * Returns true if ...
      */
    

What to and what not to annotate

  • Annotate both functional and test code. There is no need to annotate documentation files.

  • Annotate only significant-sized code blocks that can be reviewed on their own e.g., a class, a sequence of methods, a method.
    Claiming credit for code blocks smaller than a method is discouraged but allowed. If you do, do it sparingly and claim only meaningful blocks of code such as a block of statements, a loop, or an if-else statement.

    • If an enhancement required you to do tiny changes in many places, there is no need to annotate all those tiny changes; you can describe those changes in the Project Portfolio page instead.
    • If a code block was touched by more than one person, either let the person who wrote most of it (e.g. more than 80%) take credit for the entire block, or leave it as 'unclaimed' (i.e., no author tags).
    • Related to the above point, if you claim a code block as your own, more than 80% of the code in that block should have been written by yourself. For example, no more than 20% of it can be code you reused from somewhere.
    • 💡 GitHub has a blame feature and a history feature that can help you determine who wrote a piece of code.
  • Do not try to boost the quantity of your contribution using unethical means such as duplicating the same code in multiple places. In particular, do not copy-paste test cases to create redundant tests. Even repetitive code blocks within test methods should be extracted out as utility methods to reduce code duplication. Individual members are responsible for making sure the code attributed to them is correct. If you notice a team member claiming credit for code that he/she did not write or using other questionable tactics, you can email us (after the final submission) to let us know.

  • If you wrote a significant amount of code that was not used in the final product,

    • Create a folder called {project root}/unused
    • Move unused files (or copies of files containing unused code) to that folder
    • Use //@@author {yourGithubUsername}-unused to mark unused code in those files (note the suffix unused) e.g.
    //@@author johndoe-unused
    method 1 ...
    method 2 ...
    

    Please put a comment in the code to explain why it was not used.

  • If you reused code from elsewhere, mark such code as //@@author {yourGithubUsername}-reused (note the suffix reused) e.g.

    //@@author johndoe-reused
    method 1 ...
    method 2 ...
    
  • You can use empty @@author tags to mark code as not yours when RepoSense attributes it to you incorrectly.

    • Code generated by the IDE/framework should not be annotated as your own.

    • Code you modified in minor ways (e.g., adding a parameter) should not be claimed as yours, but you can mention such additional contributions in the Project Portfolio Page if you want to claim credit for them.

 

At the end of the project each student is required to submit a Project Portfolio Page.

  • Objective:

    • For you to use  (e.g. in your resume) as a well-documented data point of your SE experience
    • For us to use as a data point to evaluate your:
      • contributions to the project
      • documentation skills
  • Sections to include:

    • Overview: A short overview of your product to provide some context to the reader.

    • Summary of Contributions:

      • Code contributed: Give a link to your code on Project Code Dashboard, which should be https://nus-cs2103-ay1819s1.github.io/cs2103-dashboard/#=undefined&search=githbub_username_in_lower_case (replace githbub_username_in_lower_case with your actual username in lower case e.g., johndoe). This link is also available in the Project List Page -- linked to the icon under your photo.
      • Main feature implemented: A summary of the main feature (the so-called major enhancement) you implemented
      • Other contributions:
        • Other minor enhancements you did which are not related to your main feature
        • Contributions to project management e.g., setting up project tools, managing releases, managing issue tracker etc.
        • Evidence of helping others e.g., responses you posted in our forum, bugs you reported in other teams' products
        • Evidence of technical leadership e.g. sharing useful information in the forum
    • Contributions to the User Guide: Reproduce the parts in the User Guide that you wrote. This can include features you implemented as well as features you propose to implement.
      The purpose of allowing you to include proposed features is to provide you more flexibility to show your documentation skills. e.g. you can bring in a proposed feature just to give you an opportunity to use a UML diagram type not used by the actual features.

    • Contributions to the Developer Guide: Reproduce the parts in the Developer Guide that you wrote. Ensure there is enough content to evaluate your technical documentation skills and UML modelling skills. You can include descriptions of your design/implementations, possible alternatives, pros and cons of alternatives, etc.

    • If you plan to use the PPP in your Resume, you can also include your SE work outside of the module (will not be graded)

  • Format:

    • File name: docs/team/githbub_username_in_lower_case.adoc e.g., docs/team/johndoe.adoc

    • Follow the example in the AddressBook-Level4, but ignore the following two lines in it.

      • Minor enhancement: added a history command that allows the user to navigate to previous commands using up/down keys.
      • Code contributed: [Functional code] [Test code] {give links to collated code files}
    • 💡 You can use the Asciidoc's include feature to include sections from the developer guide or the user guide in your PPP. Follow the example in the sample.

    • It is assumed that all contents in the PPP were written primarily by you. If any section was written by someone else (e.g., someone else described the feature in the User Guide but you implemented it), clearly state that the section was written by someone else (e.g., Start of Extract [from: User Guide] written by Jane Doe). Reason: your writing skills will be evaluated based on the PPP.

    • Page limit: If you have more content than the limit given below, shorten (or omit some content) so that you do not exceed the page limit. Having too much content in the PPP will be viewed unfavorably during grading. Note: the page limits given below apply after converting to PDF format. The amount of content needed to fill these pages is less than the numbers suggest, because the HTML → PDF conversion adds a lot of spacing around content.

      Content Limit (pages)
      Overview + Summary of contributions 0.5-1
      Contributions to the User Guide 1-3
      Contributions to the Developer Guide 3-6
      Total 5-10

Policy on help from outsiders

In general, you are not allowed to involve outsiders in your project, other than your team members and the teaching team. However, it is OK to give your product to others for the purpose of getting voluntary user feedback. It is also OK to learn from others as long as they don't do your project work for you.

Policy on suggested length for submissions

We don't usually give a strict page limit for documents such as the User Guide and the Developer Guide. You need to decide for yourself how long the document should be, based on its purpose and intended audience. You can determine the level of detail required based on the samples we provide.



C: Frequently Asked Questions

Where is everything?

The Schedule page presents all you need to know in chronological order, while the other pages have some of the same content organized by topic.

The Schedule page is the one page you need to refer to weekly. Although there is a lot of content in the Admin Info page and the Textbook page -- which you are welcome to read on those respective pages -- the same content is also embedded in the relevant weeks of the Schedule page. Embedded extracts usually appear in expandable panels and can be identified by a special symbol in the panel title.

What are the differences between the T and the non-T version of the module?

Same lectures, same exam. Separate tutorials, separate project grading. Unless specified otherwise, whatever is stated for one module applies to the other.

Why is the workload so high?

CS2103/T prepares you for many higher-level project modules (CS3216/7, CS3203, CS3281/2, etc.), each requiring a slightly different skill set. It is also the only SE module some of you do before going for industry internships. Therefore, we have to cover many essential SE concepts/skills and also provide enough exercises for you to practice those skills. This is also why we don't have time to go very deep into any of the topics.

Remember, everything you learn here is going to be useful in a SE-related career.

Also, consider this a gradual introduction to 'heavy' modules; most project modules you do after this are going to be much heavier 😛

How to reduce the workload? You can omit the lower-rated Learning Outcomes. Furthermore, you can control the project workload by spending no more than a fixed amount of time weekly on the project (e.g., 1 day).

What are the extra requirements to get an A+?

In CS2103/T, A+ is not given simply based on the final score. To get an A+ you should,

  • score enough to get an A
  • be considered technically competent by peers and tutor (based on peer evaluations and tutor observations)
  • be considered helpful by peers (based on peer evaluations and tutor observations)
    • In particular, you are encouraged to be active on the Slack channel and our forum, giving your inputs to ongoing discussions so that other students can benefit from the higher expertise that makes you deserving of an A+.
    • Whenever you can, go out of your way to review PRs created by other team members.

Why so much bean counting?

Sometimes, small things matter in big ways. e.g., all other things being equal, a job may be offered to the candidate who has the neater looking CV although both have the same qualifications. This may be unfair, but that's how the world works. Students forget this harsh reality when they are in the protected environment of the school and tend to get sloppy with their work habits. That is why we reward all positive behavior, even small ones (e.g., following precise submission instructions, arriving on time etc.).

But unlike the real world, we are forgiving. That is why you can still earn the full 10 participation marks even if you miss a few things here and there.

Related article: This Is The Personality Trait That Most Often Predicts Success (this is why we reward things like punctuality).

Why do you force me to visit a separate website instead of using IVLE?

We have a separate website because some of the module information does not fit into the structure imposed by IVLE.

On a related note, keep in mind that 'hunting and gathering' of relevant information is one of the skills you need to survive 'in the wild'. Do not always expect all relevant materials to appear 'magically' in some kind of 'work bin'.

Why are the slides not detailed?

Slides are not meant to be documents to print and study for exams. Their purpose is to support the lecture delivery and keep you engaged during the lecture. That's why our slides are less detailed and more visual.

Why so much self-study?

Self-study is a critical survival skill in the SE industry. Lectures will show you the way, but absorbing the content is to be done at your own pace, by yourself. In this module, we still tell you what content to study and also provide most of that content to you. After you graduate, you will have to decide what to study and find your own content too.

What if I don’t carry around a laptop?

If you do not have a laptop or prefer not to bring the laptop, it is up to you to show your work to the tutor in some way (e.g. by connecting to your home PC remotely), without requiring extra time/effort from the tutor or team members.

Reason: as you enjoy the benefits of not bringing a laptop, you (not others) should bear the cost too.

Why such a narrow project scope?

Defining your own unique project is more fun.

But a wider scope → more diverse projects → harder for us to go deep into your project. The collective know-how we (i.e., students and the teaching team) have built up about SE issues related to the project becomes shallow and stretched too thin. It also affects the fairness of grading.

That is why a strictly-defined project is more suitable for a first course in SE that focuses on nuts-and-bolts of SE. After learning those fundamentals, in higher level project modules you can focus more on the creative side of software projects without being dragged down by nuts-and-bolts SE issues (because you already know how to deal with them). However, we would like to allow some room for creativity too. That is why we let you build products that are slight variations of a given theme.

Also note: The freedom to do 'anything' is not a necessary condition for creativity. Do not mistake being different for being creative. In fact, the more constrained you are, the more you need creativity to stand out.

Why are the project requirements so vague?

"You tell me exactly what to do - I do that - you pay me (in grades)" is a model for contract work, not for learning. Being able to survive in imprecise, uncertain, volatile problem contexts is precisely what we are trying to teach you.

For example, the best way to communicate something often depends on what is being communicated. That is why we don't specify the precise content for project documents. Instead, we aim to refine project documents iteratively. We believe the learning experience will be richer if we let you decide the best way to present your project information rather than just following our instructions blindly. For example, in real-life projects you are rarely told which diagrams to draw; that is a decision you have to make yourself.

Why am I not allowed to use my favorite tool/framework/language, etc.?

We have chosen a basic set of tools after considering ease of learning, availability, typical-ness, popularity, migration path to other tools, etc. There are many reasons for limiting your choices:

Pedagogical reasons:

  • Sometimes 'good enough', not necessarily the best, tools are a better fit for beginners: Most bleeding edge, most specialized, or most sophisticated tools are not suitable for a beginner course. After mastering our toolset, you will find it easy to upgrade to such high-end tools by yourself. We do expect you to eventually (after this module) migrate to better tools and, having learned more than one tool, to attain a more general understanding about a family of tools.
  • We want you to learn to thrive under the given conditions: As a professional software engineer, you must learn to be productive in any given tool environment, rather than insist on using your preferred tools. It is usually in small companies doing less important work that you get to choose your own toolset. Bigger companies working on mature products often impose some choices on developers, such as the project management tool, code repository, IDE, language, etc. For example, Google used SVN as their revision control software until very recently, long after SVN fell out of popularity among developers. Sometimes this is due to cost reasons (tool licensing costs), and sometimes due to legacy reasons (because the tool is already entrenched in their code base).
    While programming in school is often a solo sport, programming in the industry is a team sport. As we are training you to become professional software engineers, it is important to get over the psychological hurdle of needing to satisfy individual preferences and get used to making the best of a given environment.

Practical reasons:

  • Some of the LOs are tightly coupled to tools. Allowing more tools means tutors need to learn more tools, which increases their workload.
  • We provide learning resources for tools. e.g. 'Git guides'. Allowing more tools means we need to produce more resources.
  • When all students use the same tool, the collective expertise in that tool is greater, increasing the opportunities for you to learn from each other.

Meanwhile, feel free to share with peers your experience of using other tools.

Why so many submissions?

The high number of submissions is not meant to increase the workload but to spread it across the semester. Learning theory and applying it should be done in parallel to maximize the learning effect. That can happen only if we spread theory and the 'application of theory' (i.e., project work) evenly across the semester.

Why aren't we allowed to build a new product from scratch?

There are many reasons. One of them is that most of you will be working with existing software in the first few years of your career, while hardly any school projects train you to work with existing code bases. We decided to bite the bullet and use CS2103/T to train you to work in existing code bases.

Why submission requirements differ between CS2103/T and CS2101?

They do, and they should.

CS2103T communication requirements are limited to a very narrow scope (i.e., communicate about the product to users and developers). CS2101 aims to teach you technical communication in a much wider context. While you may be able to reuse some of the stuff across the two modules, submissions are not intended to be exactly the same.



D: Getting help in this module

This guide is mostly about getting tech help, but it also applies to getting clarifications on module topics, e.g., what is the difference between refactoring and rewriting?


We want to move you away from 'hand holding' and make you learn how to solve problems on your own. This is a vital survival skill in the industry and it needs practice.

Whether it is a technical problem (e.g., an error when using the IDE) or a doubt about a concept (e.g., what is the difference between scripted testing and exploratory testing?), the teaching team is happy to work with you while you look for a solution/answer, but we do not do it for you. We discourage unconditional direct help from tutors because we want you to learn to help yourself. Yes, we believe in 'tough love' 😝.

The question you should always ask yourself is, 'how do I solve this problem if the lecturer/tutors are not around to help me?'


What not to do:

  • When faced with a technical problem or a doubt about a concept, don't fire off an email to the lecturer/tutor immediately, unless it is something only the lecturer/tutor is supposed to know.

What to do:

  • Check what is given: Check if the problem/concept has been discussed in the lectures, textbook, or the list of resources given to you. Yes, it is easier for you to write an email to the tutor/lecturer instead, but that shouldn't be your default behavior. We know that sometimes it is difficult to find stuff in the resources we have provided. But you should try first.

  • Search: It is very likely the answer already exists somewhere in cyberspace. Almost every programming-related question has been answered in places like stackoverflow. Don't give anyone an opportunity to ask you to STFW.
    Pay attention to the error message you encounter. Sometimes it contains hints as to how to fix the problem. Even if not, a web search on the error message is a good starting point.

  • Ask peers:

    Ask your team members.

    Ask classmates using the module forum or the slack channel. Even if you figured out one way to solve a problem, discussing it on a public forum might lead you to better ways of solving it, and will help other classmates who are facing similar problems too. If you are really shy to ask questions in the forum, you may use this form to submit your question anonymously which we will then post in the forum.


    Rubber duck debugging is an informal term used in software engineering to refer to a method of debugging code. The name is a reference to a story in the book The Pragmatic Programmer in which a programmer would carry around a rubber duck and debug his code by forcing himself to explain it, line-by-line, to the duck.

    [for more, see wikipedia entry]

  • Ask the world using programming forums such as stackoverflow.

    Here are some tips for posting a help request:

    • PLEASE search for existing answers before you post your question in those public forums; You don't want to appear as a 'clueless' or 'too lazy to do your research' person in a public forum.

    • Learn to isolate the problem. "My code doesn't work" isn't going to help even if you post the whole code online. Others don't have time to go through all of your code. Isolate the part that doesn't work and strip it down to the bare minimum that is enough to reproduce the error (see the stripped-down example after these tips). Sometimes, this process actually helps you figure out the problem yourself. If not, at least it increases the chance of someone else being able to help you.

      💡 How to isolate problematic code? Delete code (one bit at a time) that is confirmed as not related to the problem. Do that until only the least amount of code that can still reproduce the problem remains.

    • Generalize the problem. "How to write tasks to a text file using Java" is too specific to what you are working on. You are more likely to find help if you post a thread called (or search for) "How to write to a file using Java".

    • Explain well. Conversations via online forums take time. If you post everything that is relevant to your problem, your chances of getting an answer on the first try are higher. If others have to ask you more questions before they can help you, it will take longer. But this doesn't mean you should dump too much information into the thread either.

      💡 Know what these stand for: RTFM, STFW, GIYF
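
    For example, a stripped-down repro often ends up being just a few lines. A hypothetical Java sketch (the date pattern and input are illustrative):

      import java.time.LocalDate;
      import java.time.format.DateTimeFormatter;

      public class Repro {
          public static void main(String[] args) {
              // Throws DateTimeParseException: the dd-MM-yyyy pattern expects a two-digit day and month
              System.out.println(LocalDate.parse("1-1-2018", DateTimeFormatter.ofPattern("dd-MM-yyyy")));
          }
      }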

  • Raise your question during a tutorial. Some questions can be discussed with the tutor and tutorial-mates. What kind of questions are suitable to discuss with the tutor? Consider these two questions you might want to ask a tutor:
    • Good: "This is how I understood/applied coupling. Is that correct?" - Such questions are welcome. Reason: this question shows you have put in some effort to learn the topic and are seeking further clarification from the tutor.
    • Bad: "What is coupling?" - Such questions are discouraged. Reason: this question implies you haven't done what you could to learn the topic in concern.
  • Talk to the lecturer before or after the lecture. The lecturer will be at the lecture venue from 30 minutes before the start of the lecture.

  • Request our help: Failing all of the above, you can always request help by emailing the lecturer.

Resources



E: Using GitHub

Creating a GitHub account

Create a personal GitHub account if you don't have one yet.

  1. You are advised to choose a sensible GitHub username as you are likely to use it for years to come in professional contexts.

  2. Strongly recommended: Complete your GitHub profile. In particular,

    • Specify your full name.
    • Upload a suitable profile photo (i.e. a recent photo of your face).

    The GitHub profile is useful for the tutors and classmates to identify you. If you are reluctant to share your info in your long-term GitHub account, you can remove those details after the module is over or create a separate GitHub account just for the module.

Setting Git Username to Match GitHub Username

We use various tools to analyze your code. For us to be able to identify your commits, you should use the GitHub username as your Git username as well. If there is a mismatch, or if you use multiple user names for Git, our tools might miss some of your work and as a result you might not get credit for some of your work.

On each computer you use for coding, after installing Git, you should set the Git username as follows.

  1. Open a command window that can run Git commands (e.g., Git bash window)
  2. Run the command git config --global user.name YOUR_GITHUB_USERNAME
    e.g., git config --global user.name JohnDoe

More info about setting Git username is here.
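
If you want to double-check what is currently configured, the following standard Git commands print the value (a quick sanity check; nothing module-specific is assumed):

  git config --global user.name    # prints the globally configured Git username
  git config user.name             # run inside a repo: prints the effective value (a repo-level setting, if any, overrides the global one)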

Submitting Pull Requests as evidence of an LO

  1. Fork the repo to your personal GitHub account, if you haven't done so already.

  2. Create a branch named after the LO ID, e.g., W2.2b. Remember to switch to the master branch before creating the new branch. (A command sketch for these steps is given at the end of this section.)

  3. Commit your changes to that branch. Push to your fork.

  4. Create a Pull Request against the master branch of the repo https://github.com/nus-cs2103-AY1819S1/{repo_name}
    e.g. https://github.com/nus-cs2103-AY1819S1/addressbook-level2 (do not create PRs against the upstream repo at se-edu org)

    PR name should be: [LO_ID][TEAM_ID]Your Name
    e.g. If you are in tutorial W09 (i.e. Wednesday 9am) and team 1, [W2.2b][W09-1]James Yong. Your Team ID can be found in this page. Note that our tutorial IDs are different from those shown in CORS/IVLE. Our tutorial IDs are given in the panel below.

Relevant: [Admin Tutorials → Tutorial Timetable ]

 

Our tutorials start in week 2 (even before CORS tutorial bidding is over), not in week 3 as in other modules. CS2103 (not CS2103T) students need to choose a temporary tutorial slot for the week 2 tutorial. We'll inform you of the procedure in due course.

Our tutorial IDs are different from CORS. Format: W09 means Wednesday 0900 and so on.

Module Tutorial ID (ID in CORS) Time Venue Tutors (contact details)
CS2103 W10 (T01) Wed 1000 COM1-B103 (ALL)* TBD
CS2103T W12 (T01) Wed 1200 COM1-0210 (SR10) TBD
CS2103 W13 (T02) Wed 1300 COM1-0210 (SR10) TBD
CS2103T W14 (T02) Wed 1400 COM1-0210 (SR10) TBD
CS2103T W16 (T03) Wed 1600 COM1-B103 (ALL) TBD
CS2103T W17 (T04) Wed 1700 COM1-B103 (ALL) TBD
CS2103T T09 (T06) Thu 0900 COM1-0210 (SR10) TBD
CS2103 T10 (T04) Thu 1000 COM1-0210 (SR10) TBD
CS2103T T12 (T07) Thu 1200 COM1-0210 (SR10) TBD
CS2103 T13 (T06) Thu 1300 COM1-0210 (SR10) TBD
CS2103T T16 (T08) Thu 1600 COM1-0210 (SR10) TBD
CS2103T F10 (T10) Fri 1000 COM1-0210 (SR10) TBD
CS2103 F11 (T09) Fri 1100 COM1-0210 (SR10) TBD

*ALL: Active Learning Room

  1. Check the 'Files Changed' tab on GitHub to confirm the PR contains intended changes only.

  2. If the content of the PR is not as you expected, you can fix those problems in your local repo, commit, and push those new commits to the fork. The PR content will update automatically to match new commits. Alternatively, you can close that PR and create a new one with the correct content.

  3. If your PR adapted/referred code from elsewhere (e.g. a stackoverflow post or a classmate's PR -- which is allowed, even encouraged), acknowledge the source in your PR description text. e.g. Some code adapted from #2431 (I followed the same technique for parsing user command)

  4. If the PR is not ready for review yet, add a comment Work in progress. When the PR is ready for review later, add a comment Ready for review. If there is no comment, we assume the PR is ready for review.
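
A minimal command sketch of the branch-commit-push steps above, assuming you have already cloned your fork of addressbook-level2 (the branch name and commit message are illustrative):

  git checkout master                          # start from the master branch
  git checkout -b W2.2b                        # create a branch named after the LO ID
  # ... make your changes, then stage and commit them ...
  git add .
  git commit -m "Add solution for LO W2.2b"
  git push origin W2.2b                        # push the branch to your fork

The pull request itself is then created on the GitHub page of the relevant nus-cs2103-AY1819S1 repo, using the PR name format given above.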

Organization setup

Please follow the organization/repo name format precisely; we use scripts to download your code, and if the names are different, our scripts will not be able to detect your work.

After receiving your team ID, one team member should do the following steps:

  • Create a GitHub organization with the following details:
    • Organization name : CS2103-AY1819S1-TEAM_ID. e.g.  CS2103-AY1819S1-W12-1
    • Plan:  Open Source ($0/month)
  • Add members to the organization:
    • Create a team called developers in your organization.
    • Add your team members to the developers team.

Repo setup

Only one team member:

  1. Fork Address Book Level 4 to your team org.
  2. Rename the forked repo as main. This repo (let's call it the team repo) is to be used as the repo for your project.
  3. Ensure the issue tracker of your team repo is enabled. Reason: our bots will be posting your weekly progress reports on the issue tracker of your team repo.
  4. Ensure your team members have the desired level of access to your team repo.
  5. Enable Travis CI for the team repo.
  6. Set up auto-publishing of docs. When set up correctly, your project website should be available via the URL https://cs2103-ay1819s1-{team-id}.github.io/main e.g., https://cs2103-ay1819s1-w13-1.github.io/main/. This also requires you to enable the GitHub Pages feature of your team repo and configure it to serve the website from the gh-pages branch.
  7. Create a team PR for us to track your project progress: i.e., create a PR from your team repo's master branch to the [nus-cs2103-AY1819S1/addressbook-level4] master branch. PR name: [Team ID] Product Name e.g., [T09-2] Contact List Pro. As you merge code to your team repo's master branch, this PR will auto-update to reflect how much your team's product has progressed. In the PR description, @mention the other team members so that they get notified when the tutor adds comments to the PR.

All team members:

  1. Watch the main repo (created above), i.e., go to the repo and click on the Watch button to subscribe to its activities.
  2. Fork the main repo to your personal GitHub account.
  3. Clone the fork to your Computer.
  4. Recommended: Set it up as an Intellij project (follow the instructions in the Developer Guide carefully).
  5. Set up the developer environment on your computer. You are recommended to use JDK 9 for AB-4 as some of the libraries used in AB-4 have not been updated to support Java 10 yet. JDK 9 can be downloaded from the Java Archive.

Note that some of our download scripts depend on the following folder paths. Please do not alter those paths in your project.

  • /src/main
  • /src/test
  • /docs

Workflow

Before you do any coding for the project,

  • Ensure you have set the Git username correctly (as explained in Appendix E) on all computers you use for coding.
  • Read our reuse policy (in Admin: Appendix B), in particular, how to give credit when you reuse code from the Internet or classmates.
 


 


Follow the forking workflow in your project up to v1.1. In particular,

  • Get team members to review PRs. A workflow without PR reviews is a risky workflow.
  • Do not merge PRs failing CI. After setting up Travis, the CI status of a PR is reported at the bottom of the PR page.

    If there is a failure, you can click on the Details link in the corresponding line to find out more about the failure. Once you figure out the cause of the failure, push a fix to the PR.
  • After setting up Netlify, you can use Netlify PR Preview to preview changes to documentation files, if the PR contains updates to documentation. To see the preview, click on the Details link in front of the reported Netlify status.

After completing v1.1, you can adjust process rigor to suit your team's pace, as explained below.

  • Reduce automated tests: Automated tests have benefits, but they can be a pain to write/maintain; GUI tests are especially hard to maintain because their behavior can sometimes depend on things such as the OS, screen resolution, etc.
    It is OK to get rid of some of the troublesome tests and rely more on manual testing instead. The fewer automated tests you have, the higher the risk of regressions; but that may be an acceptable trade-off under the circumstances if tests are slowing you down too much.
    There is no direct penalty for removing GUI tests. Also note our expectations on test code.

  • Reduce automated checks: You can also reduce the rigor of checkstyle checks to expedite PR processing.

  • Switch to a lighter workflow: While the forking workflow is the safest, it is also rather heavy. You can switch to a simpler workflow if the forking workflow is slowing you down. Refer to the textbook to find out more about alternative workflows: branching workflow, centralized workflow. However, we still recommend that you use PR reviews, at least for PRs affecting others' features.

You can also increase the rigor/safety of your workflow in the following ways:

  • Use GitHub's Protected Branches feature to protect your master branch against rogue PRs.
 
  • There is no requirement for a minimum coverage level. Note that in a production environment you are often required to have at least 90% of the code covered by tests. In this project, it can be less. The less coverage you have, the higher the risk of regression bugs, which will cost marks if not fixed before the final submission.
  • You must write some tests so that we can evaluate your ability to write tests.
  • How much of each type of testing should you do? We expect you to decide. You learned different types of testing and what they try to achieve. Based on that, you should decide how much of each type is required. Similarly, you can decide to what extent you want to automate tests, depending on the benefits and the effort required.
  • Applying TDD is optional. If you plan to test something, it is better to apply TDD because TDD ensures that you write functional code in a testable way. If you write the functional code first, you often find that it is hard to test because the code has low testability. (A small sketch of the test-first rhythm is given below.)
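
A minimal sketch of the test-first rhythm, assuming JUnit 4 (as used in AB-4) and a hypothetical StringUtil.capitalize method:

  import static org.junit.Assert.assertEquals;

  import org.junit.Test;

  public class StringUtilTest {
      // Step 1: write a failing test that pins down the intended behavior.
      @Test
      public void capitalize_lowerCaseWord_firstLetterUpperCased() {
          assertEquals("Hello", StringUtil.capitalize("hello"));
      }
      // Step 2: write just enough code in StringUtil to make this test pass, then refactor.
  }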
 

Project Management → Revision Control →

Forking Flow

In the forking workflow, the 'official' version of the software is kept in a remote repo designated as the 'main repo'. All team members fork the main repo and create pull requests from their forks to the main repo.

To illustrate how the workflow goes, let's assume Jean wants to fix a bug in the code. Here are the steps (a matching command sketch is given after the list):

  1. Jean creates a separate branch in her local repo and fixes the bug in that branch.
  2. Jean pushes the branch to her fork.
  3. Jean creates a pull request from that branch in her fork to the main repo.
  4. Other members review Jean’s pull request.
  5. If reviewers suggested any changes, Jean updates the PR accordingly.
  6. When reviewers are satisfied with the PR, one of the members (usually the team lead or a designated 'maintainer' of the main repo) merges the PR, which brings Jean’s code to the main repo.
  7. Other members, realizing there is new code in the upstream repo, sync their forks with the new upstream repo (i.e. the main repo). This is done by pulling the new code to their own local repo and pushing the updated code to their own fork.
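
A hypothetical command sketch of the same flow, assuming Jean's fork is the remote named origin and the main repo is the remote named upstream (URLs, repo names, and branch names are illustrative):

  # one-time setup: clone the fork and register the main repo as 'upstream'
  git clone https://github.com/jean/main.git
  cd main
  git remote add upstream https://github.com/CS2103-AY1819S1-W12-1/main.git

  # steps 1-2: fix the bug in a separate branch and push that branch to the fork
  git checkout -b fix-duplicate-person-bug
  git add .
  git commit -m "Fix duplicate person detection"
  git push origin fix-duplicate-person-bug
  # step 3: create the PR on GitHub, from the branch in the fork to the main repo's master branch

  # step 7: after the PR is merged, each member syncs their local repo and fork
  git checkout master
  git pull upstream master
  git push origin master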

Issue tracker setup

We recommend you configure the issue tracker of the main repo as follows:

  • Delete existing labels and add the following labels.
    💡 Issue type labels are useful from the beginning of the project. The other labels are needed only when you start implementing the features.

Issue type labels:

  • type.Epic : A big feature which can be broken down into smaller stories e.g. search
  • type.Story : A user story
  • type.Enhancement: An enhancement to an existing story
  • type.Task : Something that needs to be done, but is not a story, a bug, or an epic (e.g., move testing code into a new folder)
  • type.Bug : A bug

Status labels:

  • status.Ongoing : The issue is currently being worked on. note: remove this label before closing an issue.

Priority labels:

  • priority.High : Must do
  • priority.Medium : Nice to have
  • priority.Low : Unlikely to do

Bug Severity labels:

  • severity.Low : A flaw that is unlikely to affect normal operations of the product. Appears only in very rare situations and causes a minor inconvenience only.
  • severity.Medium : A flaw that causes occasional inconvenience to some users but they can continue to use the product.
  • severity.High : A flaw that affects most users and causes major problems for users. i.e., makes the product almost unusable for most users.
  • Create the following milestones: v1.0, v1.1, v1.2, v1.3, v1.4.

  • You may configure other project settings as you wish. e.g. more labels, more milestones

Project Schedule Tracking

In general, use the issue tracker (Milestones, Issues, PRs, Tags, Releases, and Labels) for assigning, scheduling, and tracking all noteworthy project tasks, including user stories. Update the issue tracker regularly to reflect the current status of the project. You can also use GitHub's Projects feature to manage the project, but keep it linked to the issue tracker as much as you can.

Using Issues:

During the initial stages (latest by the start of v1.2):

  • Record each of the user stories you plan to deliver as an issue in the issue tracker. e.g. Title: As a user I can add a deadline
    Description: ... so that I can keep track of my deadlines

  • Assign the type.* and priority.* labels to those issues.

  • Formalize the project plan by assigning relevant issues to the corresponding milestone.

From milestone v1.2:

  • Define project tasks as issues. When you start implementing a user story (or a feature), break it down into smaller tasks if necessary. Define reasonably sized, standalone tasks. Create an issue for each of those tasks so that they can be tracked. e.g.,

    • A typical task should be something one person can do in a few hours.

      • Bad (reasons: not a one-person task, not small enough): Write the Developer Guide
      • Good: Update class diagram in the Developer Guide for v1.4
    • There is no need to break things into VERY small tasks. Keep them as big as possible, but they should be no bigger than what you are going to assign a single person to do within a week. e.g.,

      • Bad: Implementing the parser (reason: too big).
      • Good: Implementing parser support for adding floating tasks
    • Do not track things taken for granted. e.g., push code to repo should not be a task to track. In the example given under the previous point, it is taken for granted that the owner will also (a) test the code and (b) push to the repo when it is ready. Those two need not be tracked as separate tasks.

    • Write a descriptive title for the issue. e.g. Add support for the 'undo' command to the parser

      • Omit redundant details. In some cases, the issue title is enough to describe the task. In that case, no need to repeat it in the issue description. There is no need for well-crafted and detailed descriptions for tasks. A minimal description is enough. Similarly, labels such as priority can be omitted if you think they don't help you.

  • Assign tasks (i.e., issues) to the corresponding team members using the assignees field. Normally, there should be some ongoing tasks and some pending tasks against each team member at any point.

  • Optionally, you can use the status.Ongoing label to indicate issues that are currently being worked on.

Using Milestones:

We recommend you do proper milestone management starting from v1.2. Given below are the conditions to satisfy for a milestone to be considered properly managed:

Planning a Milestone:

  • Issues assigned to the milestone, team members assigned to issues: Use GitHub milestones to indicate which issues are to be handled for which milestone by assigning issues to suitable milestones. Also make sure those issues are assigned to team members. Note that you can change the milestone plan along the way as necessary.

  • Deadline set for the milestones (in the GitHub milestone). Your internal milestones can be set earlier than the deadlines we have set, to give you a buffer (see the sample commands after this list).
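
A minimal sketch of how these two conditions can be met from the command line is given below, again assuming the GitHub CLI; the issue number, milestone number, username, and date are hypothetical examples (milestones can equally be managed through the GitHub web UI).

    # Assign an existing issue (e.g., #42) to a milestone and a team member
    gh issue edit 42 --milestone "v1.2" --add-assignee "jean-doe"

    # Set (or bring forward) a milestone's deadline via the GitHub REST API;
    # '3' is the milestone number that appears in the milestone's URL
    gh api --method PATCH "repos/{owner}/{repo}/milestones/3" -f due_on="2019-03-15T23:59:00Z"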

Wrapping up a Milestone:

  • A working product tagged with the correct tag (e.g., v1.2) is pushed to the main repo,
    or a product release is done on GitHub. A product release is optional for v1.2 but required from v1.3. Click here to see an example release. (Sample tagging and release commands are given after this list.)

  • All tests passing on Travis for the version tagged/released.

  • Milestone updated to match the product i.e. all issues completed and PRs merged for the milestone should be assigned to the milestone. Incomplete issues/PRs should be moved to a future milestone.

  • Milestone closed.

  • If necessary, future milestones are revised based on what you experienced in the current milestone, e.g., if you could not finish all issues assigned to the current milestone, it is a sign that you overestimated how much you can do in a week, which means you might want to reduce the issues assigned to future milestones to match that observation.
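
The tagging and release steps above can be done with commands roughly like the following. This is a sketch only: it assumes the tag is v1.3, the main repo is the upstream remote, the JAR path is an assumed example, and the GitHub CLI is available (a release can equally be created through the GitHub web UI).

    # Tag the commit that represents the milestone's working product, and push the tag
    git tag v1.3
    git push upstream v1.3

    # Optional for v1.2, required from v1.3: create a GitHub release for that tag,
    # attaching the product JAR (the path shown is an assumed example)
    gh release create v1.3 build/libs/addressbook.jar --title "v1.3" --notes "v1.3 product release"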



F: Handling teamwork issues

If your team is facing difficulties due to differences in skill/motivation/availability among team members,

  • First, do not expect everyone to have the same skill/motivation level as you. It is fine if someone wants to do less and has lower expectations of the module. That doesn't mean that person is a bad person. Everyone is entitled to have their own priorities.

  • Second, don't give up. It is unfortunate that your team ended up in this situation, but you can turn it into a good learning opportunity. You don't get an opportunity to save a sinking team every day 😃

  • Third, if you care about your grade and are willing to work for it, you need to take the initiative to turn the situation around, or else the whole team is going to suffer. Don't hesitate to take charge if the situation calls for it. By doing so, you'll be doing your team a favor. Be professional, kind, and courteous to your team members, but also be firm and assertive. It is your grade that is at stake. Don't worry about making a bad situation worse. You won't know until you try.

  • Finally, don't feel angry or 'wronged'. Teamwork problems are not uncommon in this module and we know how to grade so that you will not be penalized for others' low contribution. We can use Git to find exactly what others did. It's not your responsibility to get others to contribute.

Given below are some suggestions you can adopt if the project work is not going smoothly due to team issues. Note that the measures below can result in some team members doing more work than others and earning better project grades than others. That is still better than sinking the whole team together.

  • Redistribute the work: Stronger programmers in the team should take over the critical parts of the code.

  • Enforce a stricter integration workflow: Appoint an integrator (typically, the strongest programmer). His/her job is to maintain the integrated version of the code. He/she should not accept any code that breaks the existing product or is not up to the acceptable quality standard. It is up to the others to submit acceptable code to the integrator. Note that even if the integrator rejects your code unreasonably, you can still submit such 'rejected' code for grading; it can earn marks based on its quality.

If you have very unreliable or totally disengaged team members :

  • Re-allocate to others any mission-critical work allocated to that person so that such team members cannot bring down the entire team.
  • However, do not leave out such team members from project communications. Always keep them in the loop so that they can contribute any time they wish to.
  • Furthermore, evaluate them sincerely and fairly during peer evaluations so that they do get the grade their work deserves, no more, no less.
  • Be courteous to such team members too. Some folks have genuine problems that prevent them from contributing more, although they may not be able to tell you the reasons. Just do your best for the project and assume everyone else is doing their best too, although their best may be lower than yours.