Solutions for Practice Exam 1

Part 1

  1. (a) Correct. Only users or their representatives can determine the correctness of user requirements. It is essential, therefore, to include them in the inspections of the requirements.
    (b) Feasible. It must be possible to implement each requirement within the known capabilities and limitations of the system and its environment.
    (c) Necessary. Each requirement should trace back to something the customers really need or something that is required for conformance to an external requirement, an external interface, or a standard. If you cannot identify the origin, perhaps the requirement is an example of "gold plating" and is not really necessary.
    (d) Prioritized. Priority is a function of the value provided to the customer, the relative cost of implementation, and the relative technical risk associated with implementation. Industry normally uses three levels of priority. High priority means the requirement must be incorporated in the next product release. Medium priority means the requirement is necessary but it can be deferred to a later release if necessary. Low priority means it would be nice to have, but we realize it might have to be dropped if we have insufficient time or resources.
    (e) Unambiguous. The reader of a requirement statement should be able to draw only one interpretation of it. Also, multiple readers of a requirement should arrive at the same interpretation. Natural language is highly prone to ambiguity. Effective ways to reveal ambiguity include formal inspections of the requirements specifications, writing test cases from requirements, and creating user scenarios that illustrate the expected behavior of a specific portion of the product.
    (f) Verifiable. Requirements that are not consistent, feasible, or unambiguous also are not verifiable. Any requirement that says the product shall "support" something is not verifiable.
  2. (a) Complete. No requirements or necessary information should be missing. Completeness is also a desired characteristic of an individual requirement. Organize the requirements hierarchically to help reviewers understand the structure of the functionality described, so it will be easier for them to tell if something is missing. It is a good practice to use "TBD" ("to be determined") as a standard flag to highlight any gaps or doubts. Resolve all TBDs in a given set of requirements before you proceed with construction of that part of the product.
    (b) Consistent. Disagreements among requirements must be resolved before development can proceed. Be careful when modifying the requirements, as inconsistencies can slip in undetected after the requirements have been reviewed.
    (c) Modifiable. When a revision happens, it is necessary to maintain a history of the changes made to each requirement. This means that each requirement must be uniquely labeled and expressed separately from other requirements so that you can refer to it unambiguously.
    (d) Traceable. A traceable requirement can be linked to its source, which could be a higher-level system requirement, a use case, or a voice-of-the-customer statement. Traceable requirements are uniquely labeled and are written in a structured, fine-grained way, as opposed to large, narrative paragraphs or bullet lists.
  3. Specific: without ambiguity, using consistent terminology, simple and at the appropriate level of detail. Measurable: it is possible to verify that this requirement has been met. What tests must be performed, or what criteria must be met to verify that the requirement is met? Attainable: technically feasible. What is your professional judgement of the technical “do-ability” of the requirement? Realizable: realistic, given the resources. Do you have the staffing? Do you have the skill? Do you have access to the development infrastructure needed? Do you have access to the run-time infrastructure needed? Do you have enough time? Traceable: linked from its conception through its specification to its subsequent design, implementation and test.
  4. concurrency, availability, security, performance, fault-tolerance, application distribution & deployment, evolution and re-usability
  5. cost, ease of use, interoperability, portability and throughput
  6. Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design? (An easier-to-learn design is one whose operation can be learned by observing the object.) Efficiency: Once users have learned the design, how quickly can they perform tasks? (A more efficient design takes less time to accomplish a particular task.) Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency? Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors? Satisfaction: How pleasant is it to use the design?

Part 2

  1. High-quality software meets the needs of the users while being reliable, well-supported, maintainable, portable and easily integrated with other tools. Standards help ensure consistent quality, primarily at the level of the software development processes.
  2. Product quality is achieved through refinement; it takes place over a number of iterations. However, standards for software development are important in ensuring that the processes are properly followed. For example, ISO/IEC 12207:1995 makes sure that the life cycle processes are adhered to, ISO/IEC 15026 is about ensuring quality at system and software integrity levels, IEEE Std 1219-1992 covers software maintenance, and IEEE Std 1045-1992 covers software productivity metrics.
  3. ISO/IEC 9126 defines a two-part quality model: internal and external quality, and quality in use. Typically, internal quality is assessed by reviews of specification documents, by checking models, or by static analysis of source code. External quality refers to the properties of the software interacting with its environment. In contrast, quality in use refers to the quality perceived by an end user who executes the software product in a specific context.
  4. Implementing product standards is difficult in software engineering partly because product quality is gained through a process of refinement, so much of quality in software engineering is the study of processes. Product standards focus on two aspects of software products: (1) software product evaluation (ISO/IEC 14598) and (2) software packages: quality requirements and testing (ISO/IEC 12119:1994). Process standards, on the other hand, focus on (1) software life cycle processes (ISO/IEC 12207:1995), (2) system and software integrity levels (ISO/IEC 15026), (3) software maintenance (IEEE Std 1219-1992) and (4) software productivity metrics (IEEE Std 1045-1992).


Part 3

  1. The Capability Maturity Model (CMM) is a tool for objectively assessing the ability of an organization's business processes in diverse areas; for example in software engineering, systems engineering, project management, software maintenance, risk management, system acquisition, information technology (IT), services, business processes generally, and human capital management. The CMM has been used extensively worldwide in government offices, commerce, industry and software development organizations. There are five levels defined along the continuum of the CMM and, according to the SEI, "predictability, effectiveness, and control of an organization's software processes are believed to improve as the organization moves up these five levels."
    (1) Initial (chaotic, ad hoc, individual heroics) - the starting point for use of a new or undocumented repeat process.
    (2) Repeatable - the process is at least documented sufficiently such that repeating the same steps may be attempted.
    (3) Defined - the process is defined/confirmed as a standard business process, and decomposed to levels 0, 1 and 2 (the latter being Work Instructions).
    (4) Managed - the process is quantitatively managed in accordance with agreed-upon metrics.
    (5) Optimizing - process management includes deliberate process optimization/improvement.
  2. Level 1 - Initial (Chaotic): It is characteristic of processes at this level that they are (typically) undocumented and in a state of dynamic change, tending to be driven in an ad hoc, uncontrolled and reactive manner by users or events. This provides a chaotic or unstable environment for the processes.
    Level 2 - Repeatable: It is characteristic of processes at this level that some processes are repeatable, possibly with consistent results. Process discipline is unlikely to be rigorous, but where it exists it may help to ensure that existing processes are maintained during times of stress.
    Level 3 - Defined: It is characteristic of processes at this level that there are sets of defined and documented standard processes established and subject to some degree of improvement over time. These standard processes are in place (i.e., they are the AS-IS processes) and are used to establish consistency of process performance across the organization.
    Level 4 - Managed: It is characteristic of processes at this level that, using process metrics, management can effectively control the AS-IS process (e.g., for software development). In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications. Process capability is established from this level.
    Level 5 - Optimizing: It is characteristic of processes at this level that the focus is on continually improving process performance through both incremental and innovative technological changes/improvements.
  3. (a) Iterative software development, (b) Quality as an objective, (c) Continuous verification of quality, (d) Customer requirements, (e) Architecture-driven development, (f) Focus on teams, (g) Pair programming, (h) Tailoring with restrictions, (i) Configuration and change management, (j) Risk management
  4. XP is well suited to smaller projects because its approach of simple, fast development focuses on delivering high-quality software. XP supports quality on three fronts: first, XP requires that the customer is on site and involved in the iteration planning process. Secondly, the short iterations force the project team to develop functional releases at the end of each iteration that must pass acceptance testing. Thirdly, the focus on teams, together with the pair programming principle, forms a very effective team approach, another key strength of Extreme Programming.
  5. (1) Assuring an acceptable level of confidence that the software will conform to functional technical requirements. (2) Assuring an acceptable level of confidence that the software will conform to managerial scheduling and budgetary requirements. (3) Initiation and management of activities for the improvement and greater efficiency of software development and SQA activities.

Part 4

  1. In the field of computer science, an interface is a tool and concept that refers to a point of interaction between components, and is applicable at the level of both hardware and software. This allows a component, whether a piece of hardware such as a graphics card or a piece of software such as a web browser, to function independently while using interfaces to communicate with other components via an input/output system and an associated protocol. Modular programming, first of all, is a software design technique that increases the extent to which software is composed of separate, interchangeable components called modules, by breaking down program functions into modules, each of which accomplishes one function and contains everything necessary to accomplish it. Secondly, modules represent a separation of concerns, and improve maintainability by enforcing logical boundaries between components. Thirdly, modules are typically incorporated into the program through interfaces. A module interface expresses the elements that are provided and required by the module. The elements defined in the interface are detectable by other modules. The implementation contains the working code that corresponds to the elements declared in the interface.
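    To make the interface/implementation distinction concrete, here is a minimal sketch in Java; the ReportModule name and its single method are hypothetical and chosen only for illustration.

        // The interface expresses what the module provides to its clients.
        interface ReportModule {
            String renderReport(String title);
        }

        // The implementation contains the working code behind the interface.
        // Other modules depend only on ReportModule, not on this class.
        class PlainTextReportModule implements ReportModule {
            @Override
            public String renderReport(String title) {
                return "=== " + title + " ===";
            }
        }

    Client modules would program against ReportModule and never need to know which implementation sits behind it.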



  2. Data coupling occurs between two modules when data are passed by parameters using a simple argument list and every item in the list is used. An example of object-oriented data coupling is shown in the object message diagram below: it illustrates the messages sent between a Rental Agent object and a CarReservationService object to reserve a Car (the Car object is an argument passed in the messages shown in the diagram).
    [Figure: datacoupling.jpg - object message diagram of data coupling between the Rental Agent and CarReservationService objects]
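    Since the diagram itself is not reproduced here, a minimal Java sketch of the same idea follows; apart from the Rental Agent, CarReservationService and Car names taken from the text, the fields and method names are assumptions made for illustration.

        // Data coupling: the modules share only the data they need,
        // passed through a simple argument list.
        class Car {
            private final String vin;
            Car(String vin) { this.vin = vin; }
            String getVin() { return vin; }
        }

        class CarReservationService {
            // Receives only the Car it needs to act on - nothing more.
            boolean reserve(Car car) {
                System.out.println("Reserving car " + car.getVin());
                return true;
            }
        }

        class RentalAgent {
            private final CarReservationService service = new CarReservationService();

            // The agent passes the Car as a plain argument; the two classes are
            // coupled only through that data, not through shared globals or
            // knowledge of each other's internals.
            boolean reserveCar(Car car) {
                return service.reserve(car);
            }
        }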


Part 5

  1. (a) Faulty requirements definition; (b) Client-developer communication failures; (c) Deliberate deviations from software requirements; (d) Logical design errors; (e) Coding errors; (f) Non-compliance with documentation and coding instructions; (g) Shortcomings of the testing process; (h) User interface and procedure errors; (i) Documentation errors
  2. (1) Strive for consistency - Consistent sequences of actions should be required in similar situations; identical terminology should be used in prompts, menus, and help screens; and consistent commands should be employed throughout.
    (2) Enable frequent users to use shortcuts - As the frequency of use increases, so do the user's desires to reduce the number of interactions and to increase the pace of interaction. Abbreviations, function keys, hidden commands, and macro facilities are very helpful to an expert user.
    (3) Offer informative feedback - For every operator action, there should be some system feedback. For frequent and minor actions, the response can be modest, while for infrequent and major actions, the response should be more substantial.
    (4) Design dialog to yield closure - Sequences of actions should be organized into groups with a beginning, middle, and end. The informative feedback at the completion of a group of actions gives the operators the satisfaction of accomplishment, a sense of relief, the signal to drop contingency plans and options from their minds, and an indication that the way is clear to prepare for the next group of actions.
    (5) Offer simple error handling - As much as possible, design the system so the user cannot make a serious error. If an error is made, the system should be able to detect the error and offer simple, comprehensible mechanisms for handling the error.
    (6) Permit easy reversal of actions - This feature relieves anxiety, since the user knows that errors can be undone; it thus encourages exploration of unfamiliar options. The units of reversibility may be a single action, a data entry, or a complete group of actions (a minimal sketch of such an undo mechanism follows this list).
    (7) Support internal locus of control - Experienced operators strongly desire the sense that they are in charge of the system and that the system responds to their actions. Design the system to make users the initiators of actions rather than the responders.
    (8) Reduce short-term memory load - The limitation of human information processing in short-term memory requires that displays be kept simple, multiple page displays be consolidated, window-motion frequency be reduced, and sufficient training time be allotted for codes, mnemonics, and sequences of actions.
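    As an illustration of rule (6), here is a minimal, hypothetical Java sketch of an undo stack; the EditAction type and its fields are assumptions made for this example and not part of any particular system.

        import java.util.ArrayDeque;
        import java.util.Deque;

        // A hypothetical reversible edit: remembers enough state to undo itself.
        record EditAction(String field, String oldValue, String newValue) { }

        class UndoStack {
            private final Deque<EditAction> history = new ArrayDeque<>();

            // Record each action as it is performed.
            void push(EditAction action) {
                history.push(action);
            }

            // Reverse the most recent action, if any, and report what was restored.
            void undoLast() {
                EditAction last = history.poll();
                if (last != null) {
                    System.out.println("Restoring " + last.field() + " to " + last.oldValue());
                }
            }
        }

    A caller would push an EditAction each time a value changes and call undoLast() when the user asks to undo, so a mistaken entry is never more than one step from being reversed.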
  3. The two measurable attributes of quality are Simplicity and Modularity. Simplicity may be achieved when the design meets its objectives and has no extra embellishments. Simplicity may be measured by counting the number of paths through the program (control flow complexity), the number of data items shared (information flow complexity) and the number of different identifiers and operators (name space complexity). Secondly, modularity may be achieved when the different concerns within the design have been separated. Modularity may be measured by looking at how well the components of a module go together (cohesion) and how much different modules have to communicate (coupling).
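    As a rough illustration of the control flow count mentioned above, the following hypothetical Java sketch approximates that metric by counting decision points in a source string and adding one; a real tool would parse the code properly rather than use a regular expression.

        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        class ControlFlowComplexity {
            // Decision points that each add an independent path through the code.
            private static final Pattern DECISIONS =
                Pattern.compile("\\b(if|for|while|case|catch)\\b|&&|\\|\\|");

            // Rough control flow complexity: number of decision points + 1.
            static int estimate(String sourceCode) {
                Matcher m = DECISIONS.matcher(sourceCode);
                int decisions = 0;
                while (m.find()) {
                    decisions++;
                }
                return decisions + 1;
            }
        }

    For example, estimate("if (a && b) { x(); } else { y(); }") returns 3: one if, one &&, plus one.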
  4. Ideas such as reliability, complexity and usability are abstract notions of quality properties. The first step in turning them into measurables is to define some metrics. For example, reliability may be measured by the mean time between system failures, complexity may be measured by the information flow between modules, and usability may be measured by the time it takes for someone to learn a task.
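    For instance, the mean-time-between-failures metric mentioned above could be sketched in Java as follows; the operating hours and failure count are hypothetical inputs used only for illustration.

        class Reliability {
            // Mean time between failures: total operating time divided by
            // the number of failures observed in that period.
            static double meanTimeBetweenFailures(double totalOperatingHours, int failureCount) {
                if (failureCount == 0) {
                    return Double.POSITIVE_INFINITY; // no failures observed yet
                }
                return totalOperatingHours / failureCount;
            }
        }

    For example, meanTimeBetweenFailures(1000.0, 4) yields an MTBF of 250 hours.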