36-01209 Rev. A
Massachusetts Institute of Technology
Center for Space Research
Covers MM 8075.1 - DM02, DM04 and DM05
May 21, 1995
The AXAF-I CCD Imaging Spectrometer (ACIS) Science Instrument Software is being developed by the Massachusetts Institute of Technology, Center for Space Research (MIT-CSR) as part of the ACIS Digital Processor Assembly (DPA). The DPA resides on board the Advanced X-ray Astrophysics Facility - Imaging (AXAF-I). The DPA Science Instrument Software is responsible for acquiring and processing image data from the ACIS CCD Imaging Spectrometer and for transferring the processed data to the AXAF-I Command and Telemetry Unit (CTU), which is then responsible for sending the information to the ground.
The ACIS Science Instrument Software Development Plan details the major activities, schedules, resources and milestones for developing the ACIS Science Instrument Software.
This document applies to the Science Instrument Software provided with the ACIS Digital Processor Assembly. It does not provide information for the Ground Support Software (GSS), which is maintained separately as part of the Electrical Ground Support Equipment (EGSE).
This document supplies information applicable to the Software Development and Software Standards and Procedures portions of SMA01-P. This document conforms to the content and format requirements of MM 8075.1 - DM02, DM04 and DM05.
By mutual agreement, MSFC Software Management and Development Requirements Manual MM8075.1, which supersedes MA-001-006-2H, forms the basis for this plan.
Once delivered, this software will be operated and maintained as described in the ACIS Maintenance Specification and Operators Manual.
Part Number | Version | Title
MSFC MM 8075.1 | January 22, 1991 | MSFC Software Management and Development Requirements Manual
MIT-CSR 36-01201 | Rev. 01 | ACIS Program Management Plan
MIT-CSR 36-01208 | Rev. 02 | ACIS Science Instrument Software Management Plan
MIT-CSR 36-01212 | Rev. 01 | ACIS Software Quality Assurance Plan
- | - | ACIS Master Schedule
MIT-CSR Part # to be assigned | - | ACIS Science Instrument Software Maintenance Specification
MIT-CSR Part # to be assigned | - | ACIS Science Instrument Software Operators Manual
NASA Reference Publication 1319 | September, 1993 | Mongoose ASIC Microcontroller Programming Guide
ISBN 0-8053-5340-2 | Second Edition | Object-Oriented Analysis and Design with Applications, by Grady Booch. Benjamin/Cummings Publishing Co., Inc., 1994
ISBN 0-07-050814-3 | Third Edition | Software Engineering, A Practitioner's Approach, by Roger S. Pressman. McGraw-Hill, Inc., 1992
The ACIS Science Instrument Software will be developed using an iterative development cycle and Object-Oriented design (OOD) and implementation (OOP) methodologies. OOD methods are derived from modular design techniques, providing the additional capability to classify problem and design elements according to similarity of representation, function or interface. Object-Oriented implementation tools allow this classification to be incorporated directly into code. The use of OOD helps the developers map the problem, as expressed by the scientists, into the design of the system. Its use also helps identify and encapsulate the hardware and software interfaces of the system by allowing a direct mapping of the interface elements into corresponding software items. Different features of the system can be described using interactions between cooperating instances of these items.
The development cycle consists of an initial requirements stage, followed by iterations through analysis, design, implementation, and test stages. Periodic internal and external releases of documentation and software will be made to support incremental integration with the hardware, and incremental testing against requirements. All releases will be subject to Software Quality Assurance (SQA) audits. Once all of the Software Requirements have been addressed and tested on the hardware, the final Science Instrument Software will be released for formal Verification and Validation. Changes to the Science Instrument Software after release for Verification and Validation are strictly controlled as indicated in the ACIS Software Management Plan. Further development activity of the Science Instrument Software will be restricted to problems reported from Verification and Validation.
The following defines the major development, integration and testing activities:
The Analysis development activity involves refining the Software Requirements given the available interfaces and systems. The result of this process is a refined understanding of the problem, including a set of features and derived requirements needed to meet the higher level requirements.
The Design development activity involves defining, developing and documenting the software mechanisms which satisfy the Software Requirements, including the features and derived requirements resulting from the Analysis activity.
The Implementation development activity involves acquiring or implementing each software mechanism defined by the Design activity.
The Testing development activity involves running and comparing the implemented software with the design, the derived requirements and features, and the top level requirements.
The Science Instrument Software development requires a software development manager, lead software engineer and at least two other software development engineers. Each person on the team requires their own workstation, interconnected via a Local-Area-Network (LAN) to a large, shared file-system. Maintenance of the workstations, LAN and file-system will be coordinated with other ACIS development groups and MIT-CSR systems administration.
In order to support the Science Instrument Software development, the following activities are needed:
- Build Management
This activity involves maintaining the Science Instrument Software source control area and development/maintenance of the Science Instrument Software build scripts and makefiles. This activity also involves managing internal and external releases of the software and its documentation.
- Tool Management
This activity involves defining, acquiring or developing, evaluating and maintaining tools needed for the development of the Science Instrument Software. Such tools include: a source control system, version controlled build system, unit testing tools, documentation tools, etc.
- COTS Management
This activity involves researching Commercial or Free Off-The-Shelf software (COTS) which may address the development needs of the Science Instrument Software. This activity also involves the acquisition, evaluation and maintenance of identified software.
- Internal Design Reviews
This activity involves reviewing the Science Instrument Software design and the design of its components. This activity evaluates the design against the requirements, and the Software Management, Development and Quality Assurance Plans. It also reviews the design for any implementation or testing difficulties. The design documentation will also be reviewed for clarity and usability, as well as accuracy, completeness, and correctness.
- Internal Code Walk-through
This activity involves reviewing the implementation of the Science Instrument Software components. This activity evaluates the developed code against the design. It also reviews the implementation and comments for clarity and maintainability, as well as compliance to coding standards (refer to Section 10.5 on page 23).
In order to support COTS acquisition, evaluation and support, access from the workstations to the Internet will be available. Internet access also facilitates early software protocol testing with other groups involved with AXAF-I.
In order to support hardware integration tests, the Software Development Team will have access to breadboard and/or brassboard versions of the DPA hardware. The team also will have access to any Ground Support Equipment and Government Furnished Equipment required to use the DPA hardware, prior to the Implementation Phase of the software (see Section 5.5 on page 11).
This section defines the Science Instrument Software development activity flow in terms of time-periods between major software milestones. Since the software is being developed iteratively, each time-period is subdivided into the major activities listed in Section 3.1 on page 5. Unless otherwise noted, all of the described activities apply to the Software Development Team.
The development activities are grouped into coarse phases, oriented around the major software milestones. All of the major analysis, design, implementation and test activities occur during all but the initial and ending phases, but the emphasis on each activity changes over the course of the software development. The following table lists the major development phases and milestones:
TABLE 1. Development Phase to Milestone map
Phase Starting Ending
Requirements Phase SRR SWRA
Preliminary Design Phase SWRA SWPDR
Design Phase SWPDR SWCDR
Implementation Phase SWCDR SWTR
Test Phase SWTR FCI
Systems Integration/Support FCI AR
Prior to the Software Requirements Audit, most of the development effort will be spent preparing the Software Requirements Specification, and management and policy documentation.
The Software Management Plan, Development Plan, Requirements Specification, and Quality Assurance Plan are developed during this phase. All documents except the Software Quality Assurance Plan will be written by the Software Development Team. The Software Quality Assurance Group (SQA) will author the Software Quality Assurance Plan.
After the Software Requirements Audit, the Software Requirements are assumed to be stable. Work may continue on the requirements up to the Software Preliminary Design Review, after which the requirements will be placed under Configuration Control.
Between the Software Requirements Audit and Software Preliminary Design Review, most of the analysis effort will be spent developing and reviewing the Interface Control Documents (ICDs), and deriving the top level functions and requirements needed to satisfy the main Software Requirements. Information from this effort will be provided in the Preliminary Software Design Specification.
The Design effort at this stage of the development cycle will mostly involve establishing the main mechanisms of the software. By SWPDR, almost all of the major interfaces and software mechanisms will be identified and documented. The detailed operation and physical packaging of these mechanisms will have started, but will be in a fairly immature state. The results of this part of the design stage will be provided in the Preliminary Design Specification.
Between SWRA and SWPDR, most of the implementation work will involve building preliminary versions of the critical and high-risk software elements. All Commercial or Free Off-The-Shelf software (COTS) used in the Science Instrument Software will be identified, acquired, and evaluated.
Between SWRA and SWPDR, testing will involve the evaluation of all COTS to be used in the Science Instrument Software and any critical or high-risk software elements of the design. Results of these tests shall be made available to SQA for audit.
The results of this phase of the development will be incorporated into the Software Preliminary Design Specification and an initial version of the Software Fault Tolerance and Failure Modes and Effects Analysis Specification. SQA will author the Software Test Plan during this phase.
Between SWPDR and Software Critical Design Review, the Analysis effort will focus on how well the in-progress design meets the Software Requirements and users' needs. Refinement of the functions and derived requirements will be at a minimum.
Between SWPDR and SWCDR, the design effort will transition from top level mechanism definitions to the detailed design of each mechanism. The packaging of each source code item will be defined and work on build procedures and maintenance issues will be underway. During this stage, the detailed design of the units responsible for managing the software-to-hardware interfaces will be completed. The results of this part of the design will form the basis for the Software Detailed Design Specification (CODE-TO) and the Software Maintenance Specification. By SWCDR, all design documentation will be released.
By SWPDR, some of the critical design elements will have been informally prototyped. Between SWPDR and SWCDR, these items will be completed and reviewed. Implementation of the major software-to-hardware interfaces will start. This will allow gradual integration with the hardware as prototype and/or brassboard hardware becomes available.
During the time between SWPDR and SWCDR, most of the testing activity will concentrate on the unit-level tests of any complete or partially complete code. Inter-unit software integration tests will begin as soon as units become available and have been unit-tested. Gradual software-to-hardware interface testing will begin as soon as hardware and software become available. Regression unit testing will be performed as new features are added to existing code. The versions of the source being tested, the unit test software and report results will be identified.
Preliminary versions of the software with certain minimum testable features will be made available to SQA and MIT-CSR science for preliminary evaluation during this phase.
The results of this phase of the development will be incorporated into the Software Detailed Design Specification, the Software Maintenance Document and an updated version of the Software Failure Modes and Effects Analysis Specification. SQA will provide the Software Status/Problem Report Plan, the Software Verification Test Specification, and the Software/Systems Acceptance Test Specification.
Unit and integration test reports (including the versions of the software being tested) performed during this phase will be recorded electronically (unit name aliases) or physically (unit development folders) and made available to SQA for audit.
Initial information about the design, implementation and test of versions of system features will be recorded electronically (feature name aliases) or physically (feature folder) and made available to SQA for audit.
After Software Critical Design Review, the development's Analysis effort should be complete.
Between SWCDR and SWTR, most of the design effort will be in response to information provided during the implementation and test of the design.
Between SWCDR and SWTR, the implementation will complete all of the designed units. By SWTR, all required features will have been implemented.
Between SWCDR and SWTR, unit, software integration, and hardware integration tests will be performed as physical units are completed. Once all of the features have been incorporated, system level testing will be performed by the development team. At SWTR, a major internal release of the software will be made to Software Quality Control and the science branch in preparation for Verification and Validation testing.
During this phase, changes to the design will be provided as updates to the Software Detailed Design Specification and a preliminary version of the Software Operators Manual will be made available for evaluation. Software release notes will be provided for internal software releases. All unit, integration test code, documentation and reports will be recorded electronically (appropriately aliased to the unit name) or physically (in a unit development folder) and made available to SQA for audit. Completions and/or partial completions of system features, including a list of participating units, test programs, scripts and data, shall be recorded electronically (appropriately aliased to a feature name) or physically (in a feature folder) and made available to SQA for audit.
During the Implementation Phase, several internal releases will be made to allow incremental testing of features as they are added to the system. Releases will be made every 1 to 2 months during this phase. The criterion for a release is the completion of a feature of the system (such as a Science Mode), with the software elements unit tested, code reviewed, and integration tested. Subsequent internal releases may add functionality to previously provided units. If a unit is modified, its unit test will be repeated. Any modified interfaces or dependencies will also be retested prior to the subsequent release. Since we hope to automate the execution of most of the unit and integration tests, this re-testing process should not appreciably impact the development time. SQA will audit and monitor these internal releases.
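As an illustration of the automated re-testing idea above, the following C++ sketch (with a hypothetical unit and hypothetical test names, not taken from the ACIS design) shows a table-driven suite that can be re-run mechanically before each internal release:

```cpp
#include <cassert>
#include <cstdio>

// A unit check is any function returning true on success.
typedef bool (*UnitCheck)();

struct TestCase {
    const char* name;
    UnitCheck check;
};

// Example unit under test: a hypothetical XOR telemetry checksum helper.
unsigned char checksum(const unsigned char* data, int n) {
    unsigned char sum = 0;
    for (int i = 0; i < n; ++i) sum ^= data[i];
    return sum;
}

bool check_checksum_empty() { return checksum(nullptr, 0) == 0; }
bool check_checksum_xor() {
    unsigned char d[] = {0x0F, 0xF0};
    return checksum(d, 2) == 0xFF;
}

// Run every registered case; the failure count lets a build script
// reject a release when a regression appears.
int run_suite(const TestCase* cases, int n) {
    int failures = 0;
    for (int i = 0; i < n; ++i) {
        if (!cases[i].check()) {
            std::printf("FAIL: %s\n", cases[i].name);
            ++failures;
        }
    }
    return failures;
}
```

Because the suite is just a table of function pointers, re-running every unit test after a modification is a single call, which is what keeps the repeated testing from appreciably impacting development time.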
Between the Internal Software Release and the Preliminary Software Release, all development activity will focus on updating the design and test documentation and providing support and fixes during internal Verification and Validation testing. During this period, the Science Instrument Software will be tested as described in the Software Test Plan. Before the Preliminary Software Release is provided, all released software will be placed under configuration control.
After the Preliminary Software Release, all development activity will involve completing the Software Operators Manual, updating documentation and providing support for Systems Integration, Operations and Maintenance.
The documentation tree and requirements traceability for the specifications listed in the ACIS Science Instrument Software Management Plan are derived from those described in MM 8075.1, Section 1.2.2, Figure 1-2.
Figure 1 below illustrates the flow of requirements within the documentation from the ACIS Statement of Work down to the Operators Manual and Test Reports. TBD part number digits are indicated by an "x."
In order to reduce the development cost and increase reliability, the ACIS Science Instrument Software will use Commercial (and non-commercial) Off-The-Shelf Software (COTS) where needed and appropriate. This section describes the types of COTS expected to be used by the Science Instrument Software. If appropriate COTS cannot be found for the identified items, the required functions will be developed in-house. All selected COTS must be compatible with the Science Instrument Software's run-time environment, resource constraints and error handling and recovery techniques (i.e. COTS which attempt to terminate a program upon error are a bad choice).
The ACIS Science Instrument Software will use a commercially available real-time operating system (OS). Candidate operating systems must have the following features:
- Fixed-priority, preemptive task switcher
- Low-level interrupt handling capability (implied by the previous requirement)
- Task signalling, both from tasks and from interrupt handlers
- Inter-task message queues (or equivalent)
- Some memory management services
- Device-driver interface standards and/or hooks
In addition to the above list, the following features are desirable, but not essential:
- A stable, well-known vendor
- Unix-like library interfaces to the operating system
- OS simulation which runs on DECstations under Ultrix (the production software must run on a Mongoose)
- Object-Oriented Programming OS support (such as an OS interface class library)
- On-line support via the Internet
The ACIS Science Instrument Software may use commercially available class and function libraries. Candidate libraries may include the following types of operations:
- Collections (i.e. arrays, lists, trees, etc.)
- Error Detection and/or Correction Techniques (CRC, Reed-Solomon, etc.)
- Support functions not provided by OS (semaphores, message queues, etc.)
The ACIS Science Instrument Software will be developed using a small development team. Integration of internal Science Instrument Software units will be performed by this team. Science Instrument Software unit integration will be performed incrementally, as units are developed and unit tested.
Interfaces between units are specified and controlled using the Detailed Design Specification. First, the interfaces are defined or modified in the Detailed Design Specification. The specification is then distributed for review by the affected developers. After the design has been reviewed, the described functions are created or modified. The created or modified functions are then unit tested and integrated. Any given unit may contain more than one function, and therefore, may be unit tested and re-integrated more than once.
Integration tests will be developed using the Detailed Design Specification as a guide. Evaluation of the tests will be performed by the developers, under the direction of the lead software engineer.
This section describes the selection of the High Order Language used to develop the ACIS DPA Science Instrument Software. This section satisfies some of the requirements for the High Order Language Trade Study (DM05) described in MM 8075.1.
The ACIS Science Instrument Software will be developed using three languages: C++, C, and Assembly Language. C++ will be the primary implementation language; C and Assembler will be used as warranted by performance and interface constraints. C++ and C were selected from a list of candidate languages using the following criteria, largely driven by the design approach and the in-house expertise:
- In-house expertise
- Compatibility with design approach
- Safety
- Availability
- Performance
The considered languages were C++, Objective-C, Modula-3, Ada, and Pascal.
MIT-CSR has used C as the implementation language for previous flight instruments and has used C++ for ground support software, instrument support software, and science software. The ACIS Science Instrument Software development team has had experience with both languages.
The development team has had limited experience with Objective-C, Modula-3, and Pascal. No one on the development team has had any experience with Ada beyond comparisons made to other languages in design textbooks.
C++, Objective-C and Modula-3 are good languages with which to implement an Object-Oriented Design (OOD). Ada is not strictly an object-oriented language, but has certain features which can support an object-oriented design. Traditional non-object oriented languages, such as C and Pascal, can be used to implement an OOD, but tend to require much more work and are prone to more errors. Since C++ and Objective-C are super-sets of C, any non-object oriented aspects of the design can still be implemented in these languages. The designs of Modula-3 and Ada also easily handle non-object oriented types of operations.
Modula-3 is the safest language considered. It provides extensive type checking, array bounds checking, built-in exception handling, and many other safety features. These features, however, have a cost in terms of run-time performance (see Section 9.6 on page 17). Ada and Pascal are also considered safe languages.
C++ is a structured, strongly type-checked language and provides compile-time checks on a variety of operations. C++ is not a strictly safe language in that it does not provide array boundary checking, strict pointer checking or some of the other safety features of Modula-3. As such, C++ produces faster run-time code, but requires more rigor during code review and testing. C++ has much stronger encapsulation semantics (such as read-only methods that guarantee no modification of the underlying class) than does ANSI C.
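The read-only method semantics mentioned above can be sketched as follows; the class name is hypothetical, not from the ACIS design. A method declared `const` is guaranteed by the compiler not to modify the underlying object, so clients can be handed read-only access without exposing the data member itself:

```cpp
#include <cassert>

// Hypothetical event counter illustrating C++ const-correctness.
class EventCounter {
public:
    EventCounter() : count_(0) {}

    // Mutating method: may change the object's state.
    void record() { ++count_; }

    // Read-only method: declared const, so any attempt to modify
    // count_ inside this body would be a compile-time error.
    long count() const { return count_; }

private:
    long count_;  // encapsulated; reachable only through the methods above
};
```

The equivalent guarantee in ANSI C would rest entirely on convention, since nothing stops a caller from modifying a struct field directly.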
Objective-C, like C++, is not a completely safe language. Also, unlike C++, the object-oriented aspects of Objective-C are untyped. Messages can be sent at run-time to objects which don't understand the message. This is very flexible when performing operations on collections of anonymous objects, but can cause unexpected run-time errors, and is, therefore, likely to be more difficult to test.
C++ is the most widely used object-oriented language in the commercial software industry. C++ to C translators are available and C compilers are available for most common processor architectures. Several Computer-Aided Software Engineering (CASE) tools explicitly support the language, and access to various language support tools is growing.
An Objective-C to C translator and libraries are available from Stepstone Corp. An Objective-C compiler and some run-time libraries are also freely available from the Free-Software Foundation (gcc).
Modula-3 is freely available from Digital Equipment Corporation (DEC) via the Internet. It is provided in the form of a Modula-3 to C translator. DEC provides several tools and libraries, implemented in Modula-3 and C which support the language. Support for the language from vendors other than DEC is limited.
Ada is commonly used in the defense industry and is strongly supported for embedded systems.
Pascal is available for several processor architectures and from several vendors, although support for embedded systems seems limited.
C++ is the fastest of the OOP languages considered. Static method bindings are translated into direct function calls. Dynamic binding is accomplished using indirect function calls, possibly via an indexed table. C++ does not require nor provide any automatic garbage collection.
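The two binding styles described above can be sketched as follows (class names are illustrative only). A non-virtual call is bound at compile time and becomes a direct function call; a virtual call is dispatched through the vtable, an indirect call that selects the override at run time:

```cpp
#include <cassert>

class Packet {
public:
    virtual ~Packet() {}

    // Static binding: resolved at compile time to a direct call.
    int header() const { return 0xA5; }

    // Dynamic binding: resolved at run time via the vtable.
    virtual int length() const { return 0; }
};

class SciencePacket : public Packet {
public:
    // Override selected through the base-class pointer at run time.
    int length() const override { return 1024; }
};
```

Calling `length()` through a `Packet*` costs one indirect call; calling `header()` costs no more than the equivalent C function call, which is the performance property noted above.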
Modula-3's run-time checks and automatic garbage collection impose a penalty on its run-time performance. These features can be selectively disabled to improve performance at critical points, but doing so reduces the safety at those points and thus imposes a more complex testing methodology.
Objective-C message passing is dynamically bound. The binding mechanism uses a run-time table lookup sequence which tends to slow down the performance of the language. This effect can be minimized by implementing time-critical portions of code in C and accessing them using normal function calls.
The performance of Pascal and Ada was not explicitly researched.
The following matrix ranks each language from 1 to 5 in each category:
TABLE 2. High-Order Language Evaluation Matrix
Language In-house Design Safety Availability Performance Totals
C++ 4 4 3 4 4 19
Objective-C 2 4 2 3 2 13
Modula-3 2 4 4 2 2 14
Ada 1 3 4 3 3 14
Pascal 2 2 4 3 3 14
This section describes the Software Standards and Procedures for the ACIS Science Instrument Software. This section satisfies the requirements for the Software Standards and Procedures Plan (DM04) described in MM 8075.1.
The Software Standards and Procedures make the following assumptions about the ACIS Science Instrument Software development:
- The Science Instrument Software will be designed using an Object-Oriented Design approach
- The Science Instrument Software will be implemented using C++, C and Assembly Language
- The Mongoose (LT10181) will be the core processor of the Digital Processor Assembly (DPA)
- The Software Development Team will be small (i.e. less than 5 developers)
The ACIS Flight Hardware will be using the Mongoose ASIC Microcontroller for all on-board CPUs. Refer to NASA Reference Publication 1319 for a detailed description of this processor. The Mongoose is a Radiation-Hard Microcontroller, which executes a MIPS R3000 compatible instruction set. There are some architectural differences between the Mongoose and a standard R3000 (such as no hardware floating point unit in the Mongoose), but none of these differences affect the core instruction set of the processor. This allows a compiler built for the R3000 to be used for the Mongoose. The differences will, however, affect the low level aspects of an operating system, such as some interrupt assignments and virtual memory management (the Mongoose does not use the Translation-Look-Aside buffer). Additionally, the Mongoose has an on-chip DMA controller, two timers, one of which can be used as a Watchdog timer, a serial port for use when debugging, and some other features which assist hardware designers.
The ACIS Science Instrument Software will be designed and developed on DECstations and Sun SPARCstations, interconnected on a Local Area Network (LAN). The LAN will be connected to the MIT campus network, and to the Internet. MIT-CSR currently has both types of machines.
All machines will use the X-windows user interface. This allows programs to be run on one machine while the program's user interface is displayed on another machine. This capability allows users of DECstations to easily run programs on the SPARCs and vice versa.
Since the DPA will be using the Mongoose as its core processor, and since the Mongoose's instruction set is compatible with the DECstation processor, certain performance tests and integration tests can be performed on the DECstations before DPA hardware is available. The DECstations also provide access to several R3000 compilers and linkers. Unfortunately, Digital Equipment Corporation (DEC) is replacing the DECstations with Alpha-based machines; as a result, less new software is being developed for the DECstations. Since MIT-CSR is already using SPARCstations for other parts of the ACIS development, and the platform is supported by most Unix-based software vendors, the SPARCstation is a good platform for running the design and documentation tools.
Because the SPARCstations will be used to create design and documentation information while the DECstations will be used to compile, link and debug code, the Science Instrument Software development does not require binary file transfers between the two machine types. However, if test or maintenance software is developed to run on the SPARCstations, byte-ordering issues must be taken into account. The ACIS Science Instrument Software does not expect to use floating point; therefore, differences in floating-point representations between DECstations and SPARCstations will not be an issue.
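The byte-ordering concern above can be handled by always encoding multi-byte values in an explicit byte order. The following sketch (function names are illustrative, not from the ACIS design) writes and reads a 32-bit value in big-endian order, so the encoded stream is identical whether the producer is a little-endian DECstation or a big-endian SPARCstation:

```cpp
#include <cassert>
#include <cstdint>

// Write a 32-bit value into a buffer in big-endian ("network") order,
// byte by byte, independent of the host machine's native byte order.
void put_be32(uint8_t* buf, uint32_t v) {
    buf[0] = static_cast<uint8_t>(v >> 24);
    buf[1] = static_cast<uint8_t>(v >> 16);
    buf[2] = static_cast<uint8_t>(v >> 8);
    buf[3] = static_cast<uint8_t>(v);
}

// Read the value back; again, only shifts and ORs, never a raw
// pointer cast that would depend on host byte order.
uint32_t get_be32(const uint8_t* buf) {
    return (uint32_t(buf[0]) << 24) | (uint32_t(buf[1]) << 16) |
           (uint32_t(buf[2]) << 8)  |  uint32_t(buf[3]);
}
```

The key design choice is that no code ever casts a byte buffer to a wider integer type, which is the usual source of cross-platform byte-order bugs.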
The ACIS Science Instrument Software will be developed using the following software. Refer to Section 11.0 on page 26 for detailed descriptions of these tools:
TABLE 3. Software Tools

Framemaker (Desktop Publishing; alternates: Interleaf, Islandwrite; platforms: SPARC, HP-UX, Mac, MS-Win). Documentation. Many design tools support output to Framemaker formats, which makes documentation of the design much easier.

Rational Rose (CASE Tool; alternate: Cadre ObjectTeam; platform: SPARC). CASE. Rational Rose is an Object-Oriented Computer-Aided Software Engineering (CASE) Tool. It supports Booch notation (Grady Booch is the chief scientist at Rational). Current versions can output design diagrams and specifications to Framemaker. Newer versions can produce C++ header files and source code templates.

gcc (C++ compiler; alternates: cfront with BSO/Tasking tools, or the DECstation compiler; platforms: DEC or SPARC). Compiler. GCC is a freely distributable compiler from the Free Software Foundation. It is capable of compiling C++, Objective-C, and C programs. It can be built as a native compiler, or as a cross-compiler (i.e. makes transition to other development platforms easy). It will need some development support in the area of an ACIS run-time library.

gdb (Debugger; alternate: BSO/Tasking tools; platform: any). Source Debugger. GDB is a freely distributable source-code debugger which supports both C++ and C. It can be used as a remote debugger and can be modified to support various types of remote targets.

LSI ROM Monitor (HW Debug; alternate: BSO/Tasking tools; platform: target hardware). Monitor. The ROM Monitor needs to support breakpoints, register reads/writes and memory reads/writes. It must be compatible with the selected source-level debugger.
The Science Instrument Software will be developed on an interactive Local Area Network (LAN). The network will provide easy access to the Internet.
In order to ensure initial access to currently available design and documentation tools, the development team will use Sun SPARCstations, running SunOS, for part of the development effort. As of this writing, all major software purchases shall be for the SPARCs. Migration from SPARCstations to other platforms during the development process will be allowed, given changes in the supported tool set.
Since the DPA uses an R3000 compatible processor, DECstations will be used for the implementation and unit and inter-unit integration testing. This provides a number of currently available low level development and performance measurement tools. Formal hardware integration testing will be performed on DPA breadboard and brassboard hardware, using TRW supplied command and telemetry unit interface simulators. This equipment must be available prior to the Testing Phase, which starts with the Software Test Review. Final integration testing will be performed on the DPA flight hardware.
The Software Standard and Procedures specified in this section apply only to the in-house development of the ACIS Science Instrument Software which resides on the Digital Processor Assembly (DPA). These standards also apply to all software device drivers developed in-house. These standards do not apply to unmodified COTS.
The ACIS Science Instrument Software will be developed using an Object-Oriented Design approach, as specified in "Object-Oriented Analysis and Design" by Grady Booch. Typically, Object-Oriented designs enforce qualities such as modularity and encapsulation (information hiding), although the misuse of class inheritance can counteract these qualities. Object-Oriented designs also clearly define the scope of individual class tests. Integration tests can be identified in the design by inter-class and inheritance relationships. Systems tests, however, should be driven by the requirements and interface analysis results, rather than by the design.
Typically, inheritance between two classes should only be used when the child class is a refinement of the higher level class (or classes). The "rule of thumb" is that, if the sentence, "A is a kind of B," makes sense and is clear, then class "A" can be a subclass of "B". If the sentence, "A is a part of B," is clearer, then "B" contains "A." In this case, a "using" relationship should be used instead of an inheritance relationship. Booch provides more detail as to when to use these types of relationships.
Because changes to high level base classes affect all child classes, and may require changes to all of the child classes, deep inheritance trees (greater than about 7 ancestors) should be avoided. Typically, deep class hierarchies are useful when the classes, relationships and properties are extremely well understood and mature. Shallow hierarchies tend to re-use less code than deeper trees, but are much less volatile when changes are made to the higher level classes.
Given the limited amount of memory available in the DPA, and the strong desire to keep unrelated classes as decoupled as possible, memory allocation from a global memory heap after system initialization will be avoided. This minimizes the possibility that the behavior of one part of the system causes an out-of-memory failure in another, unrelated part of the system. The general rule is that all global memory allocations will be made during the system initialization stage, and all subsequent dynamic memory management will be provided by class-specific mechanisms (NOTE: C++ can support this behavior by providing mechanisms for class-specific new and delete operators).
In order to ensure that the design can meet temporal requirements, analysis models and prototyping will be performed to evaluate the performance capabilities of the design. The design must identify and characterize any time-critical classes and methods.
Informal and formal reviews of the design will be held to ensure that the design meets the requirements, conforms to the defined interfaces, and can be implemented, tested, and maintained. Informal reviews will be held on an ad-hoc basis by the software development team during analysis and design stages of any given part of the system. Formal internal design reviews will be held prior to the release of a given design document. The Preliminary Design Review (PDR) and Critical Design Review (CDR) comprise the formal external reviews and are defined by MSFC.
There are no documented coding style standards. The coding style of each developer will be subject to peer review and comment, and will be adjusted for clarity and maintainability.
Code naming conventions are defined in the Software Detailed Design
Source files shall contain at least the following:
- Brief description of the purpose of the file
- A change history of the source file
- A source control string identifying the source control file and version

Structure, enumeration, and union definitions shall contain at least the following:
- Brief description of the structure, enumeration or union
- Brief description of each member of the structure, enumeration or union

Class definitions shall contain at least the following:
- Brief description of the class and its purpose
- Brief description of each of the class member variables and functions

Global shared writable data and object declarations shall contain at least the following:
- Brief description of the item
- Brief description of the users of the item

Private data and object declarations shall contain at least a brief description of the item.

Member function implementations shall contain at least the following:
- Brief description of the member function and its purpose
- Brief description of the function's arguments and return values
- Brief description of all unusual and error conditions
- Brief description of the operation of the function (using PDL, structured English, or unstructured English as appropriate)
- Brief description of all local variables, excluding loop counters and temporary holding variables
- Brief description of each statement block (such as branches, loops, nested blocks)

In order to ensure testability, a single member function shall not have a McCabe's Complexity Index greater than 10 without justification. See "Software Engineering, A Practitioner's Approach" by Roger Pressman for techniques used to determine this index.
In order to minimize certain types of errors, enforce coding standards, and to ensure that the implementation can be maintained, developed source code shall undergo code walk-throughs by at least two software developers other than the code's author. These walk-throughs will be audited by Software Quality Assurance (SQA).
All implementations shall be unit tested, such that each line of source code is executed at least once. Unit test coverage and results will be documented, saved, and will be audited by SQA. When possible unit testing will be performed by writing a program or script to exercise and test the unit. The resulting program/script will be saved under version control and used to perform regression tests if and when the unit is modified. When a program or script cannot be used to test the unit, the testing will be performed by hand, and the steps used for the test, and results of the test will be documented and saved under version control and will be audited by SQA.
Each unit will be integrated after it has been unit tested. All major relationships and side-effects defined by the detailed design will be tested during unit integration. Test results will be documented by software development and audited by SQA. When possible, integration testing will be performed by writing a program or script to exercise and test the groups of units. The resulting program/script and any data needed for input or comparison will be saved under version control and used to perform regression tests if and when an interface or dependency between the units is modified. When a program or script cannot be used to perform the test, the testing will be performed by hand, and the steps used for the test, any data used for the test, and results of the test will be documented and saved under version control and will be audited by SQA.
Units (or groups of units) responsible for managing hardware interfaces will be tested on breadboard or brassboard DPA hardware after being unit tested. These tests will exercise each of the hardware features used by the given unit. Test results and the version of the hardware used will be documented and audited by SQA. When possible, hardware integration testing will be performed by writing a program or script to exercise and test interfaces. The resulting program/script and any data needed for input or comparison will be saved under version control and used to perform regression tests if and when an interface or dependency changes. When a program or script cannot be used to perform the test, the testing will be performed by hand, and the steps used for the test, any data used for the test, the version of the hardware tested, and results of the test will be documented and saved under version control and will be audited by SQA.
The ACIS software will be independently verified, prior to formal integration with instrument flight hardware. The verification procedures shall ensure the correct implementation of each feature described in the Software Requirements Specification. The detailed verification standards are described in the ACIS Software Test Plan.
The validation of the ACIS software will be performed as part of the overall instrument verification and validation process, and has no separate set of validation standards.
Documentation will be produced in accordance with the applicable Data Requirement specified as part of MM 8075.1 or, where not otherwise directed, to a published standard (such as IEEE).
Refer to the ACIS Science Instrument Software Management Plan, and the ACIS Configuration Management Plan for a description of configuration management standards.
This section describes the development tools the ACIS Science Instrument Software development team will use to design, implement and test the ACIS DPA Science Instrument Software.
Some of the tools described in this section are provided by the Free Software Foundation (FSF). FSF has been providing high-quality software, free-of-charge, for at least the last six years. Although these tools are available in source form and free-of-charge, support for these products is available from several vendors. If the need arises during the development of the Science Instrument Software, the development team will use the services of Cygnus Support (Appendix A provides further information about Cygnus Support).
All commercial and non-commercial software tools which directly affect the DPA Science Instrument Software image (such as a compiler) will be placed under version control prior to CDR. If source code is provided with a given tool, either each source module will be placed under version control, or an archive of the directories, files and documentation provided with the tool will be placed under version control.
Parts of the ACIS DPA Science Instrument Software will be designed using a Computer-Aided Software Engineering (CASE) design tool. Given that the design effort will be using an Object-Oriented Design (OOD) approach, the candidate tools must support OOD methodologies. The development team is expecting to use "Booch" notation for all design diagrams, and the candidate tools must support this notation.
Currently, two commercial tools are being considered:
- Cadre ObjectTeam
- Rational Rose/C++

Cadre ObjectTeam is a high-end CASE tool whose capabilities are being evaluated.

Rational Rose/C++ is less expensive than ObjectTeam. It is intended to be used as a design tool and supports "Booch" notation and methods (Grady Booch is a chief scientist at Rational). The latest version of the tool is capable of producing a variety of diagrams, C++ headers and member function templates, and design documentation information. MIT-CSR has had some experience using an earlier version of this tool which was capable of producing diagrams and specification templates, but did not provide explicit coding support. The X-windows/SPARCstation version started shipping in June/July 1994, but Windows versions have been available for some time.
The CASE tool selection is Rational Rose/C++.
The Science Instrument Software development team will use Revision Control System (RCS) for software version control. RCS is provided by the Free Software Foundation and has been used within MIT-CSR for science instrument software development for at least three years.
In addition to RCS, the development team will use Concurrent Version System (CVS) as a front-end to RCS. CVS extends source control management from one release directory to a hierarchy of directories. It also provides aliasing capabilities to assign a single name to collections of files and directories.
The Science Instrument Software development team will use either the standard version of "make" supplied with the development workstations (DECstation or SPARCstation), or will use GNU Make, provided by the Free Software Foundation. GNU Make has been in use at MIT-CSR for science software development for the last year. GNU Make provides explicit support for RCS.
The current selection is GNU Make.
The Science Instrument Software will be developed in C++, C and Assembler (see Section 9.0 on page 15). Two compiler approaches are being considered:
- GNU gcc C++/Objective-C/C compiler
- USL cfront C++ to C translator used with a TBS C compiler

The GNU C++ compiler is provided by the Free Software Foundation. This compiler has evolved since 1988. Originally, it existed as a straight C compiler. Currently, it is one of the most widely used C compilers on Unix. Over time, the compiler has been enhanced to support C++ and Objective-C. The compiler is provided in source form and can be built as either a native compiler, or as a cross-compiler. This compiler would be built to produce MIPS R3000 target files, and can be built to run on both the DECstations and SPARCstations. The output from this compiler can be linked to run on the DECstations and on the target Mongoose processor. This allows early performance measurements to be made on the DECstations without access to Mongoose-based hardware. This also makes unit testing and unit integration testing easier.

USL cfront is a C++ to C translator, originally developed by AT&T. cfront would be used to convert C++ code to C, which would then be compiled using a TBS compiler. Candidate compilers are:
- DECstation C compiler
- BSO/Tasking C compiler

The current choice is GNU gcc.
The Science Instrument Software will use a remote C++ or C source-level debugger in conjunction with a ROM Monitor attached to the DPA hardware. Candidate debuggers include:
- GNU gdb Source Code debugger - currently supports C, C++ and Modula-2
- BSO/Tasking C Source Debugger - supports C only

Candidate ROM Monitors include:
- LSI MIPS ROM Monitor
- Embedded Performance MON/LR33000 Symbolic
- BSO/Tasking ROM Monitor

The current choice of source code debugger is GNU gdb, and of ROM Monitor is the LSI MIPS ROM Monitor.
The bulk of the unit testing of DPA Science Instrument Software will take place on Ultrix-based DECstations. Unit test coverage will be maintained using a freely available coverage tool called GCT. On-target unit testing will be performed on items which could not be covered during the workstation-based tests. On-target testing will be performed using a source code debugger and ROM monitor (see Section 11.7 on page 28). Coverage of these tests will be determined manually.
Most unit integration testing will take place on the DECstations. Tests which cannot be performed on the DECstations will be performed on target hardware, using a source-level debugger and ROM monitor.
Hardware integration tests will be performed on target hardware using a source-level debugger and ROM monitor.
The use and selection of automated integration testing tools will be made between SWPDR and SWCDR.
The following information was extracted from the FSF support vendor list provided with the source distribution of the GNU C++ compiler, gcc, version 2.5.8.
Name: Cygnus Support
1937 Landings Drive
Mountain View, CA 94043 USA
+1 415 903 1400 voice
+1 415 903 0122 fax
Cambridge office:
1 Kendall Square, Cambridge, MA 02139
+1 617 494 1040
Cygnus Support offers warranty protection (service contracts) for a number of free software tools. For a fixed annual fee our customers receive binary and source distributions, mail and phone support, documentation and customization assistance on a variety of popular platforms.
At the time of this writing we offer support for a development package including (among other things) gcc, g++, gdb, and of course, GNU Emacs. We also offer support for a network facilities package including many of the Athena tools like Kerberos and Hesiod. However, the set of supported tools and platforms increases frequently, so contact us for the latest information.
For those who need on-site assistance, support is also available from our Cambridge office.
Annual Support starts at $3,000.