INTRODUCTION:
UNIT-III
LIFE-CYCLE PHASES
- Software is in a successful development process when there is a well-defined separation between its “research and development” activities and its “production” activities.
- Most software projects fail because of one of the following characteristics:
1) An overemphasis on research and development.
2) An overemphasis on production.
ENGINEERING AND PRODUCTION STAGES:
To achieve economies of scale and a higher return on investment, we must move toward a software manufacturing process driven by technological improvements in process automation and component-based development.
There are two stages in the software development process:
1) The engineering stage: less predictable, with smaller teams doing design and synthesis activities. This stage is decomposed into two distinct phases: inception and elaboration.
2) The production stage: more predictable, with larger teams doing construction, test, and deployment activities. This stage is also decomposed into two distinct phases: construction and transition.
These four phases of the life-cycle process map loosely onto the conceptual framework of the spiral model, as shown in the following figure.
- In the figure, the size of the spiral corresponds to the inertia of the project with respect to the breadth and depth of the artifacts that have been developed.
- This inertia manifests itself in maintaining artifact consistency, regression testing,
documentation, quality analyses, and configuration control.
- Increased inertia may have little, or at least very straightforward, impact on changing any given
discrete component or activity.
- However, the reaction time for accommodating major architectural changes, major requirements
changes, major planning shifts, or major organizational perturbations clearly increases in
subsequent phases.
INCEPTION PHASE:
The main goal of this phase is to achieve agreement among stakeholders on the life-cycle
objectives for the project.
PRIMARY OBJECTIVES
1) Establishing the project’s scope and boundary conditions
2) Distinguishing the critical use cases of the system and the primary scenarios of operation
3) Demonstrating at least one candidate architecture against some of the primary scenarios
4) Estimating cost and schedule for the entire project
5) Estimating potential risks
ESSENTIAL ACTIVITIES:
1) Formulating the scope of the project
2) Synthesizing the architecture
3) Planning and preparing a business case
ELABORATION PHASE
- It is the most critical phase among the four phases.
- Depending on the scope, size, risk, and freshness of the project, an executable architecture prototype is built in one or more iterations.
- While the process may accommodate changes at almost any time, the elaboration phase activities must ensure that the architecture, requirements, and plans are stable, and that the cost and schedule for the completion of the development can be predicted within an acceptable range.
PRIMARY OBJECTIVES
1) Baselining the architecture as rapidly as practical
2) Baselining the vision
3) Baselining a high-fidelity plan for the construction phase
4) Demonstrating that the baseline architecture will support the vision at a reasonable cost in a reasonable time.
ESSENTIAL ACTIVITIES
1) Elaborating the vision
2) Elaborating the process and infrastructure
3) Elaborating the architecture and selecting components
CONSTRUCTION PHASE
During this phase, all remaining components and application features are integrated into the application, and all features are thoroughly tested. Newly developed software is integrated wherever required.
- For large projects, parallel construction increments are generated.
PRIMARY OBJECTIVES
1) Minimizing development costs
2) Achieving adequate quality as rapidly as practical
3) Achieving useful versions (alpha, beta, and other releases) as rapidly as practical
ESSENTIAL ACTIVITIES
1) Resource management, control, and process optimization
2) Completing component development and testing against the evaluation criteria
3) Assessing product releases against the acceptance criteria of the vision
TRANSITION PHASE
The transition phase is entered when a product is mature enough to be deployed in the end-user domain. It includes the following activities:
1) Beta testing to validate the new system against user expectations
2) Beta testing and parallel operation relative to a legacy system it is replacing
3) Conversion of operational databases
4) Training of users and maintainers
PRIMARY OBJECTIVES
1) Achieving user self-supportability
2) Achieving stakeholder concurrence
3) Achieving a final product baseline as rapidly and cost-effectively as practical
ESSENTIAL ACTIVITIES
1) Synchronization and integration of concurrent construction increments into consistent deployment baselines
2) Deployment-specific engineering
3) Assessment of deployment baselines against the complete vision and the acceptance criteria in the requirement set.
Artifacts of the Process
- Conventional software projects focused on the sequential development of software artifacts:
- build the requirements,
- construct a design model traceable to the requirements, and
- compile and test the implementation for deployment.
- This process can work for small-scale, purely custom developments in which the design representation, implementation representation, and deployment representation are closely aligned.
- This approach does not work for most of today’s software systems, because they are complex and composed of numerous components: some custom, some reused, and some commercial products.
THE ARTIFACT SETS
To manage the development of a complete software system, we need to gather distinct collections of information, organized into artifact sets.
- A set represents a complete aspect of the system, whereas an artifact represents interrelated information that is developed and reviewed as a single entity.
- The artifacts of the process are organized into five sets:
1) Management 2) Requirements 3) Design
4) Implementation 5) Deployment
The management artifacts capture the information necessary to synchronize stakeholder expectations, whereas the remaining four sets are captured in rigorous notations that support automated analysis and browsing.
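As a purely illustrative aid (not part of the original notes, and with hypothetical artifact names), the five-set organization can be pictured as plain data, where each set is a complete aspect of the system and each named artifact is developed and reviewed as a single entity:

# Illustrative sketch only: the five artifact sets as plain data.
# The artifact names are hypothetical examples, not a prescribed inventory.
ARTIFACT_SETS = {
    "management":     ["business case", "work breakdown structure", "release specifications"],
    "requirements":   ["vision document", "requirements models"],
    "design":         ["design models", "architecture description"],
    "implementation": ["source code", "component executables", "stand-alone test cases"],
    "deployment":     ["installation scripts", "target-specific data", "user manual"],
}

def artifacts_in(set_name: str) -> list[str]:
    """Return the artifacts grouped under one set (a complete aspect of the system)."""
    return ARTIFACT_SETS[set_name]

print(artifacts_in("management"))   # the management set synchronizes stakeholder expectations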
THE MANAGEMENT SET
It captures the artifacts associated with process planning and execution. These artifacts use ad hoc notations, including text, graphics, or whatever representation is required to capture the “contracts” among:
- project personnel: project manager, architects, developers, testers, marketers, administrators
- stakeholders: funding authority, user, software project manager, organization manager, regulatory agency
and between project personnel and stakeholders.
Management artifacts are evaluated, assessed, and measured through a combination of:
1) Relevant stakeholder review.
2) Analysis of changes between the current version of the artifact and previous versions.
3) Major milestone demonstrations of the balance among all artifacts.
THE ENGINEERING SETS:
1) REQUIREMENT SET:
- The requirements set is the primary engineering context for evaluating the other three
engineering artifact sets and is the basis for test cases.
- Requirement artifacts are evaluated, assessed, and measured through a combination of:
1) Analysis of consistency with the release specifications of the management set.
2) Analysis of consistency between the vision and the requirement models.
3) Mapping against the design, implementation, and deployment sets to evaluate the consistency and completeness and the semantic balance between information in the different sets.
4) Analysis of changes between the current version of the artifacts and previous versions.
5) Subjective review of other dimensions of quality.
2) DESIGN SET:
- UML notations are used to engineer the design models for the solution.
- It contains various levels of abstraction and enough structural and behavioral information to
determine a bill of materials.
- Design model information can be clearly and, in many cases, automatically translated into a
subset of the implementation and deployment set artifacts.
The design set is evaluated, assessed, and measured through a combination of:
1) Analysis of the internal consistency and quality of the design model.
2) Analysis of consistency with the requirements models.
3) Translation into implementation and deployment sets and notations to evaluate the consistency and completeness and the semantic balance between information in the sets.
4) Analysis of changes between the current version of the design model and previous
versions.
5) Subjective review of other dimensions of quality.
3) IMPLEMENTATION SET:
- The implementation set includes source code that represents the tangible implementations of components and any executables necessary for stand-alone testing of components.
- Executables are the primitive parts needed to construct the end product, including custom components and the APIs of commercial components.
- Implementation set artifacts can also be translated into a subset of the deployment set.
Implementation sets are in human-readable formats and are evaluated, assessed, and measured through a combination of:
1) Analysis of consistency with the design models
2) Translation into deployment set notations to evaluate consistency and completeness among artifact sets.
3) Execution of stand-alone component test cases that automatically compare expected results with actual results.
4) Analysis of changes between the current version of the implementation set and previous versions.
5) Subjective review of other dimensions of quality.
4) DEPLOYMENT SET:
- It includes user deliverables and machine language notations, executable software, and the
build scripts, installation scripts, and executable target-specific data necessary to use the
product in its target environment.
Deployment sets are evaluated, assessed, and measured through a combination of:
1) Testing against the usage scenarios and quality attributes defined in the requirements set to evaluate the consistency and completeness and the semantic balance between information in the two sets.
2) Testing the partitioning, replication, and allocation strategies in mapping components of the implementation set to physical resources of the deployment system.
3) Testing against the defined usage scenarios in the user manual, such as installation, user-oriented dynamic reconfiguration, mainstream usage, and anomaly management.
4) Analysis of changes between the current version of the deployment set and previous versions.
5) Subjective review of other dimensions of quality.
Each artifact set uses different notations to capture the relevant artifacts.
1) Management set notations (ad hoc text, graphics, use case notation) capture the plans, process, objectives, and acceptance criteria.
2) Requirements notations (structured text and UML models) capture the engineering context and the operational concept.
3) Implementation notations (software languages) capture the building blocks of the solution in human-readable formats.
4) Deployment notations (executables and data files) capture the solution in machine-readable formats.
IMPLEMENTATION SET VERSUS DEPLOYMENT SET
- The structure of the information delivered to the user (and to the test organization) is very different from the structure of the source code implementation.
- Engineering decisions that have an impact on the quality of the deployment set but are relatively incomprehensible in the design and implementation sets include:
1) Dynamically reconfigurable parameters, such as buffer sizes, color palettes, number of servers, number of simultaneous clients, data files, and run-time parameters (see the sketch after this list).
2) Effects of compiler/link optimizations, such as space optimization versus speed optimization.
3) Performance under certain allocation strategies, such as centralized versus distributed, primary and shadow threads, and dynamic load balancing.
4) Virtual machine constraints, such as file descriptors, garbage collection, heap size, maximum record size, and disk file rotations.
5) Process-level concurrency issues, such as deadlocks and race conditions.
6) Platform-specific differences in performance or behavior.
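To make item 1 concrete, here is a minimal, purely hypothetical sketch (the section and parameter names are invented): values such as buffer sizes and client limits can live in a deployment-set data file, so the decision is invisible when reading only the implementation set.

from configparser import ConfigParser

# Deployment-set artifact: an installation-time data file, not source code.
# Section and parameter names here are hypothetical examples.
config = ConfigParser()
config.read_string("""
[runtime]
buffer_size = 65536
max_clients = 200
num_servers = 4
""")

# The implementation set just reads opaque values; the behavior and quality of
# the deployed system depend on what the installer chose, not on the code.
buffer_size = config.getint("runtime", "buffer_size")
max_clients = config.getint("runtime", "max_clients")
num_servers = config.getint("runtime", "num_servers")
print(f"servers={num_servers}, clients<={max_clients}, buffer={buffer_size} bytes")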
ARTIFACT EVOLUTION OVER THE LIFE CYCLE
- Each state of development represents a certain amount of precision in the final system description.
- Early in the life cycle, precision is low and the representation is generally high-level. Eventually, the precision of the representation is high and everything is specified in full detail.
- At any point in the life cycle, the five sets will be in different states of completeness. However, they should be at compatible levels of detail and reasonably traceable to one another.
- Performing detailed traceability and consistency analyses early in the life cycle (when precision is low and changes are frequent) usually has a low ROI.
Inception phase: The focus is mainly on critical requirements, usually with a secondary focus on an initial deployment view, little implementation, and a high-level focus on the design architecture (but not on design detail).
Elaboration phase: This phase includes generation of an executable prototype and involves subsets of development in all four engineering sets. A portion of all four sets must be evolved to some level of completion before an architecture baseline can be established.
Fig: Life-cycle evolution of the artifact sets
Construction phase: The main focus is on design and implementation. Early in construction, the emphasis is on the depth of the design artifacts; later, it shifts to realizing the design in source code and individually tested components. This phase should drive the requirements, design, and implementation sets almost to completion. Substantial work is also done on the deployment set, at least to test one or a few instances of the programmed system through alpha or beta releases.
Transition phase: The main focus is on achieving consistency and completeness of the deployment set in the context of the other sets. Residual defects are resolved, and feedback from alpha, beta, and system testing is incorporated.
TEST ARTIFACTS:
Testing refers to the explicit evaluation, through execution, of deployment set components under a controlled scenario with an expected and objective outcome.
- Conventionally, the same document-driven approach that was applied to software development was also followed by software test teams.
- Development teams built requirements documents, top-level design documents, and detailed
design documents before constructing any source files or executable files.
- In the same way test teams built system test plan documents, unit test plan documents, and
unit test procedure documents before building any test drivers, stubs, or instrumentation.
- This document-driven approach caused the same problems for the test activities that it did for
the development activities.
- One of the distinguishing tenets of a modern process is to use exactly the same sets, notations, and artifacts for the products of test activities as are used for product development.
- The test artifacts must be developed concurrently with the product, from inception through deployment; that is, testing is a full-life-cycle activity, not a late-life-cycle activity.
- The test artifacts are communicated, engineered, and developed within the same artifact sets
as the developed product.
- The test artifacts are implemented in programmable and repeatable formats, as software programs (see the sketch below).
- The test artifacts are documented in the same way that the product is documented.
- Developers of the test artifacts use the same tools, techniques, and training as the software
engineers developing the product.
- Testing is only one aspect of the evaluation workflow. Other aspects include inspection,
analysis, and demonstration.
- The success of a test can be determined by comparing the expected outcome to the actual outcome with well-defined mathematical precision.
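For illustration only, here is a minimal sketch of a test artifact written as a repeatable program. The component under test (parse_version) and its behavior are invented for the example, not taken from the notes; the point is that the expected outcome is stated in the program itself, so the comparison with the actual outcome is objective and repeatable.

import unittest

# Hypothetical component under test -- invented for this sketch.
def parse_version(text: str) -> tuple[int, int]:
    """Parse 'major.minor' into a pair of integers."""
    major, minor = text.split(".")
    return int(major), int(minor)

class ParseVersionTest(unittest.TestCase):
    """A test artifact as a program: expected vs. actual outcome."""

    def test_expected_equals_actual(self):
        # The expected outcome is defined up front, so success is objective.
        self.assertEqual(parse_version("2.11"), (2, 11))

    def test_rejects_malformed_input(self):
        # Anomalous input is also part of the controlled scenario.
        with self.assertRaises(ValueError):
            parse_version("not-a-version")

if __name__ == "__main__":
    unittest.main()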
MANAGEMENT ARTIFACTS:
• Development of a WBS depends on the project management style, organizational culture, customer preference, financial constraints, and several project-specific parameters.
• The WBS is the architecture of the project plan. It must encapsulate change and evolve with an appropriate level of detail.
• A WBS is simply a hierarchy of elements that decomposes the project plan into discrete work tasks.
• A WBS provides the following information structure:
• - A delineation of all significant tasks.
• - A clear task decomposition for the assignment of responsibilities.
• - A framework for scheduling, budgeting, and expenditure tracking.
• Most systems have a first-level decomposition by subsystem; subsystems are then decomposed into their components.
• The WBS is therefore a driving vehicle for budgeting and collecting costs (see the sketch below).
• The structure of cost accountability is a serious project planning constraint.
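As a purely illustrative sketch (the element names and budget figures are invented), a WBS can be modeled as a tree whose leaves carry direct task budgets and whose interior nodes roll costs up the hierarchy for budgeting and expenditure tracking:

from dataclasses import dataclass, field

@dataclass
class WBSElement:
    """One element of the WBS hierarchy (project, subsystem, or component)."""
    name: str
    budget: float = 0.0                              # cost booked directly to this task
    children: list["WBSElement"] = field(default_factory=list)

    def total_cost(self) -> float:
        """Roll direct costs up the hierarchy for expenditure tracking."""
        return self.budget + sum(c.total_cost() for c in self.children)

# First-level decomposition by subsystem; subsystems decompose into components.
project = WBSElement("Project plan", children=[
    WBSElement("Subsystem A", children=[
        WBSElement("Component A1", budget=40_000),
        WBSElement("Component A2", budget=25_000),
    ]),
    WBSElement("Subsystem B", children=[
        WBSElement("Component B1", budget=60_000),
    ]),
])

print(f"Total budget: {project.total_cost():,.0f}")   # Total budget: 125,000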
Business case:
Change management:
• Managing change is one of the fundamental primitives of an iterative development process.
• This flexibility increases the content, quality, and number of iterations that a project can
achieve within a given schedule.
• Once software is placed in a controlled baseline, all changes must be formally tracked and
managed.
• Most change management activities can be automated through online data entry and online maintenance of change records (a minimal sketch follows).
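Purely as an illustration (the fields, states, and identifiers below are invented, not taken from any specific tool), a change record maintained online might look like:

from dataclasses import dataclass

@dataclass
class ChangeRecord:
    """A formally tracked change against a controlled baseline (hypothetical schema)."""
    change_id: int
    baseline: str             # the controlled baseline the change applies to
    description: str
    state: str = "proposed"   # e.g. proposed -> approved -> resolved

    def transition(self, new_state: str) -> None:
        # Every state change is recorded rather than edited in place,
        # so the record itself is the audit trail.
        print(f"CR-{self.change_id}: {self.state} -> {new_state}")
        self.state = new_state

cr = ChangeRecord(42, baseline="release-1.0", description="Fix installer path")
cr.transition("approved")
cr.transition("resolved")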