Summary

Specify how measurement data are analyzed and communicated.

Description

Specifying analysis procedures in advance ensures that appropriate analyses will be conducted and reported to address the documented measurement objectives (and thereby the information needs and objectives on which they are based). This approach also provides a check that the necessary data will, in fact, be collected. Analysis procedures should account for the quality (e.g., age, reliability) of all data that enter into an analysis, whether the data come from the project, the organization's measurement repository, or another source. The quality of the data should be considered both when selecting the analysis procedure and when evaluating the results of the analysis.
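
For example, a data-quality screen can be written directly into the analysis procedure so that it is applied consistently. The following minimal Python sketch is illustrative only: the record fields (collected_on, reliability) and both thresholds are assumptions, not part of the practice.

    from datetime import date, timedelta

    # Hypothetical quality screen applied before analysis; the field names
    # (collected_on, reliability) and both thresholds are assumptions.
    MAX_AGE = timedelta(days=90)   # measurements older than this are excluded
    MIN_RELIABILITY = 0.8          # sources rated below this are excluded

    def screen(records, today=None):
        """Partition records into usable data and exclusions, with reasons."""
        today = today or date.today()
        usable, excluded = [], []
        for record in records:
            if today - record["collected_on"] > MAX_AGE:
                excluded.append((record, "data too old"))
            elif record["reliability"] < MIN_RELIABILITY:
                excluded.append((record, "source reliability below minimum"))
            else:
                usable.append(record)
        return usable, excluded

A procedure written this way also yields a record of what was excluded and why, which supports the evaluation criteria discussed under subpractice 6.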

Example Work Products



  1. Analysis specifications and procedures
  2. Data analysis tools


Subpractices



1. Specify and prioritize the analyses to be conducted and the reports to be prepared.

Pay early attention to the analyses to be conducted and to the manner in which results will be reported.

These analyses and reports should meet the following criteria:

  • The analyses explicitly address the documented measurement objectives.
  • Presentation of results is clearly understandable by the audiences to whom the results are addressed.


Priorities may need to be set within the available resources.



2. Select appropriate data analysis methods and tools.


Issues to be considered typically include the following:
  • Choice of visual display and other presentation techniques (e.g., pie charts, bar charts, histograms, radar charts, line graphs, scatter plots, tables)
  • Choice of appropriate descriptive statistics (e.g., arithmetic mean, median, mode)
  • Decisions about statistical sampling criteria when it is impossible or unnecessary to examine every data element
  • Decisions about how to handle analysis in the presence of missing data elements (one approach, together with sampling, is sketched after this list)
  • Selection of appropriate analysis tools
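
As an illustration of the last three issues, the following minimal Python sketch reports a missing-data rate, applies one defensible handling choice (listwise deletion), and draws a repeatable random sample; the column names and values are invented for the example.

    import pandas as pd

    # Illustrative only: the column names and values are invented.
    df = pd.DataFrame({
        "effort_hours": [40, 35, None, 52, 48, 38, None, 44],
        "defects":      [3, 2, 4, 7, 5, 2, 3, 4],
    })

    # Missing data: measure and report the gap rather than silently dropping it.
    missing_rate = df["effort_hours"].isna().mean()
    print(f"effort_hours missing: {missing_rate:.0%}")

    # One defensible handling choice is listwise deletion; imputation is another.
    complete = df.dropna(subset=["effort_hours"])

    # Sampling: when examining every data element is impossible or unnecessary,
    # draw a random sample with a fixed seed so the analysis is repeatable.
    sample = complete.sample(n=min(5, len(complete)), random_state=1)
    print(sample.describe())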



Descriptive statistics are typically used in data analysis to do the following (a brief sketch follows the list):
  • Examine distributions of specified measures (e.g., central tendency, extent of variation, data points exhibiting unusual variation)
  • Examine interrelationships among specified measures (e.g., comparisons of defects by phase of the product’s lifecycle, comparisons of defects by product component)
  • Display changes over time
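
A minimal Python sketch of these uses, with invented defect counts per build; the two-standard-deviation screen is a simplification of the control-chart techniques referenced below:

    import statistics

    # Hypothetical defect counts per build; the values are invented.
    defects = [3, 5, 2, 11, 4, 3, 9, 4, 3, 5]

    # Central tendency
    print("mean:  ", statistics.mean(defects))
    print("median:", statistics.median(defects))
    print("mode:  ", statistics.mode(defects))

    # Extent of variation
    mu = statistics.mean(defects)
    sigma = statistics.stdev(defects)
    print("stdev: ", sigma)

    # Crude screen for data points exhibiting unusual variation (more than
    # two sample standard deviations from the mean); control charts are the
    # rigorous tool for this (see the QPM reference below).
    print("unusual:", [d for d in defects if abs(d - mu) > 2 * sigma])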


Refer to the Select Measures and Analytic Techniques and Monitor the Performance of Selected Subprocesses specific practices in the Quantitative Project Management (QPM) process area of CMMI-DEV for more information about the appropriate use of statistical techniques and understanding variation.



3. Specify administrative procedures for analyzing data and communicating results.


Issues to be considered typically include the following:
  • Identifying the persons and groups responsible for analyzing the data and presenting the results
  • Determining the timeline required to analyze the data and present the results
  • Determining the venues for communicating the results (e.g., progress reports, transmittal memos, written reports, staff meetings)



4. Review and update the proposed content and format of specified analyses and reports.

All of the proposed content and format are subject to review and revision, including analytic methods and tools, administrative procedures, and priorities. Relevant stakeholders consulted should include end users, sponsors, data analysts, and data providers.



5. Update measures and measurement objectives as necessary.

Just as measurement needs drive data analysis, clarification of analysis criteria can affect measurement. Specifications for some measures may be refined further based on the specifications established for data analysis procedures. Other measures may prove unnecessary, or a need for additional measures may be recognized.

Specifying how measures will be analyzed and reported can also suggest the need for refining measurement objectives themselves.



6. Specify criteria for evaluating the utility of analysis results and for evaluating the conduct of measurement and analysis activities.


Criteria for evaluating the utility of the analysis might address the extent to which the following apply:
  • The results are provided in a timely manner, are understandable, and are used for decision making.
  • The work does not cost more to perform than is justified by the benefits it provides.



Criteria for evaluating the conduct of the measurement and analysis might include the extent to which the following apply (checks for the first and last items are sketched after the list):
  • The amount of missing data or the number of flagged inconsistencies is beyond specified thresholds.
  • There is selection bias in sampling (e.g., only satisfied end users are surveyed to evaluate end-user satisfaction, only unsuccessful projects are evaluated to determine overall productivity).
  • Measurement data are repeatable (e.g., statistically reliable).
  • Statistical assumptions have been satisfied (e.g., about the distribution of data, about appropriate measurement scales).
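
As an illustration, the following minimal Python sketch checks two of these criteria: a missing-data threshold and a distributional assumption. The threshold, the data, and the choice of the Shapiro-Wilk test are assumptions made for the example.

    from scipy import stats

    # Illustrative checks only; the threshold, the data, and the choice of the
    # Shapiro-Wilk test are assumptions made for this example.
    MISSING_THRESHOLD = 0.10   # flag when more than 10% of values are missing

    values = [4.1, 3.8, None, 4.4, 5.0, None, 3.9, 4.2, 4.6, 4.0]
    present = [v for v in values if v is not None]

    missing_rate = 1 - len(present) / len(values)
    if missing_rate > MISSING_THRESHOLD:
        print(f"WARNING: {missing_rate:.0%} missing exceeds the threshold")

    # Test a distributional assumption (here, normality) before applying
    # statistics that depend on it; a small p-value suggests a violation.
    statistic, p_value = stats.shapiro(present)
    print(f"Shapiro-Wilk p-value: {p_value:.3f}")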