Team Foundation Server 2012 Reporting
Monitor / Operate
Working software in production
Value realization
Reporting data flow: Operational Stores → Warehouse Adapters → Warehouse Relational Database → Analysis Services Cube → Report Designer Reports / Excel Reports
Excel Reports: Project Management, Bug Backlog Management, Build Management, Test Management
Excel Report Generation: create reports in Excel from work item queries
Dashboards quickly surface important information about team projects: they show project data, support investigation, and help teams perform common tasks more quickly.
Dashboard | Windows SharePoint Services 3.0 | SharePoint Server Standard | SharePoint Server Enterprise
My        |                X                |             X              |              X
Project   |                                 |             X              |              X
Progress  |                                 |                            |              X
Quality   |                                 |                            |              X
Test      |                                 |                            |              X
Bugs      |                                 |                            |              X
Build     |                                 |                            |              X
What is the next set of Tasks, Bugs, or Test Cases that I should act on?
What is the status of the team's most recent builds?
Is the team likely to finish the iteration on time?
Will the team complete the planned work based on the current burn rate?
What were the most recent check-ins?


[Project dashboard charts: Burndown, Burn Rate, Work Item Breakdown]
Is the team likely to finish the iteration on time?
Will the team complete the planned work based on the current burndown?
How much progress has the team made on implementing user stories in the past four weeks?
How quickly is the team identifying and closing Issues?
What were the most recent check-ins?
Is the test effort on track?
Is the team testing the appropriate functionality?
Are the team's bug fixes of high quality?
Are tests stale?
Does the team have sufficient tests?
Are any bottlenecks occurring?
Is the authoring of Test Cases on track?
Has the team defined Test Cases for all User Stories?
What are the proportions of Test Cases that are passing, failing, and blocked?
Do test failure metrics indicate a problem that requires further investigation?
What is the status of last night's build?
What are the most recent check-ins?
How quickly is the team resolving and closing bugs?
Is the team fixing bugs quickly enough to finish on time?
How many bugs is the team reporting, resolving, and closing per day?
Is the team resolving priority 1 bugs before priority 2 and 3 bugs?
Does any team member have a backlog of priority 1 bugs that warrant redistribution?
How volatile is the code base?
How much of the code is the team testing?
How high is the quality of the builds?
Is the quality increasing, decreasing, or staying constant?
Which builds succeeded?
Which builds have a significant number of changes to the code?
Hours of Work | # of Work Items
Fact/Fact Table
Attribute
Dimension
KPIs
Measure
Measure Group
Perspective
Data sources: Relational DB, OLAP Cube, Work item query results
Reporting tools: Excel, Project or Project Server, Report Builder, Report Designer
Use Familiar Tools: Excel, SharePoint, Reporting Services, Others
Get the Visibility You Want: many included reports; customize or create your own
Thank You
Steve Lange
Sr. Developer Technology Specialist

stevenl@microsoft.com
@stevelange | http://slange.me
Figure 1: Magic Quadrant for Application Lifecycle Management

Editor's Notes

  1. Much of the content here is derived from published documentation on TFS reporting in the MSDN Library: http://msdn.microsoft.com/en-us/library/bb649552.aspx
  2. http://msdn.microsoft.com/en-us/library/ms244687.aspx
  3. Each tool or plug-in in Team Foundation uses a relational database in SQL Server 2008 to store the data used by the tool in its day-to-day operations. This relational database is often referred to as the operational store. The operational stores for Team Foundation include common structure databases (Tfs_Configuration) and team project collection databases (Tfs_Collection); you might also have operational stores created for third-party tools. Like most operational stores, the schema of the relational database is designed and optimized for the online transactional processing of data. As the tool or plug-in performs an activity, it writes the latest information to the operational store. Therefore, data in the operational store is constantly changing and being updated, and all data is current.
  4. Because each tool or plug-in has its own schema requirements and data is stored in the operational store to optimize transactional processing, the purpose of the warehouse adapter is to put the operational data into a form usable by the data warehouse. The warehouse adapter is a managed assembly that extracts the data from the operational store, transforms the data to a standardized format compatible with the warehouse, and writes the transformed data into the warehouse relational database. There is a separate adapter for each operational data store. The warehouse adapter copies and transforms those data fields specified in either the basic warehouse configuration or in the process template used at the time a new team project is created. If you subsequently change the process template to add or delete which data fields are written to the data warehouse, these changes are detected the next time the adapter is run. The adapter runs periodically with a frequency set by the RunIntervalSeconds property. The default setting for the refresh frequency is two hours (7,200 seconds), so give careful consideration to the appropriate refresh frequency for your installation. For more information about changing the refresh frequency, see How to: Change the Refresh Frequency. It is important that data is not written from the relational database to the data cube while the relational database is itself being updated from the operational store. To avoid conflicts reading and writing data, the warehouse adapters that push and pull the data are synchronized. After the adapters have completed their calls, the cube is reprocessed.
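To make that cadence concrete, here is a minimal, purely illustrative Python sketch of a periodic extract-transform-load loop driven by a RunIntervalSeconds-style setting. This is not the adapter's actual code (the real adapter is a managed assembly inside TFS), and the store and warehouse objects are invented stand-ins:

```python
import time

RUN_INTERVAL_SECONDS = 7200  # default refresh frequency: two hours

def extract(operational_store):
    # Pull the rows written to the operational store since the last run.
    return operational_store.get("pending", [])

def transform(rows):
    # Reshape tool-specific rows into the warehouse's standardized format.
    return [{"field": name, "value": value} for name, value in rows]

def load(warehouse, rows):
    # Write the standardized rows into the relational warehouse database.
    warehouse.extend(rows)

def run_adapter(operational_store, warehouse, runs=1, interval=RUN_INTERVAL_SECONDS):
    for i in range(runs):
        if i:
            time.sleep(interval)  # wait out the refresh interval between runs
        load(warehouse, transform(extract(operational_store)))
        # In TFS, adapters are synchronized and the cube is then reprocessed.

store = {"pending": [("Remaining Work", 8.0)]}
warehouse = []
run_adapter(store, warehouse)
print(warehouse)  # [{'field': 'Remaining Work', 'value': 8.0}]
```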
  5. Each tool describes its contribution to the data warehouse in an XML schema. The schema specifies the fields that are written to the relational database as dimensions, measures, and details. The schema is also mapped directly into the cube. The data in the warehouse is stored in a set of tables organized in a star schema. The central table of the star schema is called the fact table, and the related tables represent dimensions. Dimensions provide the means for disaggregating reports into smaller parts. A row in a fact table usually contains either the value of a measure or a foreign key reference to a dimension table. The row represents the current state of every item covered by the fact table. For example, the Work Item fact table has one row for every work item stored in the Work Item operational store. A dimension table stores the set of values that exist for a given dimension. Dimensions may be shared between different fact tables and cubes, or they may be referenced by a single fact table or data cube. A Person dimension, for example, will be referenced by the Work Items fact table for the Assigned To, Opened By, Resolved By, and Closed By properties, and it will be referenced by the Code Churn fact table for the Checked In By property. Measures are values taken from the operational data. For example, Total Churn is a measure that indicates the number of source code changes in the selected changesets. Count is a special measure in that it can be implicit, as long as there is one record for every item that is counted. The measures defined in a fact table form a measure group in the cube. For more information about the facts, dimensions, and measures in the data warehouse, see Perspectives and Measure Groups Provided in the Analysis Services Cube for Team System.
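As a deliberately tiny illustration of the star-schema idea, this Python sketch builds a one-fact-table, two-dimension warehouse in SQLite and disaggregates a measure by a dimension. The table and column names are invented for illustration and are not the actual TFS warehouse schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold the sets of values that reports can slice by.
cur.execute("CREATE TABLE dim_person (person_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, day TEXT)")

# The fact table has one row per work item: foreign keys point at the
# dimensions, and the numeric column (remaining_hours) is a measure.
cur.execute("""CREATE TABLE fact_work_item (
    work_item_id    INTEGER PRIMARY KEY,
    assigned_to     INTEGER REFERENCES dim_person(person_id),
    created_on      INTEGER REFERENCES dim_date(date_id),
    remaining_hours REAL)""")

cur.execute("INSERT INTO dim_person VALUES (1, 'Dana'), (2, 'Lee')")
cur.execute("INSERT INTO dim_date VALUES (1, '2012-10-01'), (2, '2012-10-02')")
cur.execute("INSERT INTO fact_work_item VALUES "
            "(101, 1, 1, 8.0), (102, 2, 1, 3.5), (103, 1, 2, 5.0)")

# Disaggregate a measure by a dimension: remaining hours per person.
for row in cur.execute("""SELECT p.name, SUM(f.remaining_hours)
                          FROM fact_work_item f
                          JOIN dim_person p ON p.person_id = f.assigned_to
                          GROUP BY p.name"""):
    print(row)  # ('Dana', 13.0) then ('Lee', 3.5)
```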
  6. Fact tables are a good source of information for reports that show the current state of affairs. However, to report on trends for data that changes over time, you need to duplicate the same data for each of the time increments that you want to report on. For example, to report on daily trends for work items or test results, the warehouse needs to retain the state of every item for each day. This allows the data cube to aggregate the measures by day. The cube aggregates both data from the underlying star schema and time data into multidimensional structures. Each time the data cube is processed, the data stored in the star schemas in the relational database is pulled into the cube, aggregated, and stored. The data in the cube is aggregated so that high-level reports, which would otherwise require complex processing using the star schema, are simple select statements. The cube provides a central place to obtain data for reports without having to know the schema for each operational store and without having to access each store separately.
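The "retain the state of every item for each day" idea can be sketched in a few lines of pandas (sample data invented; the real aggregation happens inside the Analysis Services cube):

```python
import pandas as pd

# One row per work item per day: the warehouse keeps a daily record of every
# item's state so that the cube can aggregate measures by day.
snapshots = pd.DataFrame({
    "day":   ["2012-10-01"] * 3 + ["2012-10-02"] * 3,
    "state": ["Active", "Active", "Resolved", "Active", "Resolved", "Closed"],
})

# A daily trend is then a simple group-and-count rather than complex SQL
# against the star schema.
print(snapshots.groupby(["day", "state"]).size().unstack(fill_value=0))
```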
  7. Report Designer is a component of Visual Studio that allows you to define the Team Foundation data warehouse as a data source and then design a report interactively. Report Designer provides tabbed windows for Data, Layout, and Preview, and you can add datasets to accommodate a new report design idea, or adjust report layout based on preview results. In addition to the Data, Layout, and Preview design surfaces, Report Designer provides query builders, an Expression editor, and wizards to help you place images or step you through the process of creating a simple report. For more information about using Report Designer, see Create, Customize, and Manage Reports for Visual Studio ALM.
  8. Team Foundation integrates with Microsoft Excel to allow you to use Microsoft Excel to manage your project and produce reports. Microsoft Excel provides pivot tables and charts for viewing and analyzing multi-dimensional data. You can bind these pivot tables directly to the Team Foundation cube, so you can interact with the data in the cube. For more information about using Microsoft Excel for reporting, see Create and Manage Excel Reports for Visual Studio ALM.
  9. You can generate several reports in Microsoft Excel that show current status and historical data based on the filter criteria that you specify in a flat-list work item query. This is useful to show the distribution of work items according to selected criteria or to view trends for the past several weeks. In addition, it is an effective way for you to quickly generate PivotTable and PivotChart reports that you can customize to support other report views. When you create an Excel report from a query, you can choose which reports to generate based on the variables that are used to filter the query and the criteria that you select. By using these methods, you can generate the following types of reports:
     - Current reports: pie charts that show the count of work items according to the filter criteria that are specified in the work item query.
     - Trend reports: line charts that show the distribution of work items over the past six weeks according to the filter criteria that are specified in the work item query. After the reports are generated, you can easily change the date range.
     Each report includes several worksheets, and each worksheet shows a PivotTable report and a PivotChart report that derives data from the SQL Server Analysis Services cube.
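To make the two report types concrete, here is a small pandas sketch that computes the data behind each one from flat-list query results (sample data invented; the actual feature generates Excel PivotTable and PivotChart reports bound to the cube):

```python
import pandas as pd

# Invented flat-list query results: one row per work item.
work_items = pd.DataFrame({
    "id":    [1, 2, 3, 4, 5],
    "state": ["Active", "Active", "Resolved", "Closed", "Active"],
    "week":  ["W1", "W1", "W2", "W2", "W3"],
})

# "Current" report: count of work items by state (the pie chart's data).
print(work_items["state"].value_counts())

# "Trend" report: distribution of work items per week (the line chart's data).
print(work_items.groupby(["week", "state"]).size().unstack(fill_value=0))
```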
  10. Team members can use the Bugs dashboard to determine whether they are managing the list of active Bugs according to established team goals and agile practices. By unit testing each increment of code before check-in, the team can reduce the overall number of bugs that the team must find. A team that focuses on being able to ship each increment of code removes defects incrementally and minimizes ongoing bugs. By using the Bugs dashboard, the team can answer the following questions:
     - Is the number of active Bugs acceptable based on team goals?
     - Is the team postponing too many Bugs?
     - Is the team finding, fixing, and closing Bugs quickly enough to meet expectations and at a rate that matches previous development cycles?
     - Is the team addressing high-priority bugs before lower-priority bugs?
     - Does any team member need help in resolving bugs?
  11. You can use the Code Coverage and Code Churn reports to answer the following questions:
     - Which builds succeeded?
     - Which builds have a significant number of changes to the code?
     - How often are builds succeeding?
     - How volatile is the code base?
     - How much of the code is the team testing?
     - How high is the quality of the builds? Is the quality increasing, decreasing, or staying constant?
  12. After the team has started to find and fix bugs, you can track the team's progress toward resolving and closing bugs by viewing the Bug Status report. This report shows the cumulative bug count based on the bug state, priority, and severity.
  13. You can use the Bug Trends report to help track the rate at which your team is discovering and resolving bugs. This report shows a rolling or moving average of bugs being reported, resolved, and closed over time. When you manage a large team or a large number of bugs, you can monitor the Bug Trends report weekly to gain insight into how well the team is finding, resolving, and closing bugs. The Bug Trends report calculates a rolling average of the number of bugs that the team has opened, resolved, and closed based on the filters that you specify. The rolling average is based on the seven days before the date for which it is calculated. That is, the report sums the number of bugs in each state for each of the seven days before the date and then divides the result by seven.
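A minimal pandas sketch of that rolling-average calculation, with invented sample counts:

```python
import pandas as pd

# Bugs reported per day (invented counts); the trend line is the mean over a
# seven-day window ending at each date.
reported = pd.Series([4, 6, 3, 8, 5, 7, 2, 9, 4, 6],
                     index=pd.date_range("2012-10-01", periods=10))
print(reported.rolling(window=7).mean())
```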
  14. As the team resolves and closes bugs, you can use the Reactivations report to determine how effectively the team is fixing bugs. Reactivations generally refer to bugs that have been resolved or closed prematurely and then reopened. The reactivation rate is also referred to as the fault feedback ratio. You can use the Reactivations report to show either bugs or user stories that have been reactivated. As a product owner, you might want to discuss acceptable rates of reactivation with the team. A low rate of reactivations (for example, less than 5%) might be acceptable depending on your team's goals. However, a high or increasing rate of reactivations indicates that the team might need to diagnose and fix systemic issues. The Reactivations report shows an area graph of the number of bugs or stories that are in a resolved state or that have been reactivated from the closed state.
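The fault feedback ratio itself is simple arithmetic: reactivations divided by the bugs resolved or closed in the period. A quick sketch with invented numbers:

```python
resolved_or_closed = 120  # bugs the team resolved or closed in the period
reactivated = 9           # of those, bugs that were later reopened

reactivation_rate = reactivated / resolved_or_closed
print(f"Fault feedback ratio: {reactivation_rate:.1%}")  # 7.5%, above a 5% goal
```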
  15. The Build Quality Indicators report shows test coverage, code churn, and bug counts for a specified build definition. You can use this report to help determine how close portions of the code are to release quality. Ideally, test rates, bugs, and code churn would all produce the same picture, but they often do not. When you find a discrepancy, you can use the Build Quality Indicators report to examine the details of a specific build and data series. Because this report combines test results, code coverage from testing, code churn, and bugs, you can view many perspectives at the same time.
  16. The Build Success Over Time report provides a pictorial version of the Build Summary report. The Build Success Over Time report displays the status of the last build for each build category run for each day. You can use this report to help track the quality of the code that the team is checking in. In addition, for any day on which a build ran, you can view the Build Summary for that day.
  17. The Build Summary lists builds and provides information about test results, test coverage, code churn, and quality notes for each build. The data that appears in the Build Summary report is derived from the data warehouse. The report presents a visual display of the percentage of tests that are passing, code that is being tested, and changes in code across several builds. You can review the results for both manual and automatic builds, in addition to the most recent builds and continuous or frequent builds. The report lists the most recent builds first and contains build results that were captured during the specified time interval for all builds that were run, subject to the filters that you specified for the report. At a glance, you can determine the success or failure of several build definitions for the time period under review.
  18. After a team has worked on one or more iterations, also known as sprints, you can determine the rate of team progress by reviewing the Burndown and Burn Rate report. Burndown shows the trend of completed and remaining work over a specified time period. Burn rate provides calculations of the completed and required rate of work based on the specified time period. In addition, a chart shows the amount of completed and remaining work that is assigned to team members. You can view the Burndown and Burn Rate report based on hours worked or number of work items that have been resolved and closed.
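The burn-rate arithmetic is easy to sketch: the completed rate is work finished per unit of elapsed time, and the required rate is the pace needed to finish the remaining work by the end date. Numbers below are invented:

```python
total_hours = 400      # planned work for the iteration
completed_hours = 150  # work finished so far
days_elapsed, days_total = 10, 20

completed_rate = completed_hours / days_elapsed
required_rate = (total_hours - completed_hours) / (days_total - days_elapsed)
print(f"Actual: {completed_rate:.1f} h/day, required: {required_rate:.1f} h/day")
# Actual 15.0 h/day vs. required 25.0 h/day: the team is behind the needed pace.
```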
  19. After the team has estimated its tasks and begun work, you can use the Remaining Work report to track the team's progress and identify any problems in the flow of work. The Remaining Work report summarizes the data that was captured during the specified time interval for each task, user story, or bug based on the filter criteria that were specified for the report. The data is derived from the data warehouse. You can view this report in either the Hours of Work view or the Number of Work Items view. The first view displays the total number of hours of work for the specified time period and the team's progress toward completing that work. The second view displays the number of work items for the specified time period and the number of work items in each state. Each view provides an area graph that charts the progress of completed work against the total estimated work for the specified time duration.
  20. After work has progressed on several iterations, also known as sprints, you can review the team's progress by viewing the Status on All Iterations report. This report helps you track the team's performance over successive iterations. For each iteration that is defined for the product areas that you specify, this report displays the following information:
     - Stories Closed: the number of user stories that have been closed. These values are derived from the current values specified for the iteration and the state of each user story.
     - Progress (Hours): a two-bar numeric and visual representation of the values for Original Estimate (grey), Completed (green), and Remaining (light blue), based on the rollup of hours that are defined for all tasks. These values are derived from the current values that are specified for the iteration and the hours for each task.
     - Bugs: a numeric value and visual representation for all bugs, grouped by their current states of Active (blue), Resolved (gold), and Closed (green). These values are derived from the current values that are specified for the iteration and the state of each bug.
  21. The Stories Overview report lists all user stories, filtered by area and iteration and in order of importance. Work Progress:
     - % Hours Completed: a numeric value and visual representation that shows the percentage of completed work, based on the rollup of baseline and completed hours for all tasks that are linked to the user story or its child stories.
     - Hours Remaining: a numeric value for the rollup of all remaining hours for all tasks that are linked to the user story or its child stories.
     Test Status:
     - Test Points: a numeric value that represents the number of pairings of test cases with test configurations in a specific test suite. For more information about test points, see Reporting on Testing Progress for Test Plans.
     - Test Results: a numeric value and visual representation that shows the percentage of test cases, grouped according to the status of their most recent test run, where the options are Passed (green), Failed (red), or Not Run (black).
     - Bugs: a numeric value and visual representation that shows the number of bugs that are linked to the test case or user story, where the options are Active (blue) and Resolved (gold). If a user story is linked to one or more child stories, the values represent a rollup of all bugs for the user story and its child stories.
     User Stories that Appear in the Report: the Stories Overview report lists and highlights user stories according to the following criteria:
     - Stories appear in order of their importance, based on their assigned ranking.
     - Stories appear in bold type when they are in the active or resolved state, and in normal type when they are in the closed state.
     - Stories appear in gray type when their assigned iteration or area is outside the filtered set, but they have tasks or child stories that are within the filtered set of iterations or product areas.
  22. The Stories Progress report lists all user stories, filtered by product area and iteration, in order of importance. This report displays the following information for each user story that appears in the report:
     - Progress (% Completed): a numeric value that represents the percentage of completed work, based on the rollup of baseline and completed hours for all tasks that are linked to the user story or its child stories.
     - Hours Completed: a visual representation of the completed hours, displayed as a dark green bar.
     - Recently Completed: a visual representation of those hours completed within the time interval specified for Recent (Calendar) Days, displayed as a light green bar.
     - Hours Remaining: the rollup of all remaining hours for all tasks that are linked to the user story or its child stories.
     The Stories Progress report lists and highlights user stories according to the same criteria as the Stories Overview report: stories appear in order of their importance based on their assigned ranking, in bold type when they are in the active or resolved state, in normal type when they are in the closed state, and in gray type when their assigned iteration or area is outside the filtered set but they have tasks or child stories that are within the filtered set of iterations or product areas.
  23. The Requirements Progress report shows the status of completion as determined by the tasks that have been defined to implement the requirement.
  24. The Requirements Overview report presents a snapshot of the work that has been performed for the filtered set of requirements to the current date.
  25. By reviewing a release burndown report, you can understand how quickly your team has delivered backlog items and track how much work the team must still perform to complete a product release. A release burndown graph shows how much work remained at the start of each sprint in a release. The source of the raw data is your product backlog. Each sprint appears along the horizontal axis, and the vertical axis measures the effort that remained when each sprint started. The amount of estimated effort on the vertical axis is in whatever unit your scrum team has decided to use (for example, story points or hours).
  26. By reviewing a sprint burndown report, you can track how much work remains in a sprint backlog, understand how quickly your team has completed tasks, and predict when your team will achieve the goal or goals of the sprint. A sprint burndown report shows how much work remained at the end of specified intervals during a sprint. The source of the raw data is the sprint backlog. The horizontal axis shows days in a sprint, and the vertical axis measures the amount of work that remains to complete the tasks in the sprint, shown in hours. A sprint burndown graph displays the following pieces of data:
     - The Ideal Trend line indicates an ideal situation in which the team burns down all of the effort that remains at a constant rate by the end of the sprint.
     - The In Progress series shows how many hours remain for tasks that are marked as In Progress in a sprint.
     - The To Do series shows how many hours remain for tasks that are marked as To Do in a sprint.
     Both the In Progress and the To Do series are drawn based on the actual progress of your team as it completes tasks.
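The Ideal Trend line is plain linear interpolation from the starting effort down to zero over the sprint; a small sketch with invented numbers:

```python
starting_hours = 120  # remaining work at the start of a ten-day sprint
sprint_days = 10

# Ideal burndown: remaining effort falls to zero at a constant rate.
ideal = [starting_hours * (1 - day / sprint_days) for day in range(sprint_days + 1)]
print(ideal)  # [120.0, 108.0, 96.0, ..., 0.0]
```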
  27. Toward the end of an iteration, you can use the Unplanned Work report to determine how much work was added to the iteration that was not planned at the start of the iteration. You can view the unplanned work as measured by work items added, such as tasks, test cases, user stories, and bugs. Having unplanned work may be acceptable, especially if the team has scheduled a sufficient buffer for handling the load of unplanned work (for example, bugs). On the other hand, the unplanned work may represent a real problem if the team does not have the capacity to meet it and is forced to cut back on the planned work. The Unplanned Work report is useful when the team plans an iteration by identifying all work items that they intend to resolve or close during the course of the iteration. The work items that are assigned to the iteration by the plan completion date of the report are considered planned work. All work items that are added to the iteration after that date are identified as unplanned work.
  28. The Test Case Readiness report provides an area graph that shows how many test cases are in the Design or Ready state over the time period that you specify. By reviewing this data, you can easily determine how quickly the team is designing test cases and making them ready for testing. When you create a test case, it is automatically set to the Design state. After the team has reviewed and approved the test case, a team member should change its state to Ready, which indicates that the test case is ready to be run.
  29. The data that appears in the Test Plan Progress report is derived from the data warehouse and the test results that are generated when tests are run by using Microsoft Test Manager. The report presents an area graph that shows the most recent result of running any test in the specified test plans over time. For more information, see Running Tests. The horizontal axis shows days in a sprint or iteration, and the vertical axis shows test points. A test point is a pairing of a test case with a test configuration in a specific test suite. For more information about test points, see Reporting on Testing Progress for Test Plans.
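Because a test point is just the pairing of a test case with a test configuration within a suite, the count is a cross product; a sketch with invented names:

```python
from itertools import product

test_cases = ["Login succeeds", "Login fails", "Password reset"]
configurations = ["Windows 7 + IE9", "Windows XP + IE8"]

# One test point per (test case, configuration) pair in the suite.
test_points = list(product(test_cases, configurations))
print(len(test_points))  # 6 test points
```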
  30. http://msdn.microsoft.com/en-us/library/ms244710.aspx
     By using the SQL Server Analysis Services cube for Visual Studio Team Foundation Server, you can generate reports of aggregated information about the data that is stored in team project collections. You can easily use this data to create PivotTable and PivotChart reports in Office Excel. You can drag the cube elements onto PivotTable or PivotChart reports to formulate questions and retrieve answers quickly. The cube is optimized to answer questions such as "How many bugs were active, resolved, and closed on each day of the project?"
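As an illustration of the kind of question the cube answers directly, here is roughly what such a query could look like in MDX, held in a Python string. The cube, dimension, and measure names below are assumptions patterned on the Team System cube and should be verified against your own cube; executing the query requires an OLAP client (for example, ADOMD.NET), which is omitted here:

```python
# Hypothetical MDX for "how many bugs were active, resolved, and closed per
# day?". The names below are assumptions, not a verified schema.
mdx = """
SELECT
    { [Work Item].[State].&[Active],
      [Work Item].[State].&[Resolved],
      [Work Item].[State].&[Closed] } ON COLUMNS,
    [Date].[Date].Members ON ROWS
FROM [Team System]
WHERE ( [Measures].[Work Item Count],
        [Team Project].[Team Project].&[MyProject] )
"""
print(mdx)
```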
  31. A fact represents data that can be associated with multiple dimensions. This data may also be aggregated. Fact tables hold these values. Each data warehouse includes one or more fact tables. Central to a "star" or "snowflake" schema, a fact table captures the data that measures the team's operations. Fact tables usually contain large numbers of rows, especially when they contain one or more years of history for a large team project. A key characteristic of a fact table is that it contains numerical data (facts) that can be summarized to provide information about the history of the operation of the organization. Each fact table also includes a multipart index that contains, as foreign keys, the primary keys of related dimension tables. The related dimensions contain attributes of the fact records. Fact tables should not contain descriptive information or any data other than the numerical measurement fields and the index fields that relate the facts to corresponding entries in the dimension tables. For a list of the fact tables that are defined for the data warehouse, see Generate Reports Using the Relational Warehouse Database for Visual Studio ALM.
  32. Each attribute is connected to a column in a corresponding dimension table in the data warehouse. Each dimension is associated with a set of attributes and potentially a set of hierarchies. Area and iteration paths are examples of hierarchies. Some work item dimension attributes are also stored as numeric and date filter values. When you use one of these dimension attributes in the rows or columns section, you can use these values to filter the report. For example, you can filter a report to show work items that were created after Oct 1, 2009, by using the value filter "System_CreatedDate is greater than Oct 1, 2009." You can also use the measure values to filter a report. For example, you can filter the report to show only work items that have more than two hours of work remaining by using the value filter "Remaining Work is greater than 2." For more information about value filters, see the following page on the Microsoft Web site: Filter Numbers in the Values Area.
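Those two value filters translate directly into ordinary comparisons; a pandas sketch with invented data:

```python
import pandas as pd

items = pd.DataFrame({
    "id": [1, 2, 3],
    "System_CreatedDate": pd.to_datetime(["2009-09-15", "2009-10-05", "2009-11-20"]),
    "Remaining Work": [1.0, 4.0, 3.0],
})

# "System_CreatedDate is greater than Oct 1, 2009"
recent = items[items["System_CreatedDate"] > "2009-10-01"]

# "Remaining Work is greater than 2"
print(recent[recent["Remaining Work"] > 2])  # work items 2 and 3
```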
  33. Dimensions enable you to extract different views of data. Data values are associated with a set of dimensions that allow you to show aggregate results that are filtered using a specific set of dimension values. You can use dimensions to disaggregate the data and show more detail. For example, you can use the Date dimension in the rows or columns section of a PivotTable or PivotChart report to show a trend over time. You can also use dimensions to filter the report. Place a dimension or dimension attribute in the filter area, and then specify the values that you want to include in the report. Some dimensions are used in more than one measure group. For example, all measure groups share the Date, Team Project, Person, Area, and Iteration dimensions. The following illustration shows the dimensions in the cube. Dimensions are groups of attributes that are based on columns from tables or views in a data source view. Dimensions exist outside of a cube, can be used in multiple cubes, can be used multiple times in a single cube, and can be linked between Analysis Services instances. A dimension that exists outside of a cube is referred to as a database dimension, and an instance of a database dimension within a cube is referred to as a cube dimension.
  34. In business terminology, a key performance indicator (KPI) is a quantifiable measurement for gauging business success. In Analysis Services, a KPI is a collection of calculations that are associated with a measure group in a cube and that are used to evaluate business success. Typically, these calculations are a combination of Multidimensional Expressions (MDX) expressions or calculated members. KPIs also have additional metadata that provides information about how client applications should display the results of the KPI's calculations.
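Conceptually, a KPI compares a value expression against a goal expression and maps the result to a status. A minimal Python sketch of that idea follows; the thresholds are invented, and real Analysis Services KPIs are defined with MDX, not Python:

```python
def kpi_status(value: float, goal: float) -> int:
    """Map a measured value against its goal to a status flag in the style
    of an Analysis Services KPI: 1 on track, 0 at risk, -1 off track."""
    ratio = value / goal
    if ratio >= 0.95:
        return 1
    if ratio >= 0.80:
        return 0
    return -1

print(kpi_status(value=88.0, goal=100.0))  # 0: at risk
```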
  35. Measures are values that correspond to columns in the corresponding fact table. Also, fields whose reportable attribute is set to Measure appear as measures in the cube.
  36. Each measure group contains measures, such as Work Item Count, and dimensions, such as Date and Team Project. The measures are the numeric values that provide summaries at different levels of aggregation. You can use them in the Values section of a PivotTable or PivotChart report. The following illustration indicates the measure groups for Team Foundation.
  37. By using perspectives, you can view portions of a cube, making it easier to focus on just the set of information that is of interest for creating a report. Note: perspectives are available only when your data warehouse for Visual Studio ALM is using SQL Server Enterprise Edition; otherwise, you will see only a single perspective, the Team System cube. Each perspective provides a focused view of the data so that you do not have to scroll through all of the dimensions and measure groups that are defined for the whole cube. A perspective is a subset of the features and objects of a cube.