When assessing the code quality of an application in order to decide on an action plan, a snapshot of the quality at a given time is sometimes not sufficient. You also need some visibility into the application's history to make the right decision.
For example, let's say that you have defined a requirement that each application should have a unit test coverage of 60%, and that one of your applications currently has 50% coverage. Without historical data, you can only conclude that the team does not write enough unit tests and that this should become the next focus of attention. But if you know that a couple of months ago the coverage for this application was 20%, you would probably conclude that, although the requirement is not met yet, the team is doing a very good job and that no special focus needs to be put on unit tests. This is why having access to historical information is important.
The TimeMachine is made of two generic widgets that can be instantiated and configured in any dashboard.
The Timeline widget provides the capability to display a chart containing historical data for up to 3 metrics. Hovering over the timeline displays the values at that point in time.
This widget can be customized:
History Table Widget
The History Table widget provides the capability to display a table with the historical data of up to 10 metrics:
This widget can be customized:
A default TimeMachine dashboard comes out-of-the-box when installing Sonar. This is a combination of Timeline and History Table widgets:
What are Tendencies?
The tendencies are arrows that are displayed next to metrics in the dashboards. Those arrows show the trend for the measure.
How to Read Tendencies?
Sonar uses 5 levels to describe the tendency of a measure. Each level is represented by an arrow:
Sonar uses black arrows to represent tendencies on quantitative metrics (the ones that do not reflect the quality of the code, for example the number of lines of code).
Sonar uses red or green arrows to represent tendencies on qualitative metrics (the ones that do reflect the quality of the code, for example code coverage). Red is used when the quality decreases, green when it increases.
Note that if the percentage of duplicated lines decreases, the tendency is displayed in green, because this is considered an improvement.
How are Tendencies Calculated?
To compute the tendencies, simply taking the difference between the last two measures of each metric would not be accurate enough. Therefore Sonar implements a more advanced algorithm: the least squares method. Least squares is a linear regression analysis that helps remove noise in order to determine a trend on discrete measures. In other words, Sonar takes all the measures from the tendency period, checks that the set of measures makes sense (by testing the correlation rate), determines an estimated slope and displays it using the arrows.
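The trend computation described above can be sketched as follows. This is an illustrative sketch, not Sonar's actual implementation; the correlation threshold of 0.5 is an arbitrary value chosen for the example:

```python
from math import sqrt

def tendency(measures):
    """Estimate a trend over a series of (day, value) measures using
    ordinary least squares. Returns the estimated slope, or None when
    the correlation is too weak to call it a trend.
    (Illustrative sketch only -- not Sonar's actual implementation.)"""
    n = len(measures)
    if n < 2:
        return None
    xs = [m[0] for m in measures]
    ys = [m[1] for m in measures]
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    syy = sum((y - mean_y) ** 2 for y in ys)
    if sxx == 0 or syy == 0:
        return None                    # no spread in the data: no trend
    r = sxy / sqrt(sxx * syy)          # correlation coefficient
    if abs(r) < 0.5:                   # arbitrary threshold for this sketch
        return None                    # measures too noisy: no clear trend
    return sxy / sxx                   # estimated slope

# Coverage measured on days 0, 7, 14, 21, 28: a clearly rising trend,
# so the slope is positive and Sonar would display an upward arrow.
print(tendency([(0, 20.0), (7, 28.0), (14, 35.0), (21, 44.0), (28, 50.0)]) > 0)
```

A positive slope maps to an upward arrow, a negative one to a downward arrow, and a rejected (too noisy) series to no tendency at all.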
It is possible to configure the tendencies by logging in as an administrator and going to Configuration > Differential Views.
- To deactivate the computation of tendencies, set the Skip tendencies property to true (set it to false to activate them). Default is false.
- To configure the number of days used to compute tendencies, set the Tendency period property to the desired number of days. Default is 30.
Delete a Quality Snapshot
Whatever the reason (wrong quality profile, issue during analysis, etc.), it is possible to remove a snapshot in order to clean up the TimeMachine widgets:
Removing Useless Data
When you run a new analysis of your project, some data that used to be available in Sonar is no longer accessible: for example, the source code of the previous analysis or measures at file level. This data is automatically removed at the end of the new analysis.
Removing Unnecessary Analyses
It is very useful to analyze a project frequently in Sonar to track progress on quality, and also to be able to see trends over weeks, months and years. But when going back in time, you do not really need the same level of detail as for an analysis made one week ago. Therefore, since Sonar 2.5, a new "Database Cleaner" has been added to Sonar. Its purpose is to delete rows from the database to save space and improve overall performance. Here is its default configuration:
- it deletes all data older than 5 years
- it keeps a single monthly analysis per project for data older than 1 year
- it keeps a single weekly analysis per project for data older than 1 month
- analyses with an event are always kept
These settings can be changed in the page Configuration > Settings > Database Cleaner.
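The pruning policy described above can be sketched as a per-snapshot decision. This is an illustrative sketch of the default rules, not the actual Database Cleaner implementation; the `first_of_its_week` and `first_of_its_month` flags are hypothetical inputs saying whether this is the project's first snapshot of that calendar week or month:

```python
from datetime import date, timedelta

def keep_snapshot(snapshot_date, today, has_event,
                  first_of_its_week, first_of_its_month):
    """Decide whether one project snapshot survives cleaning, following
    the default Database Cleaner policy described in the text.
    (Illustrative sketch only -- not the actual implementation.)"""
    age = today - snapshot_date
    if has_event:
        return True                      # analyses with an event are always kept
    if age > timedelta(days=5 * 365):
        return False                     # delete all data older than 5 years
    if age > timedelta(days=365):
        return first_of_its_month        # one monthly analysis beyond 1 year
    if age > timedelta(days=30):
        return first_of_its_week         # one weekly analysis beyond 1 month
    return True                          # keep every analysis from the last month
```

For example, a two-year-old snapshot that is neither the first of its month nor tagged with an event would be deleted, while a snapshot from last week is always kept.

```python
today = date(2011, 1, 1)
print(keep_snapshot(date(2009, 6, 1), today, False, False, False))  # deleted
print(keep_snapshot(date(2010, 12, 28), today, False, False, False))  # kept
```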
Deprecated - Previous Version of the TimeMachine
Exploring the TimeMachine
From anywhere in a project, the Time Machine functionality can be accessed through the left menu.
The Time Machine makes it possible to replay the past by looking at a series of past analyses. The Time Machine page is divided into 4 sections.
The Custom Chart
This chart represents a timeline from the very first analysis to the most recent one. All the metrics that are ticked below appear on this timeline.
To change the metrics appearing on the chart, tick the metrics you are interested in and click the button at the bottom of the page. The chart is then regenerated with the chosen metrics.
Every event, whether generated automatically or entered manually, appears on the chart as a vertical dotted line.
It is possible to change the default metric appearing on the chart by clicking on Set as default.
The Chosen Analysis
When launching the Time Machine, Sonar displays the first snapshot ever and the last 5 events in the system.
It is possible to add snapshots to the view: snapshots can be selected either by date
or by events (only version events will be listed)
It is possible to hide a snapshot from the view by using the hide link next to the snapshot to be removed.
The Metric Table
On the left of this table appears the list of all the metrics stored in the system (grouped by families), including the manual metrics. Each column represents a snapshot and displays the measures for each metric. On the right of the table, a sparkline shows the evolution across the chosen snapshots.
The Complexity Chart
This chart compares the distribution of class complexity across all the chosen snapshots.