It’s no secret that I’m a big fan of static analysis tools; I believe they can provide a very useful second opinion on my code, helping me to deliver high-quality code that does what’s needed.

One of the promises of the Roslyn compiler project was that it would dramatically lower the bar for tooling that needs to understand the semantics of code, resulting in an explosion of new and innovative tools.

Here are three static analysis tools that I’ve recently found to be most useful. This list covers only tools for working with C# in Visual Studio 2017.

Roslynator is an epic set of more than 170 analyzers and more than 180 refactorings. I’ve been using the Roslynator.Analyzers NuGet package and I’ve found most of the advice to be useful and actionable. Note that this package includes only the analyzers - to get the refactorings, you need to install one of the available Visual Studio extensions. I’ve recently installed Roslynator Refactorings 2017 and, so far, it’s looking very useful.

Microsoft.CodeAnalysis.FxCopAnalyzers is a meta-package that pulls in a number of other analysis packages. Together these replicate the majority of the rules defined by the original FxCop code analysis tool, a familiar part of Visual Studio installations for years.

StyleCop.Analyzers is a reimplementation of the classic StyleCop code formatting tool, rebuilt on Roslyn. These rules are very pedantic (almost to a fault) but can still be worthwhile - especially if you take the time to customize the rules using stylecop.json.
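For illustration, here’s what a minimal stylecop.json might look like - the particular settings shown are just examples I’ve picked for this sketch, and the file needs to be referenced as an additional file by the project before the analyzers will pick it up:

```json
{
  "settings": {
    "documentationRules": {
      "companyName": "Example Ltd",
      "documentInterfaces": true,
      "documentPrivateElements": false
    }
  }
}
```

See the StyleCop.Analyzers documentation for the full set of available settings.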

Taming the tools

One of the real problems with tools like these is that they will often recommend things you disagree with - or things that just aren’t important to your current project. Another problem arises when you add them to an existing project and get literally thousands of messages to handle.

The key to effective use of these tools is to configure them so they give you useful diagnostics while not bugging you with diagnostics you’ve decided aren’t worthwhile. You can suppress a message by adding an appropriate attribute to the method or class, or globally suppress a rule by creating a GlobalSuppressions.cs file to list those attributes.
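For example, a targeted suppression on a single method might look like this - the class, method, and chosen rule here are purely illustrative (and Logger is a hypothetical logging helper):

```csharp
using System;
using System.Diagnostics.CodeAnalysis;

public class FileImporter
{
    // Suppress CA1031 for this one method; the broad catch is deliberate
    // because we want to log and continue, not abort the whole import.
    [SuppressMessage(
        "Microsoft.Design",
        "CA1031:DoNotCatchGeneralExceptionTypes",
        Justification = "Log and continue; one bad file shouldn't abort the import.")]
    public void Import(string path)
    {
        try
        {
            // ... import logic elided ...
        }
        catch (Exception ex)
        {
            Logger.Warn(ex);
        }
    }
}
```

A suppression scoped like this stays close to the code it explains, whereas the GlobalSuppressions.cs approach works better for rules you’re turning off everywhere.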

The old-school way of doing this - still supported by projects targeting the full .NET Framework - is to define a ruleset file that is referenced by each project. Unfortunately, at the time of writing this doesn’t work with .NET Core projects. I don’t know whether this is purely an interim issue that will pass as the .NET Core tooling matures.
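For reference, a ruleset file is just XML; a minimal one that disables a couple of rules might look like this (the rules switched off here are arbitrary examples):

```xml
<?xml version="1.0" encoding="utf-8"?>
<RuleSet Name="Project rules" ToolsVersion="15.0">
  <Rules AnalyzerId="StyleCop.Analyzers" RuleNamespace="StyleCop.Analyzers">
    <!-- Don't require 'this.' prefixes on local calls -->
    <Rule Id="SA1101" Action="None" />
    <!-- Don't require file headers -->
    <Rule Id="SA1633" Action="None" />
  </Rules>
</RuleSet>
```

Each project then points at the file with a CodeAnalysisRuleSet property in its project file.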

To add suppressions, begin by selecting Suppress | In Suppression File from the context menu for the message in the Error List window. Then edit for readability (by default the entire attribute is generated on one line) and add a Justification for future reference:

[assembly: SuppressMessage(
    "Microsoft.Naming",
    "CA1715:IdentifiersShouldHaveCorrectPrefix",
    Justification = "Favour single character capitals for generic type parameters.")]

Including the justification is important as it tells any future developer why the rule was originally suppressed. (Don’t forget that in six months you might not remember the reason yourself; a written justification serves to remind you as well.)

You’ve probably already identified the downside of using attributes: the need for each project to be individually configured. One way to mitigate this is to share a single GlobalSuppressions.cs file across all the projects.
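One way to do that in a new-style .csproj is to link the shared file into each project rather than copying it - the path here is a made-up example:

```xml
<ItemGroup>
  <!-- Link the shared suppressions file instead of duplicating it per project -->
  <Compile Include="..\Common\GlobalSuppressions.cs" Link="GlobalSuppressions.cs" />
</ItemGroup>
```

With the file linked, a suppression added for one project immediately applies to all of them.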

Here are some more examples:

[assembly: SuppressMessage(
    "Microsoft.Naming",
    "CA1707:IdentifiersShouldNotContainUnderscores",
    Justification = "Use underscores to separate clauses in test names")]

[assembly: SuppressMessage(
    "StyleCop.CSharp.ReadabilityRules",
    "SA1101:Prefix local calls with this",
    Justification = "Avoid cluttering the code with 'this' unless necessary.")]

[assembly: SuppressMessage(
    "StyleCop.CSharp.DocumentationRules",
    "SA1633:The file header is missing",
    Justification = "Don't clutter source files with headers.")]

By suppressing the rules you don’t care about, you ensure the messages generated by code analysis tools are useful, helping you to improve your code.
