Integrating Red Gate SmartAssembly Into TFS 2010 Build



Updates – I plan on making updates to this blog article from time to time as I learn more and changes are made to TFS & SmartAssembly to smooth out some of the rough edges.  Come back again for the latest & greatest!

I really like tools that provide assistance with the release management & maintenance cycles of ALM.  I think some of the features really add a particular shine to your application.  It’s something that I’m currently writing an article about, but I wanted to share how to integrate one of those tools into your Team Foundation Server 2010 Build Process.  The first tool to be reviewed is Red Gate’s SmartAssembly product.

Red Gate SmartAssembly

SmartAssembly is a product that can help you out with obfuscation if you need it but I primarily want to focus on two of its other major features:

  • Automated Error Reporting – When an exception occurs, the end user can be prompted to send back the exception details so that the development team can review those details.  This can also work with server-side & web applications without requiring end user prompting.
  • Feature Usage Reporting – This is essentially telemetry for your application to figure out how your users are using the application so that you can make good decisions in the future about where to invest for future releases.  It does this by sending back anonymous data for users who opt-in at runtime.  It will even automatically send back data about the machine the software is being run on, such as the operating system.  This can be extremely useful data to product managers.

You don’t have to worry about any of the details because once you run your assemblies through SmartAssembly, it instruments all of the necessary functionality automatically for you.  If you acquire the Professional edition, you can customize the reporting experience including the ability to host your own web server to accept the error & feature usage reports.

Aside:  Too many teams & companies have blindly adopted obfuscation for their assemblies in the past without taking into consideration the true “total cost” of obfuscating your applications.  I’m all for obfuscation where it makes sense to protect IP as long as the value of the protection of that IP is worth more than the extra cost, resources, and maintenance complexity to truly support an obfuscated product.  Each team & company is going to have to make that decision based on the resources available and the value of the IP to be protected  – just don’t go into it blindly.

FYI – PreEmptive’s Dotfuscator tool is a competing product line with a similar feature set that I hope to be covering in a future blog post.

Ignoring obfuscation, these two features are absolutely great for gaining visibility about your application once it has been released.  For all of those teams that aren’t traditional software vendors but are building applications for internal use, these are great features as well.  Software engineering teams building internal applications are very much in need of the same type of information as ISVs about how their internal “customers” are interacting with their applications.  Internal applications don’t necessarily need obfuscation but they can definitely benefit from automated error & feature usage reporting!

One part that I absolutely love about SmartAssembly is that even though the tool instruments and changes your assembly, it also provides the ability to produce a set of matching symbols (.PDBs) that are extremely important for several scenarios in TFS, the Visual Studio ALM family of tools, as well as basic debugging.

I am going to be spending some time in this blog article to walk through how to integrate SmartAssembly into your automated TFS build process so that your teams can take advantage of these features.  I am going to take the approach of not creating any custom workflow activities for this particular effort.  Jim Lamb has a good discussion about when to make customizations to the MSBuild file (essentially the Visual Studio project file) and when to make your customizations in the Windows Workflow-based build process template.  As much as I very much prefer customizing my build process templates using custom workflow activities, in this case I chose to do a little customization of both without using any custom workflow activities.  I would much rather have done this using only native Windows Workflow activities but I’ll talk more about that a little later.

Disclaimer:  As a Microsoft MVP, I have been a part of the Friends of Red Gate group for the last four years and I have been provided Not For Resale licenses of the Red Gate family of products, though I reserve the right to offer unbiased opinions and criticisms.  I was not paid for these contributions.  However, I may or may not get a complimentary round the next time I see the Red Gaters at the pub in Cambridge. :)

Works on My Machine Disclaimer:  Everything in this blog article worked on my machine when I wrote it.  I have the latest version of SmartAssembly and TFS 2010 installed & configured correctly.  I’ve done my best to make this as reusable as possible for most teams’ scenarios but I can’t tell you that it will work for you.  Hopefully it gets you started on the right path though!  Please don’t contact me and let me know that my code killed your cat.  I feel for you… I do – I just can’t do anything about it.  You’ve been warned.  I take the same approach that Scott does with blog contributions.

Configuring SmartAssembly for Team Use

SmartAssembly has actually been designed out of the box to handle the single-developer team scenario.  If you are using TFS, you are likely not a single-developer team, so you’ll want to do a few things to get SmartAssembly set up for use with a team.  The architecture for SmartAssembly can best be described with this architecture diagram:

Smart Assembly Architecture Diagram
Source: http://www.red-gate.com/products/dotnet-development/smartassembly/team-package

You’ll need to get the Professional edition of SmartAssembly since it allows you to store everything in a shared SQL Server database.  One nice thing is that each developer who will need to interact with error & feature usage reports only needs a Developer edition license instead of a full Professional edition license.  You’ll need to install & configure the Professional edition on each of your build servers.  You might as well go ahead and create a build agent tag called “SmartAssembly” to indicate which build agents in your build farm are hosted on servers that have SmartAssembly installed.

When you first start SmartAssembly, you will want to set up the desktop machines & build servers to use the same SQL connection settings for the shared SmartAssembly database.  I even like to use the friendly TFS DNS names that I already have set up for my particular TFS environment.  Remember that if you are using the limited use license of SQL that is included with TFS, you won’t be able to house the SmartAssembly database on that instance.  You’ll need to purchase a legitimate SQL Server license.  It’s a great time to upgrade to the SQL Enterprise edition if you can for TFS!  TFS will definitely take advantage of several of the features.
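Just to illustrate what I mean by shared connection settings (the server and database names here are made up), every desktop machine and build server would end up pointing at the same database with a connection along these lines:

Server=sqlserver.contoso.local;Database=SmartAssembly;Integrated Security=SSPI;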

It is pretty easy to setup from there:

Configuring SmartAssembly Database Connection 

Be sure to also indicate that you want to use relative paths.  Relative paths will be very important when you are using it in a team environment with Team Foundation Server.

BTW, if you need to set up SmartAssembly to use SQL Authentication instead of Windows Authentication, you can do that using this particular article.  You do this by updating the settings configuration file, which on a Windows 7 machine is located at C:\ProgramData\Red Gate\SmartAssembly\SmartAssembly.settings.

Creating & Storing the SmartAssembly Configuration File in Version Control

I am going to make this easy by just using a quick Windows Forms application; however, you are able to process any type of assembly, including Silverlight apps, ASP.NET web applications, class libraries, etc., using SmartAssembly.

You will want to compile your assembly at least once and then start a new SmartAssembly project.  It actually doesn’t matter where the source & destination location of the assembly is set to in the configuration but you might want to pick a location that all of the developers will be using.  Don’t worry about the build server locations because we will override those later in the build process!  To keep it simple, I’m only going to enable the following features in my SmartAssembly configuration file:

  • Automated Error Reporting,
  • Feature Usage Reporting, and
  • Generate Debugging Information

You can research more on the other options that are available but I am going to keep this walkthrough very simple.  Once you are satisfied with your settings, click the “Save As…” button and save the configuration file in the same folder as your Visual Studio project file.  I even like to include the file in my Visual Studio project so that I can work with it and check it into the version control repository along with the rest of my project.  The SmartAssembly configuration file has a “.saproj” file extension.

Visual Studio Project with Smart Assembly Configuration File

The next thing you might want to do is open the configuration file using the XML Editor in Visual Studio to verify all of the settings look correct. You can use the “Open With…” context menu command from the Solution Explorer window to help you out.

Using the Open With Command in Visual Studio Solution Explorer

The main thing you want to do is be very mindful of using relative file paths everywhere in the configuration file since the source code location changes between the build server & developer machines.  For example, TFS Build allows you to have multiple build agents running on any build server.  I might have three build agents on a build server which means three builds could be running at any given time on the build server.  You isolate each build agent on a build server by setting the working directory to something that will be a unique value.  The default setting is $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionName) but I usually change it to $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionId) to give me a few extra characters since we also have path length limitations to go up against.


Defining Custom MSBuild Properties

At this point, we are going to define a few custom MSBuild properties that we are going to use to trigger the SmartAssembly functionality.  The table lists the properties I am going to define in this process.

Property Name | Value(s) | Description
TfsBuild | True, False | Indicates whether this build is occurring using TFS.
RunSmartAssembly | True, False | Indicates whether the SmartAssembly processing should occur after compilation.
SmartAssemblyConfigurationFileRelativePath | <Relative File Path> | Stores the relative path location to the .saproj configuration file for the project.
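When the build process template appends these properties later in this walkthrough, the compilation effectively gets invoked with something along the lines of the following command (the project file name is just a placeholder, and the third property is only needed if your configuration file doesn’t follow the default naming convention):

MSBuild.exe MyProject.csproj /p:TfsBuild=True /p:RunSmartAssembly=True /p:SmartAssemblyConfigurationFileRelativePath=MyProject.saproj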

Modifying the Visual Studio Project Files

For many of the common project types, Visual Studio project files are actually MSBuild scripts under the covers.  What we are going to do is add some custom functionality at the end of the project file that we will later “turn on” during the build process.  You could modify this so that you could “turn on” the functionality at development time locally but this additional script excerpt will leave it turned off during normal development.

To edit a Visual Studio Project file, you can “unload” the project from the context menu in Solution Explorer and then double-click it to open it in a new editor document window.  You will add the following excerpt close to the bottom of your Visual Studio project file just before the final </Project> ending tag. In my case it is a .csproj file.

<!-- Red Gate SmartAssembly Custom Post-Compile Processing for TFS Builds -->
<UsingTask TaskName="SmartAssembly.MSBuild.Tasks.Build" AssemblyName="SmartAssembly.MSBuild.Tasks, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7f465a1c156d4d57" Condition="'$(TfsBuild)' == 'True' and '$(RunSmartAssembly)' == 'True'" />
<PropertyGroup Condition="'$(TfsBuild)' == 'True' and '$(RunSmartAssembly)' == 'True'">
  <!-- Uncomment this next line if the configuration file is not located in the same directory and uses the same name as the project. -->
  <!--<SmartAssemblyConfigurationFileRelativePath>SmartAssemblyConfigurationFileName.saproj</SmartAssemblyConfigurationFileRelativePath>-->
  <!-- This will set the default name of the configuration file to the same name as the project name if the property is not defined elsewhere. -->
  <SmartAssemblyConfigurationFileRelativePath Condition="'$(SmartAssemblyConfigurationFileRelativePath)' == ''">$(ProjectName).saproj</SmartAssemblyConfigurationFileRelativePath>
</PropertyGroup>
<Target Name="AfterBuild" Condition="'$(TfsBuild)' == 'True' and '$(RunSmartAssembly)' == 'True'">
  <!-- Archiving the original compiled assembly and matching debugging symbols file. -->
  <Message Text="Archiving the original compiled assembly and matching debugging symbols file." />
  <Copy SourceFiles="@(_DebugSymbolsOutputPath)" DestinationFolder="$(OutDir)Original" Condition="'$(_DebugSymbolsProduced)' == 'true' and '$(CopyBuildOutputToOutputDirectory)' == 'true' and '$(SkipCopyBuildProduct)' != 'true'" />
  <Copy SourceFiles="@(MainAssembly)" DestinationFolder="$(OutDir)Original" Condition="'$(CopyBuildOutputToOutputDirectory)' == 'true' and '$(SkipCopyBuildProduct)' != 'true'" />
  <!-- Process Assembly through SmartAssembly -->
  <SmartAssembly.MSBuild.Tasks.Build ProjectFile="$(SmartAssemblyConfigurationFileRelativePath)" Input="@(MainAssembly)" Output="@(MainAssembly)" OverwriteAssembly="True" />
</Target>

It is a modified version of the snippet from the SmartAssembly help documentation for integrating with MSBuild:  http://www.red-gate.com/supportcenter/Content/SmartAssembly/help/6.5/SA_UsingSmartAssemblyWithMSBuild.  You’ll see a little later where we are going to “turn on” the functionality by editing the TFS build process template.  If you named your configuration file the same as the project and stored it in the same folder in version control, you actually don’t need to modify anything in the snippet at all.

Notice that the snippet keeps the original copies of the assemblies and matching symbols (.PDB) file so that they later get copied to the TFS build’s drop folder.  It copies the original assembly and matching symbols into another subdirectory named “Original” instead of outputting the SmartAssembly instrumented assembly & matching symbols to a subfolder called “Obfuscated”, “Instrumented”, or “Protected.”  I used to use the latter approach (as some people suggest) but if you are also compiling installers, it is useful to create an installer during specific builds that includes the original assemblies instead of the instrumented ones.  In my installer definition (like a WiX file) I’ll just refer to the regular location and it will pick up whatever version the build process created.  If I want an installer to have the original assemblies then I just queue a new build and set the SmartAssembly process parameter to false for that build.  I don’t have to do anything additional in my WiX definition files to handle this scenario.
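As a rough sketch of what I mean (the component, file, and project reference names here are hypothetical), the WiX fragment simply points at the project’s normal output folder:

<Component Id="CalculatorExeComponent" Guid="*">
  <!-- $(var.Calculator.TargetDir) is the referenced project's normal output folder, so the installer
       picks up whichever assembly the build process left there: original or instrumented. -->
  <File Id="CalculatorExe" Source="$(var.Calculator.TargetDir)Calculator.exe" KeyPath="yes" />
</Component>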

Another side effect you get by using this approach is that if your build process runs any automated tests, static code analysis, test impact analysis, etc., then it will use the instrumented versions of the assemblies as the target of the tests and other post-processing tools!  There are several ways to skin this particular cat but I have fallen back to this approach after a few years of dealing with these issues.

Modifying the Build Process Template in Windows Workflow Foundation

Technically, we could just hard-code the extra MSBuild process parameters that we need using the default TFS build process template on the Process tab of the build definition editor window:

Setting Additional MSBuild Property Values in TFS 2010 Build Definition Editor

If you are okay with this approach then you don’t really need to go any further.  However, we could make this a richer experience for people who will edit and queue these builds from day to day.  This is where we can go through and create a custom process template.

The first thing you will want to do is create a new build process template to start your customizations.  I have included mine for download at the end of this blog post but you may want to walk along.  I usually start by creating a copy of the default build process template available from TFS.  If you aren’t familiar with the basics of this particular process, I would highly suggest going through the walkthrough in either of these books:

You can then change your build definition over to the newly copied build process template using the following combo box.

Selecting a New TFS Build Process Template

If you click on the hyperlink, it will take you to the location in Source Control Explorer where you can get the latest version into your workspace and then open the build process template file for editing in the Windows Workflow Foundation Designer.

Defining Build Definition Process Parameters

The first thing we can do is specify a new build process parameter that is exposed to the end user of the builds by going to the “Arguments” tab in the lower left-hand corner of the Workflow designer.

Arguments Tab for Windows Workflow Designer

I am going to create a Boolean process parameter simply named “RunSmartAssembly” and set the default value to False.  This isn’t an MSBuild property but a workflow process parameter that will be exposed to the end user when they are queuing a new build or when editing the build definition.

Creating New TFS Build Process Parameter

This next step is just to make things that much nicer.  We can give the TFS Build system some additional metadata to make sure the parameter is exposed to the end user in a nice fashion.  There are more details about the process parameter metadata field in either of the book chapters mentioned above in case you would like to learn more!  You edit the collection information for the Metadata parameter that is already defined in the default build process template.  (It’s two above the parameter we created in the previous screenshot.)  Just click the ellipsis button in the default value field column to open up the metadata editor window.

TFS Build Process Parameter Metadata Editor

Fill out the details as indicated above and save your build process template. You won’t see the changes immediately if you were to go back to the build definition editor because we haven’t checked the build process template back in to the version control repository yet.

Verify SmartAssembly is Installed on Build Server

Whenever I architect a build that requires the use of a custom tool and it isn’t stored in version control (or even if it is but someone forgot to add that workspace mapping) I usually want to add a check in the build process to make sure that the tools are actually available to the build server.  If the check doesn’t locate the tool I have it give a nice build error.

Add an If workflow activity inside the Build Agent Scope activity (labeled “Run on Agent”) but before the section that starts the compilation.  It doesn’t exactly matter where, as long as you place it inside the agent scope and before any type of compilation begins.  I am going to set my condition to something like the following:

RunSmartAssembly AndAlso Not System.IO.File.Exists(String.Format("{0}\{1}\{2}", Environment.GetFolderPath(Environment.SpecialFolder.ProgramFiles), "Red Gate\SmartAssembly 6", "SmartAssembly.exe"))

You can then add a Write Build Error activity with an appropriate message to indicate that SmartAssembly was not found.  It should look something along the lines of the following example.

Verifying SmartAssembly is Installed on TFS Build Server

Appending Additional MSBuild Properties

We can now work on passing in the additional MSBuild properties.  I’m going to do this in two steps.  The first step is to append the TfsBuild MSBuild property to the pre-defined workflow variable that is used for this purpose named MSBuildArguments.  I’m going to do this immediately after the workflow activities we added for the previous step using another native primitive workflow activity:  Assign.  It’s a super simple activity that is great for this particular purpose.  The assignment expression that I am going to use for the Value parameter is:

String.Format("{0} {1}", MSBuildArguments, " /p:TfsBuild=True")


After that, we will add another If activity where the conditional will be set to the RunSmartAssembly workflow parameter we created earlier.  We will also add another Assign activity to append our remaining MSBuild property to pass into the compilation process.  You can use this assignment expression for the Value parameter of the Assign activity:

String.Format("{0} {1}", MSBuildArguments, " /p:RunSmartAssembly=True")

The final sequence looks similar to the following screenshot.


You may be asking “Why did we define the $(TfsBuild) MSBuild property when we could have just used the $(RunSmartAssembly) property?” That’s a great question… You don’t need it if you aren’t going to do any additional customization. However, in general, I like to always define the $(TfsBuild) MSBuild property so that you can customize the project files with conditions based on whether the build is occurring during a TFS Build or on a developer’s machine. It’s quite handy when you need it.
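As a quick sketch of the kind of customization I mean (the LOCALBUILD constant is just an arbitrary placeholder), you could add something like this to a project file so that certain settings only apply to local developer builds:

<PropertyGroup Condition="'$(TfsBuild)' != 'True'">
  <!-- Anything in this group only applies when the project is built outside of TFS Build. -->
  <DefineConstants>$(DefineConstants);LOCALBUILD</DefineConstants>
</PropertyGroup>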

Notice that we are also performing all of the SmartAssembly processing steps before the Source Server Indexing and Symbol Server Publishing phase of the build process so that both the original symbols and the symbols that match the instrumented assemblies are published correctly to Symbol Server and have the appropriate indexing for Source Server support included in those symbols.  That will be extremely useful later whenever you need to debug against either the original or instrumented assemblies in the future.  You can also open IntelliTrace log files & take advantage of Test Impact Analysis if you keep obfuscation turned off in the SmartAssembly configuration.

Finale

That’s it!  Just save the changes to your build process template and check the file into the version control repository so it can now be used by your build definitions.  Be sure to set your new custom workflow parameter to True and then queue a new build!

Setting Custom SmartAssembly Process Parameter

You’ll now notice that it runs correctly even if you have configured your build process to compile multiple build configurations (e.g. Debug|x86, Release|AnyCPU, etc.)

Potential Improvement Areas

  • Licensing & Activation for Build Servers – Unfortunately, the way SmartAssembly is licensed you have to purchase a license for each of the build servers you might have and activate the software on those build servers.  The accompanying side effect is that the developer licenses are cheaper.
    This can be problematic in a TFS environment where you might have a build farm that has one build controller with 20 build servers that have three agents on each of those build servers.  It’s not that SmartAssembly would be used at the same time on all 60 of those build agents but you also don’t know which build agent will be reserved for a particular build at any given time.  I have resorted to using the build agent tagging feature of TFS Build to handle making sure particular builds only reserve an agent with SmartAssembly configured & activated.  However, this causes a complete underuse of the hardware resources available in a build farm. 
    I would rather tool vendors achieve their revenue targets by increasing the per-user license fee, specifically for users who benefit from the advantages that the particular tool brings to them. This licensing model is very similar to how the Visual Studio & third-party components licensing model works.  Microsoft and other third-party component vendors give you the ability to install and use their tools on a build server without charge.
    I consider build agent machines throw-away machines.  They should remain completely clean but don’t need to be backed up or monitored.  I usually will have a virtual machine base image that has everything already installed & ready to go so that I can add machines to or remove them from the build farm “pool” as needed.  I even prefer to throw away machines after 30 days and bring new build agents online to ensure the whole build farm is kept as clean as possible.  When you have tools that require activation & licensing, this scenario quickly becomes problematic.  This leads me to another potential area for improvement.
  • Installation on Build Servers – If you know me well, this is a slightly less critical criticism than the first bullet point but also a pretty big pet peeve of mine. :)  If you make tools, please don’t require them to be installed on the build server. It’s another thing that has to be kept up to date on potentially many machines and in a base system image.  I would rather be able to check them into a known version control folder and then have the build servers download the latest version during the build process.  There is even a supported mechanism in TFS Build that allows the build controllers & agents to watch for custom assemblies & tools; whenever it notices a new version of those assemblies, it gracefully updates all of the machines in the build farm automatically.  This allows team members to focus on and introduce changes to the tools using version control instead of having to update the base image of the build server every time there is a new update.
    You also benefit from having full auditing of exactly which tool versions were used to produce a specific set of assemblies.  That allows you to potentially recreate a build you created a year ago by simply specifying what version of the source code (including build tools) to use during that build process.
  • Native Workflow Activity for TFS 2010 Build Process Templates – The process I described in this blog article is definitely much more difficult than it could be.  Instead of introducing customizations in the MSBuild portion of the TFS build process, I would much rather drop in a native workflow activity after the compilation process.  SmartAssembly unfortunately doesn’t have a custom TFS build workflow activity at this time.  I would love to see one that allowed me to specify multiple assembly inputs for each build configuration that occurs in the build process and then the appropriate SmartAssembly configuration file for each of the assemblies.  You could do some nice things with that to really make this process super easy.
  • Database Endpoint Instead of a Web Service Layer – SmartAssembly requires the entire team to have access to the centralized database to manage the automated error & feature usage reports.  The software makes direct database calls instead of going through a service layer, which is very different from the way that tools built for TFS are generally designed.  This can be problematic especially if you have TFS set up for your team to be able to access remotely over HTTPS (port 443) without the use of a VPN.  Several IT organizations really don’t want to open their database ports or even give access to production database instances.  My suggestion would be to have an intermediate service layer that can “integrate” with the existing TFS IIS web sites.  This allows the tool’s service layer to piggyback on the existing infrastructure already set up for TFS.  If you have an SSL certificate and HTTPS configured, then you can take advantage of it.  If you have load balancing set up for scalability, then you could potentially leverage that as well!  We did this with our Notion Timesheet for TFS tools and one of the benefits is that we are able to access the service layer from anywhere we can access TFS, including over the public Internet.  There are no worries about giving people access to the SQL Server instance, either.
  • Source Server Support – This isn’t necessarily a TFS-specific topic but really something for anyone using build servers & Source Server indexing.  When you compile on a build server, the location of the source code is included in the symbol information.  Your developers will normally not download the source code to the same location as other developers and particularly not the same location that the build server does since that changes depending on what TFS build agent is used on a build server for any particular TFS build.  Source Server Indexing helps to combat this particular problem by replacing the physical location with the location in the version control repository including the branch and version of the code used.  SmartAssembly has a feature that allows you to review details of stack trace, object values, etc. when you open an error report.  However, it doesn’t use the Source Server information even if it is stored in the symbol files.  This is particularly a problem when you are in a TFS environment and using automated builds.  SmartAssembly just ignores those additional streams in the symbols file.  SmartAssembly should use the Source Server information if it exists in the symbols to pull the appropriate version of source code from the version control repository.  (Red Gate Support Ticket Number:  F0041570)
  • Additional ALM Integration with TFS – There are so many different areas where SmartAssembly could shine if it had some additional ALM-specific integration with TFS!

Download Process Template

If you are interested in downloading the completely customized version of the build process template, I have included a link to it below.

Download SmartAssembly Process Template

 

 

Take care,

Ed Blankenship



Source Server and Symbol Server Support in TFS 2010



As Jim Lamb announced in June 2009, TFS 2010 introduces support for Source Server and Symbol Server as part of the default automated build process template. This is a really key feature addition, but I have found that many developers ask why it is so important and how it helps them. Ultimately, we are starting to have more and more tools that need access to the symbol file information and the original source code that was used for compilation. For example, some of the tools that come to mind are:

By setting up Source Server and Symbol Server support during your build process, you’ll be able to work with assemblies & executables that come from the build servers and still use tools that need information from them.

What are Symbols?

John Robbins has an excellent blog post to get you started learning what symbols are, titled “PDB Files: What Every Developer Must Know.” I highly recommend you take a moment to read through it.

So to summarize from John’s article, the symbol files are the .PDB files that match a particular assembly and contain important information that’s necessary for debugging tools. Specifically for .NET assemblies, the symbol files include:

  • Source File Names and Line Numbers
  • Local Variable Names

He also reminds us of one very important point about symbol files: “PDB files are as important as source code!” That is absolutely true! I cringe any time I hear a developer say “oh, those .PDB files take up so much space so I’m going to delete them.” Ouch – the sad thing is those are the developers that keep people like John in business whenever they run into problems in production. :) Save yourself some time, money, and effort and keep your symbol files around. Not to say that John doesn’t earn every penny but I’m sure his life is much better whenever you do have your symbols!

This is exactly where Symbol Server helps out. Essentially, the Symbol Server is a central location for your company that keeps the .PDB files for you. Therefore, you can install your application (without symbols) that was compiled on a build server, and whenever you want to use a debugging tool like Visual Studio, it will know how to contact the Symbol Server location to get the matching set of symbols. There is more about how to configure Visual Studio to look for a Symbol Server further down in this blog post.

John also mentions how to manually perform the steps necessary for completing the loop with Source Server and Symbol Server. Thankfully, since you are using TFS 2010 Build, you don’t have to go through those steps. The functionality is included in the default build process template (but not the Upgrade Template).


Aside: If you are performing obfuscation using your favorite .NET obfuscation utility, you will want to make sure you produce symbol files that match the newly created assemblies. This is because the variable names and other information are changed by the obfuscator. What I will normally do is keep the original assemblies with their matching symbol files in addition to the obfuscated assemblies with matching symbol files. I store the artifacts for the obfuscated assemblies in a sub-folder called “Obfuscated.”



How to Setup Symbol Server

A common misconception about Symbol Server is that you actually have to set up a server and install the Symbol Server software. Not at all! All you have to do is set up a file share on another server. If you are using my suggestion about using friendly DNS names with TFS, you might extend that for the symbol server as well:

\\symbols.contoso.local\Symbols

On my particular demonstration machine, I have a local file share that contains some of the symbols that were published from my TFS 2010 Builds:


How to Configure Build to Index for Source Server and Publish to Symbol Server

Configuring the build definition to use the new Symbol Server location couldn’t be easier. Open up the build definition editor and navigate to the Process tab. There, you will see all of the process parameters. If you are using the default build process template then you will find the Source Server and Symbol Server settings underneath the “2. Basic” category as shown below.


The build process will then do all the work for you!

Source Server Indexing

What actually happens when the build process runs its Source Server indexing? Let me first start by discussing the problems with symbols that come from a build server (or another machine). One of the pieces of information that is stored inside of the symbol file is the location of the original source file that was used for compilation into the assembly you are debugging. This can be a problem because, in my particular case, the local location of the source code file on the build server is:

C:\LocalBuilds\1\2\Sources\Source\Calculator\Calculator\Form1.cs at version 32 from the MAIN branch

Not only would you have to put all of the source files in the same exact spot, but you would have to get them from the right branch and even the exact same changeset version from the TFS version control repository. That’s a lot of manual work… This is where the indexing for Source Server helps you out. You’ll also notice that if you are producing symbols from your obfuscation utility, those can be indexed for Source Server support as well.


When the TFS 2010 Build runs the source indexing for Source Server, it writes an alternate stream of information in the symbol files that will provide the following information for each source file:

  • Source Control provider’s information and the command-line utility to use to get the file (In our case that would be using tf.exe)
  • Full TFS Version Control Repository Server Path including the branch name
  • Version

The default build process template uses the srctool.exe command-line utility first to list all of the local source file locations that are stored in the symbol file. Then, it generates a temporary file that contains the exact alternate stream information for Source Server. The Source Server stream is named srcsrv. Finally, the build process uses the pdbstr.exe command-line utility to write that stream information into the symbol file. If you are ever curious about what that srcsrv stream actually contains, you can run this command-line utility:

C:\Builds\Calculator MAIN\Calculator MAIN_11.02.11.06\Debug\Obfuscated>pdbstr.exe -r -p:Calculator.pdb -s:srcsrv
SRCSRV: ini ------------------------------------------------
VERSION=3
INDEXVERSION=2
VERCTRL=Team Foundation Server
DATETIME=Fri Feb 11 00:41:58 2011
INDEXER=TFSTB
SRCSRV: variables ------------------------------------------
TFS_EXTRACT_CMD=tf.exe view /version:%var4% /noprompt "$%var3%" /server:%fnvar%(%var2%) /console >%srcsrvtrg%
TFS_EXTRACT_TARGET=%targ%\%var2%%fnbksl%(%var3%)\%fnfile%(%var5%)
SRCSRVVERCTRL=tfs
SRCSRVERRDESC=access
SRCSRVERRVAR=var2
VSTFSSERVER=http://localhost:8080/tfs/DefaultCollection
SRCSRVTRG=%TFS_extract_target%
SRCSRVCMD=%TFS_extract_cmd%
SRCSRV: source files ---------------------------------------
C:\LocalBuilds\1\2\Sources\Source\Calculator\Calculator\Form1.cs*VSTFSSERVER*/Calculator/MAIN/Source/Calculator/Calculator/Form1.cs*32*Form1;C32.cs
C:\LocalBuilds\1\2\Sources\Source\Calculator\Calculator\Form1.Designer.cs*VSTFSSERVER*/Calculator/MAIN/Source/Calculator/Calculator/Form1.Designer.cs*30*Form1.Designer;C30.cs
C:\LocalBuilds\1\2\Sources\Source\Calculator\Calculator\Program.cs*VSTFSSERVER*/Calculator/MAIN/Source/Calculator/Calculator/Program.cs*30*Program;C30.cs
C:\LocalBuilds\1\2\Sources\Source\Calculator\Calculator\Properties\Settings.Designer.cs*VSTFSSERVER*/Calculator/MAIN/Source/Calculator/Calculator/Properties/Settings.Designer.cs*11*Settings.Designer;C11.cs
SRCSRV: end ------------------------------------------------
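To make that a little more concrete, when a debugger needs Form1.cs it takes the fields from that source file line, substitutes them into TFS_EXTRACT_CMD, and ends up running a command along these lines (with the output redirected into a temporary extraction folder determined by SRCSRVTRG):

tf.exe view /version:32 /noprompt "$/Calculator/MAIN/Source/Calculator/Calculator/Form1.cs" /server:http://localhost:8080/tfs/DefaultCollection /console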

Publishing to Symbol Server

Publishing the symbols is the easier part. Essentially, the default build process template calls the symstore.exe add utility to publish the symbol files to the specified symbol server path. Additionally, there is some metadata added to the build information in TFS that specifies that symbols were published. This will be useful whenever the build retention policies kick in, which we’ll cover further down.
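For reference, that publish step boils down to running something along the lines of the following command against the build output (the product and version strings here are just illustrative; the build process supplies its own values):

symstore.exe add /r /f "C:\Builds\Calculator MAIN\Calculator MAIN_11.02.11.06\Debug\*.pdb" /s \\symbols.contoso.local\Symbols /t "Calculator" /v "Calculator MAIN_11.02.11.06"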

Configuring Visual Studio to Use Symbol Server and Enabling Source Server Support

The next step is for each of the developers to configure Visual Studio 2010 to look for symbols in the company’s symbol server location if they aren’t found locally. You can get to it by going to Tools –> Options and then the Debugging –> Symbols options page as shown below. Other debugging tools have similar options.
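For what it’s worth, other debuggers such as WinDbg read the same style of symbol path, typically through the _NT_SYMBOL_PATH environment variable; a value like the following (the local cache folder name is just an example) points at a downstream cache plus the team’s symbol share:

set _NT_SYMBOL_PATH=SRV*C:\SymbolCache*\\symbols.contoso.local\Symbols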


The next thing you will want to do is to enable source server support in Visual Studio. You can do that by going to the Debugging –> General options tab as shown below.


Now, just start using your debugging tool and in my case I have attached my Visual Studio Debugger to the process of my application that came from the build drop folder. Visual Studio gives me a small warning before it attempts to grab the source code from the TFS Version Control repository as shown below. You can see the exact command-line utility including arguments that is used by the debugger to retrieve the correct version of the file. Pure magic…


Update:  (2/14/2011) John Robbins has helped out by letting us know how we can disable this really annoying Source Server security dialog any time the debugger wants to get something from Source Server.  Thanks John!


Aside: If you notice, in my situation I have a particular problem. Since the TFS 2010 Build services are installed on the same machine as my application tier on my laptop, the default configuration for the build service to connect to TFS used http://localhost. :( That’s not going to be good whenever I have another developer start debugging using the assembly from my build server and the symbols. Their Visual Studio Debugger instance will try to hit localhost on their machine (where the source doesn’t exist).

For this reason, it’s important to make sure when you are configuring the build service to use the fully-qualified friendly DNS name for your application tier server. (Check out the blog post that’s linked to find out more information about this topic).



How Does Visual Studio Know Which Symbols Match for the Executable?

You always have to have symbol files that exactly match the assemblies you are debugging. How does Visual Studio know this though? There is actually a GUID that is embedded in both the assembly and the symbol file. You can find out what that GUID is by running the DUMPBIN command-line utility as shown below.

C:\Builds\Calculator MAIN\Calculator MAIN_11.02.11.06\Debug>dumpbin Calculator.exe /HEADERS

Microsoft (R) COFF/PE Dumper Version 10.00.31118.01
Copyright (C) Microsoft Corporation. All rights reserved.


Dump of file Calculator.exe

Debug Directories

Time Type Size RVA Pointer
-------- ------ -------- -------- --------
4D54CC09 cv 69 00003864 1A64 Format: RSDS, {B7C62014-02BD-4F35-9718-104CE8CFB14C}, 1, c:\LocalBuilds\1\2\Sources\Source\Calculator\Calculator\obj\Debug\Calculator.pdb

You can see the GUID highlighted above. If you were to go check out the Symbol Server file share, you can also find the GUID used to differentiate between all of the different versions of the symbol files that are stored for a particular assembly.

Update:  (2/15/2011) I learned something new from Chris Schmich from the Visual Studio Diagnostics team.  He indicated that the PDB age (the “1” following the GUID in the DUMPBIN output above) is also used to match the symbols.  You’ll notice that the PDB age for all of my symbols is 1 and is appended to the end of the GUID when stored in Symbol Server.  Thanks Chris for the extra information!
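For example, using the GUID and PDB age from the DUMPBIN output above, the matching symbol file ends up stored in the file share under a folder named after the PDB, then the GUID (without dashes) with the age appended:

\\symbols.contoso.local\Symbols\Calculator.pdb\B7C6201402BD4F359718104CE8CFB14C1\Calculator.pdb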


IntelliTrace Files and Symbol Server

I also wanted to mention that when testers use Microsoft Test Manager and run manual test cases where they have collected IntelliTrace logs, you’ll notice that when you open one of those IntelliTrace logs (for example attached to a bug work item) you will see the Symbol Server location that was collected from the assembly being tested as well:


This green-light should be awesome for you as a developer now because you can connect to the Symbol Server location and start debugging using the IntelliTrace log and the Source Server information contained inside of the symbols.

Retention Policies

One other thing to consider: as you have more and more builds performed using TFS 2010 Build, you’ll want to set up your retention policies. The Symbol Server file share can start to go up in size pretty quickly so you can have the retention policies also delete the corresponding symbols from Symbol Server if you choose by setting the “What to Delete” option.



You also want to make sure, however, that any “Released” builds are marked as “Retain Indefinitely” to ensure that the retention policies never delete the symbols (or anything else about the build, for that matter!)


Summary

There you go! Your developers will be very appreciative whenever all of this is setup. You’ll have a system that stores your symbols for whenever you need them and those symbols will have information to let the debugging utilities know where to grab the original source code from the TFS version control repository.

Ed Blankenship



Microsoft’s Islands in the Stream



There is a great article in the August 1, 2010 edition of the SD Times by Dave Worthington (@dcworthington) about the Visual Studio 2010 ALM tools including Team Foundation Server 2010.  It’s titled “Microsoft’s islands in the stream.”  Some really great colleagues in the ALM community were interviewed and provided some pretty honest feedback based on what we have been seeing over the last year or so.  Check it out!

 

Ed Blankenship



Book Review for Wrox Professional Application Lifecycle Management with Visual Studio 2010



During the first week of April, a little package was sitting on my front porch with the first book to be released on the Visual Studio 2010 release that deals with the new Application Lifecycle Management (ALM) features.  For those of you who don’t know, this essentially means the former “Team System” line of products as we were exposed to it in the 2005 and 2008 releases.  Although the entire Visual Studio suite of products is considered something that helps you with ALM, the book primarily focuses on Visual Studio 2010 Ultimate, Visual Studio 2010 Premium, Visual Studio 2010 Test Professional, Visual Studio 2010 Lab Management, and Team Foundation Server 2010.  In the Introduction, I even appreciated how the authors discussed “where Team System went.”  It’s the best explanation of the branding change that I’ve seen to date.

I was extremely excited to start reading the book immediately.  Even though I have been closely involved with the 2010 release as a Microsoft MVP, my goal when I started to read this book was to dig deeper into the feature set being introduced in the 2010 release.

At the time of writing this blog post, the book was selling for $34.64 at Amazon.  The suggested retail price is $54.99.  It is currently #7 in the Software Development books category!

Strengths

If you are new to the ALM features in Visual Studio, I felt this book really offered you the ability to get the high-level overview of all of those features.  It’s essentially similar to a “survey” course that you would have taken in college.   It’s 696 pages that ends up going through all of the Visual Studio client and server features at just the right level of detail. There were even some areas that I felt that I learned more about and hadn’t been exposed to heavily in the past.

The architecture features were something that I had hoped to learn the most about.  They have just never been something that I dove into in great detail during the 2010 release cycle.  The book covers all of the new UML diagrams that are available, along with the new architecture features like the Use Case, Activity, Sequence, Component, Class, Dependency, and Layer diagrams.  There was also a great introduction to the Architecture Explorer.

The testing features have really been what has made up a majority of the Visual Studio 2010 release and the book definitely reflects that.  Going through the testing features, I really felt like I understood the end to end story.  It felt very rounded out!  These chapters are where I picked up a majority of the nuggets of information.  I can’t tell you how many times I said “wow, I didn’t know you could do that.”  I also feel like this is a great place to pick up some introductory knowledge about how Visual Studio Team Lab Management fits into the ALM story.  I also kept thinking how great this book would be for the testers on your team that are new to the Microsoft testing platform and Team Foundation Server.

There are so many changes to TFS, I can’t even begin to describe them.  Thankfully, the book did a great job, especially with the revamp of Team Build to use Windows Workflow Foundation.  You can even download the Team Build chapter from the book for free here:  Team Foundation Build.  Other than automated builds, you’ll get a good pass over all of the rest of the new TFS 2010 features and architecture/topology changes.

There was a whole chapter dedicated to debugging with IntelliTrace!  That’s awesome.  I’m very much a fan of IntelliTrace and think that it will truly change the way you develop.

Criticisms

I have been hoping to have a book available out there that really only discusses TFS.  The book definitely has a few chapters on TFS and spends a good amount of time there, but that discussion is not the nitty gritty that I think some readers out there are really looking for.  With that said, I don’t think this book was positioned for the “TFS Administrator” exclusively.  Again, I really think this is a survey-level review of the entire ALM stack of features for Visual Studio.  That doesn’t allow you to go into the depths of any particular product.  There currently isn’t a book available for TFS 2010 with the level of detail that I am sure some readers out there are hoping for.  We’ll see what happens in the months to come…

My next criticism isn’t so much about the content of the book as about the media choices that are available.  I own a Kindle DX and I imagine a few other techies in the world have some type of eBook reader as well.  I was hoping to have a CD that contained a DRM-free PDF that I could copy over to my Kindle DX whenever I’m traveling and need a quick resource for reference.  Wrox certainly does allow you to get a PDF of its books but you have to order it separately even if you have purchased the hard copy.

Finally, the only other thing that I noticed was that the chapter about IntelliTrace (see above) didn’t mention Symbol & Source Server.  I couldn’t believe it.  There is definitely a discussion later in the book about Team Build’s integration with Symbol & Source Server but I was hoping to see some more detail in the IntelliTrace chapter about the importance of having them set up for your organization.  You’ll want to put two and two together.

 

Now that I’m finished scrounging from the bottom of the barrel to find some criticisms… :)

My Recommendation

Hands down, get this book.  I think it’s well worth it.  I know each of the authors and it really looks like they put a tremendous amount of effort into writing the book.  The topics are really presented well and at the right level of detail for someone really wanting a crash course in all of the Visual Studio ALM features.  I can’t even tell you how many new nuggets of information I ran across about things that I didn’t even realize were in the product.

It certainly gets my stamp of approval! :)  Kudos to the authors.

 

Very respectfully,

Ed Blankenship

Microsoft MVP of the Year, Visual Studio ALM and Team Foundation Server



Can I Collect an IntelliTrace Log in Production?



I’ve been hearing this question quite a bit…  “Can I collect an IntelliTrace log in Production?”  This would be a really good idea especially now that there is a standalone command-line utility, IntelliTrace.exe, that you can run to collect IntelliTrace log files.  Unfortunately, it looks like the Visual Studio 2010 Licensing White Paper answers that question for us on page 28:

The IntelliTrace DDA and/or IntelliTrace.exe cannot be used:

  • On a device or server in a production environment.
  • For purposes of system or application monitoring.
  • In non-interactive scenarios other than as part of an automated test or debugging-data collection session.

Bummer! :(  Honestly, I imagine that has to do with something around how IntelliTrace works and Microsoft doesn’t feel comfortable with the impact it may have on running production environments.  Just my conjecture though…

You’ll notice that you can use IntelliTrace in other instances though; most notably on development & test environments!

The IntelliTrace diagnostic data adapter (DDA) and/or IntelliTrace.exe can be used for test and debugging purposes:

  • As part of an interactive test or debugging session.
  • As part of an automated test or debugging-data collection session that is authored by a licensed user and triggered by the same or another licensed user.

You can even share IntelliTrace files between two companies as long as both companies are properly licensed!

IntelliTrace files may be shared among two or more companies as long as all users capturing and debugging IntelliTrace files are licensed with either Visual Studio 2010 Ultimate or Visual Studio Test Professional 2010, depending on the activities they are performing. For example, a company can share IntelliTrace files with an external development consultant. Similarly, a company can use an external company for testing purposes and debug IntelliTrace files provided by that vendor.

Here were the common scenarios mentioned in the licensing white paper.  See if you happen to fit into one of them:

Example 1: Finding a defect in a test environment Company A is building a Web application. All the developers are licensed for Visual Studio 2010 Ultimate with MSDN, and the testers are licensed with Visual Studio Test Professional 2010 with MSDN. During a test run a defect is discovered in the test environment that is difficult to reproduce in a development environment. The test machines have previously been configured with the Visual Studio Test Agent 2010, which includes the IntelliTrace DDA. The tester uses the Microsoft Test Manager to execute the test case with the IntelliTrace diagnostic data adapter (DDA) enabled. When the defect is encountered, the tester files a new bug, with the IntelliTrace files from each of the test machines is automatically attached to the bug. When a developer opens the bug using Visual Studio Ultimate, he or she can open the IntelliTrace files and step through the execution.

Example 2: Working with an external consultant In Example 1, Company A uses an external consultant to help with development. If the external consultant is licensed for Visual Studio Ultimate, he or she can open and debug the IntelliTrace files provided by Company A.

Example 3: Working with an external test vendor In Example 1, Company A uses Company B as an outsourced test vendor. The two companies can work together using IntelliTrace as long as all developers at Company A and all testers at company B are licensed appropriately.

I’m not sure what the minimal technical footprint is to get IntelliTrace.exe to collect an iTrace file just yet but my answer right now will be to have one of these installed:

  • Visual Studio 2010 Ultimate
  • Visual Studio 2010 Test Professional
  • Visual Studio 2010 Test Agents (additional software) <--- probably the smallest impact to a system

If I find out some more information about this scenario, then I’ll be putting together a future blog post!

 

Take care,

Ed Blankenship