I’ve been keeping an eye on the development of Roslyn ever since Anders Hejlsberg announced Microsoft were open sourcing it at Build 2014. For those of you who don’t know what Roslyn is, it is Microsoft’s new C# and VB.NET compiler platform. There are many aspects to it, and if you want a nice overview Schabse Laks has a couple of blog posts on the subject.

Most developers seem to be excited by the new C# 6.0 language and the future of the language – I can see why; C# hasn’t really changed that much recently. Whilst I am interested in the language, another thing that has got me really excited as a C# developer is the Language Service.

Language Service

The language service is what Visual Studio uses to talk to the compiler to understand your code. Previously this was implemented as a series of “hacks” between Visual Studio and the compilers. This made it hard for the Visual Studio team to implement many decent refactoring experiences. The last time anything new really appeared for “coders” was in VS2010, when the existing refactoring tools were released. Two full versions of VS later, and we still have the same refactoring features. Whilst the C# team have been busy working on Roslyn, products like JetBrains ReSharper have come along with loads of features of their own, adding refactoring tools and code analysis. But I think times are changing…

For those of you who have installed “Visual Studio 14” or the Roslyn End User Preview, you are seeing the beginning of what I think will be a revolution in Visual Studio’s coding experience for C# (as well as VB.NET).

New Features

Here’s a quick overview of some of the features added already:

Visualising using directives that are not needed.

Remove Usings

Here you can see that I don’t need three of my five using directives so they are coloured grey.

Ctrl + . to Resolve

Taking the previous “issue” with my code I can just place the caret on any of the lines with an issue and press Ctrl + . (full stop) to fix it.

Remove Usings Action

You even get a preview of what your class will look like when “fixed”.

Ctrl + . has been used to resolve issues with your code before (e.g. add using directives), but now it has previews and can suggest fixes for new problems. For example, it could suggest not to use Hungarian Notation and remove the prefix; this would be a much better experience than the existing Code Analysis CA1704 rule.

Alt + . and Refactoring

Since VS2010, the basic refactoring was done via a series of keyboard combination shortcuts. Ctrl + R, M to Extract Method is the only one I remember, the others I know are there but I can’t remember the keyboard shortcuts. Now everything is under Alt + . and is context sensitive.

So if I select a few statements in a block of code and press Alt + ., I get the option to Extract Method; select a field and press Alt + ., I get Encapsulate Field; select the class name, Extract Interface – you get the idea.

The interface for renaming has changed too to include live preview and errors.

New Method

Above you can see a New Method I’ve extracted and the Live Preview of renaming. If I decided to call “NewMethod” “Foo” I’d get this error, because I already have a method called Foo.

New Method Error

Code Lens

Code Lens is a heads-up display that is already available in VS2013 Ultimate and is powered by Roslyn to provide code and source control information inline in the editor. I’ve only played with this during the CTP of VS2013, but it seemed really promising – I’m now waiting on Microsoft to move it from the Ultimate SKU to Premium.

Visual Studio 14

All the aforementioned features are in VS2013 + End User Preview; Roslyn will actually ship enabled in the next Visual Studio, code named “Visual Studio 14”. This has an early preview available and already more refactoring features have appeared. For example, Soma mentions a new Introduce Local refactoring in his blog post.

This sort of stuff isn’t really ground breaking from a developer experience point of view and it’s nothing new for people who use Visual Studio add-ins like ReSharper. But, it is proof that Visual Studio’s Editor is back in the game.

My opinion is that from now until VS 14 RTM ships, the C# and VS teams will be focusing their efforts on stabilising the compiler aspect of Roslyn and making sure everything “just works”. We may see a few more refactoring tools appear in the next update, but I’m not expecting much.

After the RTM

I’m looking forward to when Visual Studio 14 has hit RTM and shipped – what happens next? The Visual Studio Editor team will have time to take a good hard look at what they want to build, and the C# compiler team can go back to working on language features. I expect the VS Editor team will be working on some awesome editor experiences, some we might have seen before in other products, others brand new. I cannot see them resting on their laurels; once they have the Language Services as a platform to build on, they can do anything they want.


The other thing that really excites me is the community aspect, building your own VS Editor extensions that leverage the Language Services from the compiler is now easy. What ideas will the community build, what would you build? I’m hoping that individual extensions will appear first, and then grow into suites of common functionality, and maybe even be implemented by the VS Editor team in the future.

It doesn’t just have to be extensions; it is also really easy to write your own diagnostics tools. Already there is a String.Format Diagnostic package on the VS Gallery. So, where does this leave products like ReSharper? In my opinion, Microsoft always implement the features most people want, so there will still be room for people to create their own extension suites and for JetBrains to charge for a professional package. However, now it will be a lot easier for people to build their own – no longer do you need to write your own C# code analyser – so there could be some awesome free open source extension suites appearing too.
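To give a flavour of how little code a diagnostic now takes, here is a minimal sketch against the Roslyn analyzer API. Bear in mind the API was still shifting between previews; the rule ID, names and the “m_” prefix check here are all invented for illustration:

```csharp
using System.Collections.Immutable;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Diagnostics;

// Illustrative only: flags any field whose name starts with "m_".
[DiagnosticAnalyzer(LanguageNames.CSharp)]
public class PrefixedFieldAnalyzer : DiagnosticAnalyzer
{
    static readonly DiagnosticDescriptor Rule = new DiagnosticDescriptor(
        "DEMO001", "Field uses a prefix", "Field '{0}' uses an 'm_' prefix",
        "Naming", DiagnosticSeverity.Warning, isEnabledByDefault: true);

    public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
    {
        get { return ImmutableArray.Create(Rule); }
    }

    public override void Initialize(AnalysisContext context)
    {
        // Called once per field symbol in the compilation
        context.RegisterSymbolAction(ctx =>
        {
            if (ctx.Symbol.Name.StartsWith("m_"))
                ctx.ReportDiagnostic(Diagnostic.Create(Rule,
                    ctx.Symbol.Locations[0], ctx.Symbol.Name));
        }, SymbolKind.Field);
    }
}
```

Package that up as a VSIX or NuGet package and Visual Studio squiggles the offending fields live as you type – no C# parser required.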


I think the next few updates for Visual Studio will bring lots of exciting and powerful enhancements to the Visual Studio Editor for both C# and VB.NET developers. It’s been a long time coming, but I think Visual Studio is back on the front foot and ready to innovate some more.


I’m a premium NewsBlur member, and have been ever since Google decided to shut down Google Reader. Mostly my experiences have been very good, with great support from Samuel when I needed it.

However, there has been one issue nagging at me for quite a while: I cannot get a feed for Mark Seemann’s blog. I posted the problem on Get Satisfaction, but Samuel was unable to help due to the size of the feed. A few weeks ago a co-worker of mine mentioned that Yahoo Pipes should be able to sort this, so I finally gave it a try. For those (like me until recently) who don’t know what Yahoo Pipes is, it’s “a powerful composition tool to aggregate, manipulate, and mashup content from around the web”.

After a few minutes tinkering, I had finally built a “pipe” that took the blog feed, performed a “truncate” operation to 10 posts and output the feed again. I then took the RSS feed to the pipe and handed it to NewsBlur and this time it coped perfectly with the feed.


I’m sure there’s more I can do with it, but for now that’s all I need.

So, if anyone else needs a “Last 10 Feeds from Ploeh.dk blog” you can get the link here.

When you work on projects of a certain size, you will find that you begin to add custom fields to your TFS Work Items that you later want to search on. In my case it was an External Requirement ID field that we use to store the ID of requirements that our customers use to track requirements. I often use these when communicating with our customers, or even members of my team.

For example:

  “Have you checked in A1 yet?” is easier to ask and understand than “Have you checked in 177484?”.

The problem that arises with this approach is being able to find a work item by its External Requirement ID.

To solve this issue, I once again turned to Linqpad and came up with a script that lets you search a comma-separated list of entries against your work items. After a bit of digging I managed to find the correct API to be able to produce reliable TFS Web Access URLs in the results:
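The full script is available from the download link below; at its core it just builds a WIQL query over the custom field, something along the lines of this sketch (the field reference name “MyCompany.ExternalId” is invented – substitute your own):

```csharp
using System;
using System.Linq;

// Hypothetical sketch of the WIQL the script runs.
var ids = "A1, A2".Split(',').Select(s => s.Trim());
var inList = string.Join("', '", ids);
var wiql = string.Format(
    @"SELECT [System.Id], [System.Title]
      FROM WorkItems
      WHERE [MyCompany.ExternalId] IN ('{0}')", inList);

// workItemStore.Query(wiql) then yields the matching work items,
// and any extra filtering (e.g. an Area Path clause) slots into the WHERE.
```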


To use the script, just place your collection address and custom field name at the top. You can also add any other filtering into the WIQL; for example, you might only want to search a certain Area Path.

When you run the script you will be asked to “Enter External Requirement ID”; just enter an ID, e.g. A1, or a list of IDs, e.g. A1, A2, and press Enter.

I keep mine “Pinned” to the Jump List of Linqpad on my taskbar for ease of access.

You can download the script from here:

The script is based on my Linqpad TFS Template, if you need it for any other version of Visual Studio, download the correct template and copy and paste the body of the script between them.

Visual Studio 2013 has a re-designed Team Explorer, and I really like it, but I sometimes find some options, such as “Find Shelvesets” and “New Query”, to be buried away.

For example, the normal path I have to take to find a shelveset is:

  • Pending Changes
  • Actions
  • Find Shelvesets

This is really “clicky” and something I wished I could do quicker.

Today I found out I could right-click on the tiles on the Team Explorer Home Page to get to some of those actions that are buried away:

Pending Changes

From Pending Changes you can get to Find Shelvesets


Work Items

From Work Items you can get to “New Query”


My Work

From My Work you can get to Request Review


Team Members

If you have the TFS 2013 Power Tools installed, then even Team Members gets the right-click treatment, with access to the Team Utilities.


I recently started working on a new project and decided to use xUnit.net as my unit testing framework. I made a simple test class with a test method, but when I build it, Visual Studio Code Analysis complained with a “CA1822: Mark members as static” warning against my test method.

Let’s assume I have a test for MyClass that looks like this:

public class MyClassTest
{
    [Fact]
    public void TestMyProperty()
    {
        var myClass = new MyClass(1);
        Assert.Equal(1, myClass.MyProperty);
    }
}


In this version, Code Analysis is complaining because TestMyProperty does not use any instance members from MyClassTest. This is all well and good in production code, but I assumed (incorrectly) that you must have instance test methods for the test framework to pick them up. As it turns out, xUnit.net is better than that and works on static methods as well as instance methods.

If these were all the tests I needed, I should mark the method as static. If all your test methods are static, you should mark the class as static too – otherwise you will get a “CA1812: Avoid uninstantiated internal classes” warning from Code Analysis.

So the fixed version will look like:

public static class MyClassTest
{
    [Fact]
    public static void TestMyProperty()
    {
        var myClass = new MyClass(1);
        Assert.Equal(1, myClass.MyProperty);
    }
}


Now, those of you who have used MSTest and Code Analysis might notice two things here:

  1. xUnit.net is not bound by the same restrictions as MSTest. Anything public with a [Fact] attribute will be tested.
  2. Why does CA1822 not occur when using the [TestMethod] attribute from MSTest?

To answer #2, I turned to ILSpy, and pulled apart the Code Analysis rule assembly to find the following “hack” by Microsoft. If you want to take a look, the DLL is located in: %ProgramFiles(x86)%\Microsoft Visual Studio 12.0\Team Tools\Static Analysis Tools\FxCop\Rules\PerformanceRules.dll

There is a class for each rule, the one I was interested in was called MarkMembersAsStatic with the hack located in this method:

private static bool IsVSUnitTestMethod(Method method)
{
    if (method.get_IsStatic() || 
        method.get_Parameters().get_Count() > 0 || 
        method.get_ReturnType() != FrameworkTypes.get_Void())
    {
        return false;
    }
    for (int i = 0; i < method.get_Attributes().get_Count(); i++)
    {
        AttributeNode attributeNode = method.get_Attributes().get_Item(i);
        if (attributeNode.get_Type().get_Name().get_Name() == 
            "TestInitializeAttribute" || 
            attributeNode.get_Type().get_Name().get_Name() == 
            "TestMethodAttribute" || 
            attributeNode.get_Type().get_Name().get_Name() == 
            "TestCleanupAttribute")
        {
            return true;
        }
    }
    return false;
}
Remember, this is disassembled code and has been re-formatted by me.

That’s right, there is an explicit suppression of the CA1822 rule against any method with an attribute called [TestInitialize], [TestMethod] or [TestCleanup]. Well, that explains that little mystery.


So far I’ve not had any problems with xUnit.net now that I’ve figured this out, and at some point in the future I may post more about my journey through TDD with it.

Scott Hanselman has just posted on his blog about how easy it is to get a Windows 8/8.1 Live Tile for your own site. At the bottom he posted a link to the Custom Windows Pinned Tiles plugin for WordPress by Nick Hasley.

I thought I’d give it a go, and after a few minutes of clicking, and a little hacking in Paint.net to make an icon, I now have a live tile that displays updates from my RSS feed:


This really was easy, a big thanks to Nick for this awesome plugin.

In today’s post I’m going to go through the process for setting up the TFS Hands on Labs (HOLs) in Windows Azure.

The TFS Hands on Labs are great resources for learning the latest features in TFS. The latest TFS 2013 labs can be found here on Brian Keller’s blog.

I chose to use Windows Azure so that I can share the labs with my colleagues. I have an MSDN Subscription through my employment that gives me $150 (£130) of Azure credit a month, so I don’t have to worry too much about paying for all this.


My initial attempts to upload the VHD into Azure fell flat when I downloaded the 13GB RAR archive, unpacked it to a whopping 55GB VHD and then tried to upload it to Windows Azure. My poor 2MB cable internet just wouldn’t cut it – the estimate was at least a week to upload – and on top of that I was being severely throttled by my ISP.

So I came up with a cunning workaround to save my bandwidth:

  • Set up a “Downloader” VM on Windows Azure.
  • Download, unpack and re-upload the VHD on the “Downloader” VM in Windows Azure.
  • Create a VM for the Hands on Lab and mount the VHD.


To follow this guide you will need a Windows Azure account and a basic understanding of Consoles (Command Prompt / PowerShell) and know a little bit about setting up Windows Servers.

1. Creating a “Downloader” VM

The first thing you need to do is create a “Downloader” Virtual Machine on Windows Azure to perform the download, unpack and re-upload process. Once you have completed this process and you have your Labs up and running you can delete this “Downloader” VM to save your money.


  • “Quick Create” a new Medium spec. VM. Select a data centre in the region closest to you.


  • Windows Azure will have a think for a couple of minutes and your new VM should appear.
  • When the status turns to “Running”, select it from your VM list and click the “>< Connect” button at the bottom of the screen.
  • This will download a remote desktop profile for your VM and you can login using the username and password you supplied.

2. Downloading, unpacking and re-uploading the VHD

These next steps will be performed on the new VM over remote desktop.

  • The first thing you need to do is download and install the Free Download Manager (FDM) tool.
  • I found it easier to download this on my local PC and then browse to it through the mapped remote desktop drives on the VM and copy it up. Internet Explorer on servers is annoyingly secured.
  • Once you have FDM installed, run it and import the list of URLs from Brian’s blog post.
  • This will take some time to download, so now’s a good time to get a cup of tea.
  • Once the download is completed, run the “VisualStudio2013.Preview.part01.exe” file on the server and click through the options to start the extraction. I did this on a Small VM and it took even longer than the download (that’s why I am suggesting a Medium VM).
  • Whilst it is unpacking, now is a good time to get the “Downloader” VM and Windows Azure setup ready to upload the VHD.

2.1 Preparing to Upload a VHD to Azure.

  • First, grab a copy of Web Platform Installer and install it on the Server. Again, you may want to download it locally and copy it up to bypass Internet Explorer.
  • Open Web Platform Installer and install “Windows Azure PowerShell” along with any dependencies.
  • Open up PowerShell and run the following commands:
Set-ExecutionPolicy RemoteSigned
Import-Module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
  • This will setup PowerShell with all the Azure CmdLets.
  • Get a copy of your Publish Settings file from Azure using the Get-AzurePublishSettingsFile PowerShell command. This opens up a browser window to download a file (again I did this locally and uploaded it).
  • Now you have your Publish Settings file you need to import them and select your subscription using the following PowerShell commands:
Import-AzurePublishSettingsFile 'C:\SubscriptionName-8-27-2013-credentials.publishsettings'
Set-AzureSubscription 'SubscriptionName'
  • Replace SubscriptionName with the name of your subscription.
  • Finally, verify the settings with the following command. You should see your subscription details listed.
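The verification command itself didn’t survive the copy across to this post; with the classic Windows Azure PowerShell module it is most likely just:

```powershell
# Lists the imported subscription(s) – your subscription details should appear
Get-AzureSubscription
```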

2.2 Creating your Storage

Now that you have PowerShell ready, you need somewhere to upload the VHD to on Azure.

  • Head back to the Windows Azure Portal.
  • On the left hand side, select Storage and click “+ New”.


  • Click “Quick Create”. Give it a name and select the same region as the Downloader VM.
  • Once Azure has created the Storage, you need a Container.
  • Select the storage you have just created, and navigate to the “Containers” tab and click “Add” at the bottom.


  • Give the Container a name and leave the access as “Private”.
  • You should now have everything set up to upload the VHD.

2.3 Uploading the VHD

  • Back on the Downloader VM, wait for the installer to complete, this may take some time.
  • Now that the installation is complete you can start the upload process.
  • To upload the VHD you need 2 pieces of information, the source and destination.
  • The source will be something like:
    • C:\Downloads\Visual Studio 2013 Preview\WMIv2\Virtual Hard Disks\TD02WS12SFx64.vhd – where C:\Downloads is where you let FDM download the files to.
  • The destination you can get from the Azure Portal and will be something like:
    • http://storagename.blob.core.windows.net/containername/TD02WS12SFx64.vhd  – where storagename and containername are the names of your storage and container.
  • In PowerShell on the Downloader VM run the following PowerShell command:
  • And supply the Destination and LocalFilePath (Source) when prompted.
  • This upload will take a few hours, so keep an eye on the progress – I left mine overnight.
  • When this is complete, you can create your Hands on Lab VM.
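The upload command was lost from the post; assuming the classic Windows Azure PowerShell module, it is most likely Add-AzureVhd, which prompts for its mandatory parameters if you omit them (the paths below are the examples from the steps above):

```powershell
# Uploads the unpacked VHD to your blob storage container
Add-AzureVhd -Destination 'http://storagename.blob.core.windows.net/containername/TD02WS12SFx64.vhd' `
             -LocalFilePath 'C:\Downloads\Visual Studio 2013 Preview\WMIv2\Virtual Hard Disks\TD02WS12SFx64.vhd'
```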

3. Creating the Disk and Hands on Lab VM

Now you should have everything you need to create your Hands on Lab VM.

  • Head back to the Windows Azure Portal again.
  • Go to “Virtual Machines” and select the “Disks” tab at the top.
  • Click “Create” and give your Disk a name and browse to the VHD in your storage container.


  • Once your Disk is created go back and create another Virtual Machine.
  • This time, don’t use “Quick Create”, instead select “From Gallery”


  • When you use “From Gallery” you will see an option on the left that says “My Disks”.
  • Select the Disk that you just created and click “Next”.
  • Fill in the rest of the options for creating a Large VM (a medium also might work).
  • Keep filling in the form until the end. Your VM will be created and start provisioning.
  • When this VM is created and the status is “Running”, you are good to go.
  • Click on the “>< Connect” button for the new VM and you should be able to login as “Administrator”. The password is in the “Working with the Visual Studio 2013 ALM Virtual Machine” document that comes with the VM.

Final Steps

It is recommended that you secure the VM from the outside world – this Windows Azure VM is on the internet. So change the Administrator password and disable RDP access for any of the user accounts that don’t need it.


Setting up this VM was fun and a great excuse to learn the Windows Azure platform. Remember you can delete the Downloader VM and any associated artefacts (except the TFS HOL VHD) when you are done with it.

If you have any questions or comments, please contact me and I’ll do my best.

Associated Links / Credits

Here are a few links I used to pull all this together, if you’re stuck you might find something useful there.

Today’s topic is another TFS 2012 post-upgrade “fix”.

I upgraded our TFS 2010 instance to TFS 2012 four months ago and, slowly, I have been fixing up all the things that broke. I’ve been using the TFS Web Administration page as a guide for what jobs are still having problems. For those of you who don’t know what that is and have on-premise TFS 2012 (or greater), I heartily recommend you check out Grant Holliday’s blog post on it.

The problem we were seeing was that the Job Monitoring graph was reporting about 50% of all jobs as failures. And nearly all the failures were for one job type, the “LabManager VMM Server Background Synchronization” job. This runs every few minutes and tries to keep the LabManager server up to date. The problem is that we setup LabManager on our TFS 2010 instance, and then tore down the VMM Lab Management Server without allowing TFS to de-register it. The TFS Administrator console did not offer any options to remove the server.

I posted on the MSDN Forums but sadly, the suggestions from Microsoft didn’t help.

In the end I turned to ILSpy and started disassembling the TFS Server Side Assemblies and found references to a “Registry” that contained the settings for Lab Management. This Registry turned out to be stored in the TFS database.

Before we go any further, I just want to be 100% clear that messing around in the TFS databases is not supported by myself, or Microsoft; you do this of your own free will, and I will not support you if it goes wrong.

Within the Tfs_Configuration database there is a table called tbl_RegistryItems. This contains the configuration for Lab Management. Because we do not use Lab Management in any shape or form at the moment, I was happy to delete all our settings relating to it. If you do use Lab Management, then I don’t suggest you try this.

After backing up all the data that I was about to delete, I ran the following SQL Script to delete all our Lab Management configuration:

use Tfs_Configuration;

delete tbl_RegistryItems
where ParentPath = '#\Configuration\Application\LabManagementSettings\';

The data in this table was cached, so I needed to restart my TFS services to get it to pick up the change. Once that was done, my Lab Manager job no longer reported an error and my Job Monitoring pie chart is nearly 100% green now.

Since upgrading to TFS 2012.2 (from Update 1) I have been seeing the following error in a couple of places:

Server was unable to process request. ---> 
There was an error generating the XML document. ---> 
TF20507: The string argument contains a character that is not valid:'u8203'. 
Correct the argument, and then try the operation again.

It first appeared when loading VS2012.2 with TFS Sidekicks 4.5 installed. It also appeared when I called IIdentityManagementService.ReadIdentities() via the TFS 2012 Object Model. I guess this is what TFS Sidekicks is calling under the covers.

It turned out that this was caused by the Zero Width Space Unicode character (\u200b) being present in a Team’s Description field. Some of our users had copied and pasted into there and brought it along from somewhere; where it came from I’ll probably never know.
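If you just want to check a suspect string for the offending character, a quick sketch in C# (the sample text here is invented):

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        // Invented sample: a description with a stray zero width space in it
        var description = "My Team\u200BDescription";

        // Report any characters outside the printable ASCII range
        foreach (var c in description.Where(c => c > 127 || char.IsControl(c)))
            Console.WriteLine("Suspicious character: U+{0:X4}", (int)c);

        // Strip the zero width spaces out
        var cleaned = new string(
            description.Where(c => c != '\u200B').ToArray());
        Console.WriteLine(cleaned);
    }
}
```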

To track down which Team had this problem was quite a trek. Firstly, this doesn’t happen using the V10 (TFS 2010) Object Model, only when referencing the V11 Team Foundation client assemblies.

Using the following code in my TFS Template (2012 Version) as a base:

var managementService = tfs.GetService<IIdentityManagementService>();
// Read the members (an IdentityDescriptor[]) of the group being expanded;
// the group name here is an example
var members = managementService
    .ReadIdentity(IdentitySearchFactor.AccountName,
                  "Project Collection Valid Users",
                  MembershipQuery.Direct,
                  ReadIdentityOptions.None)
    .Members;

I then call this method to get the crash:

var nodeMembers = managementService.ReadIdentities(
    members, MembershipQuery.Direct, ReadIdentityOptions.None);

The problem is, in my TFS Instance, there are 634 “members” and I had no idea which one might be causing the problem:

Replacing the above line with a simple loop and try…catch block reduced that for me:

foreach (var member in members)
{
    try
    {
        var nodeMembers = managementService.ReadIdentities(
            new [] { member, },
            MembershipQuery.Direct,
            ReadIdentityOptions.None);
    }
    catch (Exception e)
    {
        // Linqpad's Dump() shows which member blew up, and why
        member.Dump();
        e.Dump();
    }
}

I now had the Identity of the member with an issue, the problem I now faced was getting the name of the member.

A typical Identity looks like this:


In the end I fired up SQL Profiler, pointed it at TFS and re-ran my code snippet. I then stopped the trace and searched for the first part of my Identity: 1441374244. This led me to the following SQL statement:

declare @p3 dbo.typ_KeyValuePairInt32StringVarcharTable;
insert into @p3 values(0,'S-1-9-1441374244-2649122007-3436464326-2922169763-974421344-0-0-0-0-3');

exec prc_ReadGroups 

By copying this into SQL Management Studio and running it against the Tfs_Configuration database (in a begin tran…rollback tran block, of course) I was able to get details of the Identity, in my case it was a Team.

I viewed the results of this Query as “Text” from SQL Management Studio and pasted them into a blank UTF-8 document in Notepad++. Flicking to the trusty Hex Editor plugin I could soon see some characters that did not belong in the Description text. These were the U+200B Zero Width Spaces.

To fix this I just opened up my Team Web Access, navigated to the Team’s page and deleted the Description, re-typed it and saved. Re-running my Linqpad sample proved the issue was solved.

I’ve logged this as a Connect Bug as I am sure it is not intentional and it only started happening since installing Update 2.


Updated 03-Jun-2013: Changed Dropbox to SkyDrive after Outdoor Navigation changed.

This post is going to cover a combination of two of the things I love, Linqpad and Geocaching.

If you’re not familiar with Geocaching, but have any love for the outdoors, then I heartily recommend you try it. It’s pretty much treasure hunting for grown-ups, using grown-up toys. It’s also great for the family; I never go without taking my son with me.

If you’re not familiar with Linqpad, go get acquainted, I use it for so many different roles, in this case, scripting C#.

As I am only an amateur Geocacher, my GPS device of choice is my Windows Phone (currently a Lumia 920 – which I also love) and GPS Tuner’s Outdoor Navigation. Outdoor Navigation is ace; it has many features, making it a great companion when out and about. However, it does not have full integration with Geocaching.com, so you have to use SkyDrive to synchronise a “loc” file downloaded from Geocaching.com.

This brings us to the problem: each loc file is one Point of Interest in Outdoor Navigation, meaning that if I plan to stock up on (say five) caches before heading out I need to download five files, copy five files to SkyDrive, fire up Outdoor Navigation and import each file, one after another from the list. Downloading the files is easy enough, my browser only requires one click. Uploading to SkyDrive is equally easy, just multi-select and copy. But the import into Outdoor Navigation is not multi-select or fluid to work through a long list. The import process was starting to get tedious after a while, so I decided to dig into the loc files to see if there was a way to speed things up.

Here’s a loc file for just one Cache (from today’s outing):

<?xml version="1.0" encoding="UTF-8"?>
<loc version="1.0" src="Groundspeak">
    <waypoint>
        <name id="GC37092">
            <![CDATA[A Longer Dog Walk #1 by The Briar Rose]]>
        </name>
        <coord lat="53.602717" lon="-1.79085"/>
        <link text="Cache Details">http://www.geocaching.com/seek/cache_details.aspx?wp=GC37092</link>
    </waypoint>
</loc>
It’s an XML document with a “loc” root element and a “waypoint” for the Point of Interest. After a bit of experimentation I found I could put any number of “waypoint” elements into the document and that Outdoor Navigation would only have to import one file that contained any number of Points of Interest. Bingo!
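A combined file, then, ends up looking like this (the second waypoint is invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<loc version="1.0" src="Groundspeak">
    <waypoint>
        <name id="GC37092"><![CDATA[A Longer Dog Walk #1 by The Briar Rose]]></name>
        <coord lat="53.602717" lon="-1.79085"/>
        <link text="Cache Details">http://www.geocaching.com/seek/cache_details.aspx?wp=GC37092</link>
    </waypoint>
    <waypoint>
        <name id="GC00000"><![CDATA[A second, made-up cache]]></name>
        <coord lat="53.6" lon="-1.8"/>
        <link text="Cache Details">http://www.geocaching.com/seek/cache_details.aspx?wp=GC00000</link>
    </waypoint>
</loc>
```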

Now I had a way to import just one file, but to do that I needed to combine all the individual loc files from Geocaching.com. The next issue was how to combine them; enter Linqpad, as always.

Using a bit of Linq-Xml I was able to load each file, grab the “waypoint” element(s) and then combine them into a new XML document that I could save to disk and even auto upload to SkyDrive – providing I used the SkyDrive Application for Windows.

First. Load all the loc files I have downloaded.

var locationElements = Directory
  .EnumerateFiles(folderPath, "*.loc")
  .SelectMany(path => new LocFile(path).GetElements());

There’s a bit going on here. We start by grabbing any file with the “loc” extension in the folder stored in the folderPath variable – “Downloads\GC” in my case – creating a new instance of the LocFile class, calling GetElements() and combining all the results together. I’m using SelectMany instead of Select because I am assuming there could be more than one “waypoint” element in a loc file. What we are left with is an IEnumerable<XElement> containing all the “waypoint” elements from all the files.

Second. Creating the new XDocument:

var combinedLocFiles = 
  new XDocument(
    new XElement("loc",
        new XAttribute("version", "1.0"),
        new XAttribute("src", "Groundspeak"),
        locationElements));
Here we are creating a new XDocument with a root element of “loc” with the attributes “version” and “src” with the values of “1.0” and “Groundspeak”, respectively (the first, second and last lines in the above XML), and then filling the root element with the contents of locationElements – the “waypoint” elements from the first snippet.

Finally. The LocFile class.

class LocFile
{
    readonly String _filePath;

    public LocFile(String filePath)
    {
        _filePath = filePath;
    }

    public ReadOnlyCollection<XElement> GetElements()
    {
        // Load the file and return its "waypoint" elements
        return XDocument.Load(_filePath)
            .Root
            .Elements("waypoint")
            .ToList()
            .AsReadOnly();
    }
}

This is the class that represents a loc file and knows how to get the relevant Elements from it.

These three snippets are the meat of the script, everything else is just fluff to get the correct paths and save the output and copy it to SkyDrive.

It’s pretty easy to use, just fire up Linqpad, load the Script and then run it. Before I start I ensure that there are only the loc files I want to combine in my “folderPath”, but other than that, you’re away.

If you need to change the paths, that’s at the top of the script and is pretty straight forward too.


Here’s the link to download the “linq” file from my SkyDrive.