Working in an “Enterprise” type environment means lots of fun obstacles getting in the way of your day-to-day work – the corporate proxy is one of my challenges.

Since giving up on CNTLM Proxy, I haven’t been able to connect to nuget.org from the package manager, view the Visual Studio Extension Gallery or even get any extension/product updates from Visual Studio.

This is a quick post with the changes I needed to get Visual Studio 2013 Update 3, NuGet 2.8 and Web Platform (Web PI) 5 to see past the corporate Squid proxy.

NuGet

I configured NuGet based on this Stack Overflow answer by arcain.

Running nuget.exe with the following switches will allow NuGet to use and authenticate with the proxy:

nuget.exe config -set http_proxy=http://proxy-server:3128
nuget.exe config -set http_proxy.user=DOMAIN\Dave
nuget.exe config -set http_proxy.password=MyPassword

This will put the values into your nuget.config file (with the password encrypted).

<configuration>
    <config>
        <add key="http_proxy" value="http://proxy-server:3128" />
        <add key="http_proxy.user" value="DOMAIN\Dave" />
        <add key="http_proxy.password" value="base64encodedHopefullyEncryptedPassword" />
    </config>
</configuration>

Once Visual Studio is restarted, it should be able to see through the proxy.

As per the comments on the answer, some people might have success without setting the password – sadly, not in my case. Also, remember that if you have to change your password (as I do every month or so) you will need to re-enter it here.

 

Visual Studio

I set up Visual Studio based on this blog post by Raffael Herrmann.

  1. Open the devenv.exe.config file. I find it by right-clicking the Visual Studio shortcut, selecting Properties and then “Open File Location”. If you have UAC enabled, you will need to open it in an editor running as Administrator.
  2. Scroll to the end of the file and find the system.net section:
    <!-- More -->
    </system.data>
    <system.net>
        <settings>
            <ipv6 enabled="true"/>
        </settings>
    </system.net>
    <appSettings>
    <!-- More -->
    
  3. Add the following below </settings>:
    <defaultProxy useDefaultCredentials="true" enabled="true">
        <proxy bypassonlocal="true" proxyaddress="http://proxy-server:3128" />
    </defaultProxy>
    
  4. The final version will look something like this:
    <system.net>
        <settings>
            <ipv6 enabled="true"/>
        </settings>
        <defaultProxy useDefaultCredentials="true" enabled="true">
            <proxy bypassonlocal="true" proxyaddress="http://proxy-server:3128" />
        </defaultProxy>
    </system.net>
    

Web Platform Installer

This was the same set of changes needed for Visual Studio, except with the WebPlatformInstaller.exe.config file, which I again obtained from the shortcut properties using “Open File Location”.

Thanks

Big thanks to Eric Cain and Raffael Herrmann for enabling me to connect to the internet again.


Yesterday, at work, I was trying to help someone enter a UNC path (\\server\share\file.txt) into a hyperlink control on our application, but it was rejecting it because it wasn’t a valid hyperlink. I discovered that you could enter a URI path (file://server/share/file.txt) and it worked fine. Problem solved? Not exactly, otherwise I wouldn’t have anything to write about.

The issue was that the users are non-technical folk and they can’t just fire up Linqpad and run the following code like I did to test if it worked:

new Uri(@"\\server\share\file.txt").AbsoluteUri 

The rules around converting UNC paths to URIs get a little tricky when you have paths with spaces and special characters in them, so I thought I would Google/Bing for an online UNC to URI Path Converter… it turns out I couldn’t find one, so I did what every software developer does: wrote one.

 

Screenshot

This took a little over an evening, mostly due to me being unfamiliar with CSS, jQuery (I know, but it’s quicker than Vanilla JS) and TypeScript (first time I’ve ever used it).

The entire website was set up in my VS Online account and linked to my Azure site as part of the new “one ASP.NET” setup template, and I even got to add Application Insights as part of the setup. A few clicks on the Azure Portal and I had a Continuous Delivery Build Definition set up in VS Online. All I had to do then was push changes from my local git repository to VS Online; TFS would build the code and, if the build succeeded, the site was updated within a few minutes.

The site works by making a quick AJAX HTTP GET when you click the Convert button to an ASP.NET MVC site, which uses the Uri class in .NET. That’s about it.
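For the curious, the core of the conversion can be sketched in plain JavaScript. This is only a rough approximation of what .NET’s Uri class does – the real site defers to Uri on the server, which handles far more edge cases than this:

```javascript
// Rough approximation of new Uri(uncPath).AbsoluteUri for UNC paths.
// Assumes a well-formed \\server\share\... input; the real converter
// relies on the .NET Uri class, which covers many more edge cases.
function uncToUri(uncPath) {
  if (!uncPath.startsWith("\\\\")) {
    throw new Error("Not a UNC path: " + uncPath);
  }
  // Drop the leading \\ and split into server + path segments.
  var segments = uncPath.slice(2).split("\\");
  var server = segments.shift();
  // Percent-encode each segment (spaces become %20, etc.).
  var encoded = segments.map(function (s) { return encodeURIComponent(s); });
  return "file://" + server + "/" + encoded.join("/");
}

console.log(uncToUri("\\\\server\\share\\file.txt"));
// file://server/share/file.txt
console.log(uncToUri("\\\\server\\share\\my file.txt"));
// file://server/share/my%20file.txt
```

For a path like \\server\share\my file.txt this produces file://server/share/my%20file.txt, which matches what the Uri class returns for the simple cases.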

Here’s the link for anyone who wants to use it: http://pathconverter.azurewebsites.net/

I’ve been keeping an eye on the development of Roslyn ever since Anders Hejlsberg announced Microsoft were open sourcing it at Build 2014. For those of you who don’t know what Roslyn is, it is Microsoft’s new C# and VB.NET compiler platform. There are many aspects to it, and if you want a nice overview Schabse Laks has a couple of blog posts on the subject.

Most developers seem to be excited by the new C# 6.0 language and the future of the language going forward – I can see why, C# hasn’t really changed that much recently. Whilst I am interested in the language, another thing that has got me really excited as a C# developer is the Language Service.

Language Service

The language service is what Visual Studio uses to talk to the compiler to understand your code. Previously this was implemented as a series of “hacks” between Visual Studio and the compilers, which made it hard for the Visual Studio team to implement many decent refactoring experiences. The last time anything new really appeared for the “coders” was in VS2010, when the existing refactoring tools were released. Two full versions of VS later, and we still have the same refactoring features. Whilst the C# team have been busy working on Roslyn, products like JetBrains ReSharper have come along with loads of features of their own, adding refactoring tools and code analysis. But I think times are changing…

For those of you who have installed “Visual Studio 14” or the Roslyn End User Preview, you are seeing the beginning of what I think will be a revolution in Visual Studio’s coding experience for C# (as well as VB.NET).

New Features

Here’s a quick overview of some of the features added already:

Visualising using directives that are not needed.

Remove Usings

Here you can see that I don’t need three of my five using directives so they are coloured grey.

Ctrl + . to Resolve

Taking the previous “issue” with my code I can just place the caret on any of the lines with an issue and press Ctrl + . (full stop) to fix it.

Remove Usings Action

You even get a preview of what your class will look like when “fixed”.

Ctrl + . has been used to resolve issues with your code before (e.g. to add using directives), but now it has previews and can suggest fixes to new problems. For example, it could suggest not to use Hungarian Notation and remove the prefix, which would be a much better experience than the existing Code Analysis CA1704 rule.

Alt + . and Refactoring

Since VS2010, the basic refactorings have been done via a series of keyboard combination shortcuts. Ctrl + R, M to Extract Method is the only one I remember; I know the others are there, but I can’t remember their shortcuts. Now everything is under Alt + . and is context sensitive.

So if I select a few statements in a block of code and press Alt + . I get the option to Extract Method; select a field and press Alt + . and I get Encapsulate Field; select the class name and I get Extract Interface – you get the idea.

The interface for renaming has changed too, to include live preview and errors.

New Method

Above you can see a New Method I’ve extracted and the live preview of renaming. If I decided to call “NewMethod” “Foo”, I’d get this error because I already have a method called Foo.

New Method Error

Code Lens

Code Lens is a heads up display that is already available in VS2013 Ultimate and is powered by Roslyn to provide code and source control information inline in the editor. I’ve only played with this during the CTP of VS2013, but it seemed really promising – I’m now waiting on Microsoft to move it from the Ultimate SKU to Premium.

Visual Studio 14

All the aforementioned features are in VS2013 + the End User Preview, but Roslyn will actually ship enabled in the next Visual Studio, code named “Visual Studio 14”. This has an early preview available, and already more refactoring features have appeared. For example, Soma mentions a new Introduce Local refactoring in his blog post.

This sort of stuff isn’t really ground breaking from a developer experience point of view and it’s nothing new for people who use Visual Studio add-ins like ReSharper. But, it is proof that Visual Studio’s Editor is back in the game.

My opinion is that from now until VS 14 RTM ships, the C# and VS teams will be focusing their efforts on stabilising the compiler aspect of Roslyn and making sure everything “just works”. We may see a few more refactoring tools appear in the next update, but I’m not expecting much.

After the RTM

I’m looking forward to what happens once Visual Studio 14 has hit RTM and shipped. The Visual Studio Editor team will have time to take a good hard look at what they want to build, and the C# compiler team can go back to working on language features. I expect the VS Editor team will be working on some awesome editor experiences – some we might have seen before in other products, others brand new. I cannot see them resting on their laurels; once they have the Language Services as a platform to build on, they can do anything they want.

Community

The other thing that really excites me is the community aspect: building your own VS Editor extensions that leverage the Language Services from the compiler is now easy. What ideas will the community build – what would you build? I’m hoping that individual extensions will appear first, then grow into suites of common functionality, and maybe even be implemented by the VS Editor team in the future.

It doesn’t just have to be extensions; it is also really easy to write your own diagnostics tools. Already there is a String.Format Diagnostic package on the VS Gallery. So, where does this leave products like ReSharper? In my opinion, Microsoft always implement the features most people want, so there will still be room for people to create their own extension suites, and JetBrains can charge for a professional package. However, it will now be a lot easier for people to build their own – no longer do you need to write your own C# code analyser – so there could be some awesome free open source extension suites appearing too.

Summary

I think the next few updates to Visual Studio will bring lots of exciting and powerful enhancements to the Visual Studio Editor for both C# and VB.NET developers. It’s been a long time coming, but I think Visual Studio is back on the front foot and ready to innovate some more.

I’m a premium NewsBlur member, and have been ever since Google decided to shut down Google Reader. Mostly my experiences have been very good, with great support from Samuel when I needed it.

However, there has been one issue nagging at me for quite a while: I could not get a working feed to Mark Seemann’s blog. I posted the problem on Get Satisfaction, but Samuel was unable to help due to the size of the feed. A few weeks ago a co-worker of mine mentioned that Yahoo Pipes should be able to sort this, so I finally gave it a try. For those (like me until recently) who don’t know what Yahoo Pipes is, it’s “a powerful composition tool to aggregate, manipulate, and mashup content from around the web”.

After a few minutes of tinkering, I had built a “pipe” that took the blog feed, performed a “truncate” operation down to 10 posts and output the feed again. I then took the RSS feed of the pipe and handed it to NewsBlur, and this time it coped perfectly.

Pipes

I’m sure there’s more I can do with it, but for now that’s all I need.

So, if anyone else needs a “Last 10 Feeds from Ploeh.dk blog”, you can get the link here.

When you work on projects of a certain size, you will find that you begin to add custom fields to your TFS Work Items that you later want to search on. In my case it was an External Requirement ID field that we use to store the ID of requirements that our customers use to track requirements. I often use these when communicating with our customers, or even members of my team.

For example:

  “Have you checked in A1 yet?” is easier to ask and understand than “Have you checked in 177484?”.

The problem that arises with this approach is being able to find a work item by its External Requirement ID.

To solve this issue, I once again turned to Linqpad and came up with a script that lets you search your work items by a comma-separated list of entries. After a bit of digging I managed to find the correct API to be able to produce reliable TFS Web Access URLs in the results:

Results

To use the script, just place your collection address and custom field name at the top. You can also add any other filtering into the WIQL; for example, you might only want to search a certain Area Path.
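At the heart of the script is a WIQL query. Assuming the custom field’s reference name is MyCompany.ExternalRequirementId (a made-up name for illustration; yours will differ), it looks something like this:

```sql
SELECT [System.Id], [System.Title], [System.State]
FROM WorkItems
WHERE [MyCompany.ExternalRequirementId] IN ('A1', 'A2')
ORDER BY [System.Id]
```

Extra filtering goes into the WHERE clause, e.g. AND [System.AreaPath] UNDER 'MyProject\SomeArea'.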

When you run the script you will be asked to “Enter External Requirement ID”; just enter the ID (e.g. A1) or a list of IDs (e.g. A1, A2) and press Enter.

I keep mine “Pinned” to the Jump List of Linqpad on my taskbar for ease of access.

You can download the script from here:

The script is based on my Linqpad TFS Template, if you need it for any other version of Visual Studio, download the correct template and copy and paste the body of the script between them.

Visual Studio 2013 has a re-designed Team Explorer, and I really like it, but I sometimes find some options, such as “Find Shelvesets” and “New Query”, to be buried away.

For example, the normal path I have to take to find a shelveset is:

  • Pending Changes
  • Actions
  • Find Shelvesets

This is really “clicky” and something I wished I could do more quickly.

Today I found out I could right-click on the tiles on the Team Explorer Home Page to get to some of those actions that are buried away:

Pending Changes

From Pending Changes you can get to Find Shelvesets

PendingChanges

Work Items

From Work Items you can get to “New Query”

WorkItems

My Work

From My Work you can get to Request Review

MyWork

Team Members

If you have the TFS 2013 Power Tools installed, then even the Team Members gets the right-click treatment with access to Team Utilities

TeamUtils

I recently started working on a new project and decided to use xUnit.net as my unit testing framework. I made a simple test class with a test method, but when I built it, Visual Studio Code Analysis complained with a “CA1822: Mark members as static” warning against my test method.

Let’s assume I have a test for MyClass that looks like this:

public class MyClassTest
{
    [Fact]
    public void TestMyProperty()
    {
        var myClass = new MyClass(1);
        Assert.Equal(1, myClass.MyProperty);
    }
}

 

In this version, Code Analysis is complaining because TestMyProperty does not use any instance members from MyClassTest. This is all well and good in production code, but I assumed (incorrectly) that you must have instance test methods for the test framework to pick them up. As it turns out, xUnit.net is better than that and works on static methods as well as instance methods.

If these were all the tests I needed, I should mark the method as static. If all your test methods are static, you should mark the class as static too – otherwise you will get a “CA1812: Avoid uninstantiated internal classes” warning from Code Analysis.

So the fixed version will look like:

public static class MyClassTest
{
    [Fact]
    public static void TestMyProperty()
    {
        var myClass = new MyClass(1);
        Assert.Equal(1, myClass.MyProperty);
    }
}

 

Now, those of you who have used MSTest and Code Analysis might notice two things here:

  1. xUnit.net is not bound by the same restrictions as MSTest. Anything public with a [Fact] attribute will be tested.
  2. Why does CA1822 not occur when using the [TestMethod] attribute from MSTest?

To answer #2, I turned to ILSpy, and pulled apart the Code Analysis rule assembly to find the following “hack” by Microsoft. If you want to take a look, the DLL is located in: %ProgramFiles(x86)%\Microsoft Visual Studio 12.0\Team Tools\Static Analysis Tools\FxCop\Rules\PerformanceRules.dll

There is a class for each rule, the one I was interested in was called MarkMembersAsStatic with the hack located in this method:

private static bool IsVSUnitTestMethod(Method method)
{
    if (method.get_IsStatic() || 
        method.get_Parameters().get_Count() > 0 || 
        method.get_ReturnType() != FrameworkTypes.get_Void())
    {
        return false;
    }
    for (int i = 0; i < method.get_Attributes().get_Count(); i++)
    {
        AttributeNode attributeNode = method.get_Attributes().get_Item(i);
        if (attributeNode.get_Type().get_Name().get_Name() == 
            "TestInitializeAttribute" || 
            attributeNode.get_Type().get_Name().get_Name() == 
            "TestMethodAttribute" || 
            attributeNode.get_Type().get_Name().get_Name() == 
            "TestCleanupAttribute")
        {
            return true;
        }
    }
    return false;
}

Remember, this is disassembled code and has been re-formatted by me.

That’s right, there is an explicit suppression of the CA1822 rule against any method with an attribute called [TestInitialize], [TestMethod] or [TestCleanup]. Well, that explains that little mystery.

 

So far I’ve not had any problems with xUnit.net now that I’ve figured this out, and at some point in the future I may post more about my journey through TDD with it.

Scott Hanselman has just posted on his blog about how easy it is to get a Windows 8/8.1 Live Tile for your own site. At the bottom he posted a link to the Custom Windows Pinned Tiles plugin for WordPress by Nick Hasley.

I thought I’d give it a go, and after a few minutes of clicking, and a little hacking in Paint.net to make an icon, I now have a live tile that displays updates from my RSS feed:

LiveTile

This really was easy. A big thanks to Nick for this awesome plugin.

In today’s post I’m going to go through the process for setting up the TFS Hands on Labs (HOLs) in Windows Azure.

The TFS Hands on Labs are great resources for learning the latest features in TFS. The latest TFS 2013 labs can be found here on Brian Keller’s blog.

I chose to use Windows Azure so that I can share the labs with my colleagues. I have an MSDN Subscription through my employment that gives me $150 (£130) of Azure credit a month, so I don’t have to worry too much about paying for all this.

Overview

My initial attempts to upload the VHD into Azure fell flat when I downloaded the 13GB RAR archive, unpacked it to a whopping 55GB VHD and then tried to upload it to Windows Azure. My poor 2MB cable internet just wouldn’t cut it – the estimate was at least a week to upload, and on top of that I was being severely throttled by my ISP.

So I came up with a cunning workaround to save my bandwidth:

  • Set up a “Downloader” VM on Windows Azure.
  • Download, unpack and re-upload the VHD on the “Downloader” VM in Windows Azure.
  • Create a VM for the Hands on Lab and mount the VHD.

Prerequisites

To follow this guide you will need a Windows Azure account, a basic understanding of consoles (Command Prompt / PowerShell), and to know a little bit about setting up Windows Servers.

1. Creating a “Downloader” VM

The first thing you need to do is create a “Downloader” Virtual Machine on Windows Azure to perform the download, unpack and re-upload process. Once you have completed this process and have your labs up and running, you can delete this “Downloader” VM to save money.

TFSHOL1

  • “Quick Create” a new Medium spec. VM. Select a data centre in the region closest to you.

TFSHOL2

  • Windows Azure will have a think for a couple of minutes and your new VM should appear.
  • When the status turns to “Running”, select it from your VM list and click the “>< Connect” button at the bottom of the screen.
  • This will download a remote desktop profile for your VM and you can login using the username and password you supplied.

2. Downloading, unpacking and re-uploading the VHD

These next steps will be performed on the new VM over remote desktop.

  • The first thing you need to do is download and install the Free Download Manager (FDM) tool.
  • I found it easier to download this on my local PC and then browse to it through the mapped remote desktop drives on the VM and copy it up. Internet Explorer on servers is annoyingly secured.
  • Once you have FDM installed, run it and import the list of URLs from Brian’s blog post.
  • This will take some time to download, so now’s a good time to get a cup of tea.
  • Once the download is complete, run the “VisualStudio2013.Preview.part01.exe” file on the server and click through the options to start the extraction. I did this on a Small VM and it took even longer than the download (that’s why I am suggesting a Medium VM).
  • Whilst it is unpacking, now is a good time to get the “Downloader” VM and Windows Azure set up ready to upload the VHD.

2.1 Preparing to Upload a VHD to Azure

  • First, grab a copy of Web Platform Installer and install it on the Server. Again, you may want to download it locally and copy it up to bypass Internet Explorer.
  • Open Web Platform Installer and install “Windows Azure PowerShell” along with any dependencies.
  • Open up PowerShell and run the following commands:
Set-ExecutionPolicy RemoteSigned
Import-Module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
  • This will setup PowerShell with all the Azure CmdLets.
  • Get a copy of your Publish Settings file from Azure using this PowerShell command. This opens up a browser window to download a file (again I did this locally and uploaded it).
Get-AzurePublishSettingsFile
  • Now you have your Publish Settings file you need to import them and select your subscription using the following PowerShell commands:
Import-AzurePublishSettingsFile 'C:\SubscriptionName-8-27-2013-credentials.publishsettings'
Set-AzureSubscription 'SubscriptionName'
  • Replace SubscriptionName with the name of your subscription.
  • Finally, verify the settings with the following command. You should see your subscription details listed.
Get-AzureSubscription

2.2 Creating your Storage

Now that you have PowerShell ready, you need somewhere to upload the VHD to on Azure.

  • Head back to the Windows Azure Portal.
  • On the left hand side, select Storage and click “+ New”.

TFSHOL3

  • Click “Quick Create”. Give it a name and select the same region as the Downloader VM.
  • Once Azure has created the Storage, you need a Container.
  • Select the storage you have just created, and navigate to the “Containers” tab and click “Add” at the bottom.

TFSHOL4

  • Give the Container a name and leave the access as “Private”.
  • You should now have everything set up to upload the VHD.

2.3 Uploading the VHD

  • Back on the Downloader VM, wait for the installer to complete, this may take some time.
  • Now that the installation is complete you can start the upload process.
  • To upload the VHD you need two pieces of information: the source and the destination.
  • The source will be something like:
    • C:\Downloads\Visual Studio 2013 Preview\WMIv2\Virtual Hard Disks\TD02WS12SFx64.vhd – where C:\Downloads is where you let FDM download the files to.
  • The destination you can get from the Azure Portal and will be something like:
    • http://storagename.blob.core.windows.net/containername/TD02WS12SFx64.vhd  – where storagename and containername are the names of your storage and container.
  • In PowerShell on the Downloader VM run the following PowerShell command:
Add-AzureVhd
  • And supply the Destination and LocalFilePath (Source) when prompted.
  • This upload will take a few hours, so keep an eye on the progress – I left mine overnight.
  • When this is complete, you can create your Hands on Lab VM.
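Put together, the full non-interactive invocation looks something like this – the storage name, container name and local path are the example values from earlier in this post, so substitute your own:

```powershell
# Example invocation; storage/container names and the local path are placeholders.
Add-AzureVhd -Destination "http://storagename.blob.core.windows.net/containername/TD02WS12SFx64.vhd" `
             -LocalFilePath "C:\Downloads\Visual Studio 2013 Preview\WMIv2\Virtual Hard Disks\TD02WS12SFx64.vhd"
```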

3. Creating the Disk and Hands on Lab VM

Now you should have everything you need to create your Hands on Lab VM.

  • Head back to the Windows Azure Portal again.
  • Go to “Virtual Machines” and select the “Disks” tab at the top.
  • Click “Create” and give your Disk a name and browse to the VHD in your storage container.

TFSHOL5

  • Once your Disk is created go back and create another Virtual Machine.
  • This time, don’t use “Quick Create”; instead select “From Gallery”.

TFSHOL6

  • When you use “From Gallery” you will see an option on the left that says “My Disks”.
  • Select the Disk that you just created and click “Next”.
  • Fill in the rest of the options for creating a Large VM (a Medium might also work).
  • Keep filling in the form until the end. Your VM will be created and start provisioning.
  • When this VM is provisioned and the status is “Running”, you are good to go.
  • Click on the “>< Connect” button for the new VM and you should be able to login as “Administrator”. The password is in the “Working with the Visual Studio 2013 ALM Virtual Machine” document that comes with the VM.

Final Steps

It is recommended that you secure the VM from the outside world – this Windows Azure VM is on the internet. So change the Administrator password and disable RDP access for any of the user accounts that don’t need it.

Conclusion

Setting up this VM was fun and a great excuse to learn the Windows Azure platform. Remember you can delete the Downloader VM and any associated artefacts (except the TFS HOL VHD) when you are done with it.

If you have any questions or comments, please contact me and I’ll do my best.

Associated Links / Credits

Here are a few links I used to pull all this together, if you’re stuck you might find something useful there.

Today’s topic is another TFS 2012 post-upgrade “fix”.

I upgraded our TFS 2010 instance to TFS 2012 four months ago and, slowly, I have been fixing up all the things that broke. I’ve been using the TFS Web Administration page as a guide to which jobs are still having problems. For those of you who don’t know what that is and have on-premise TFS 2012 (or greater), I heartily recommend you check out Grant Holliday’s blog post on it.

The problem we were seeing was that the Job Monitoring graph was reporting about 50% of all jobs as failures, and nearly all the failures were for one job type: the “LabManager VMM Server Background Synchronization” job. This runs every few minutes and tries to keep the Lab Manager server up to date. The problem is that we set up Lab Manager on our TFS 2010 instance and then tore down the VMM Lab Management server without allowing TFS to de-register it. The TFS Administrator console did not offer any options to remove the server.

I posted on the MSDN Forums but sadly, the suggestions from Microsoft didn’t help.

In the end I turned to ILSpy and started disassembling the TFS Server Side Assemblies and found references to a “Registry” that contained the settings for Lab Management. This Registry turned out to be stored in the TFS database.

Before we go any further, I just want to be 100% clear: messing around in the TFS database is not supported by myself or Microsoft. You do this of your own free will, and I will not support you if it goes wrong.

Within the Tfs_Configuration database there is a table called tbl_RegistryItems, which contains the configuration for Lab Management. Because we do not use Lab Management in any shape or form at the moment, I was happy to delete all our settings relating to it. If you do use Lab Management, then I don’t suggest you try this.
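Before touching anything, you can see exactly what is stored there with a simple SELECT over the same ParentPath (a sanity check of my own, run before the delete script below):

```sql
use Tfs_Configuration;

select *
from tbl_RegistryItems
where ParentPath = '#\Configuration\Application\LabManagementSettings\';
```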

After backing up all the data that I was about to delete, I ran the following SQL Script to delete all our Lab Management configuration:

use Tfs_Configuration;

delete tbl_RegistryItems
where ParentPath = '#\Configuration\Application\LabManagementSettings\';

The data in this table was cached, so I needed to restart the TFS services to get it to pick up the change. Once that was done, my Lab Manager job no longer reported an error, and my Job Monitoring pie chart is nearly 100% green.