When .net assemblies aren’t properly versioned, odd errors happen

Today I upgraded a SignalR library from 1.x to 2.0 in the MVC 5 (server) and WPF (client) projects I am developing. The upgrade itself went smoothly, but afterwards the client started yielding odd errors and refused to connect anymore. It threw an exception saying something like "Transport connection timed out" (I can’t remember the exact message). Since it was a rather simple upgrade, this error made no sense. Google didn’t help much either.

So I went debugging.

Investigation

The first step was to create a simple repro sample – a solution with a basic server and a console client (see the attached zip).

The second step was to turn on SignalR tracing on the client:

hubConnection.TraceLevel = TraceLevels.All;
hubConnection.TraceWriter = Console.Out;
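
For context, these two lines hang off the client’s HubConnection. A minimal console-client sketch might look like this (the port and the hub name formsHub are taken from the trace below; everything else is an assumption to be adjusted to your project):

var hubConnection = new HubConnection("http://localhost:53705/");
hubConnection.TraceLevel = TraceLevels.All;
hubConnection.TraceWriter = Console.Out;

// "formsHub" is the hub name that shows up in the trace below
IHubProxy proxy = hubConnection.CreateHubProxy("formsHub");
hubConnection.Start().Wait();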

Armed with these two steps I was able to get an error that made a bit more sense, although it was even more bizarre.

18:43:53.9073548 - null - ChangeState(Disconnected, Connecting)
18:43:55.4913444 - b8fd1874-483d-4d89-8a69-c2cfe33cf946 - WS Connecting to: ws://localhost:53705/signalr/connect?transport=webSockets&connectionToken=0dQ3scw2aJ7RbXuUsTfa4wlY%2FJlMzW%2FVL1sVb%2FyswewEO8n4qAth7Erpt0ga0laLV%2B8BCD833lZZy1MblOe0HuQs0O1mYi%2FfiiLSHc5%2F%2B1wk6tUsbFxMKVQFwRO9BvSW&connectionData=[{"Name":"formsHub"}]
18:43:55.5800651 - b8fd1874-483d-4d89-8a69-c2cfe33cf946 - WS: OnMessage({"C":"d-5AB67EA2-B,0|C,0|D,1|E,0","S":1,"M":[]})
18:43:55.5940751 - b8fd1874-483d-4d89-8a69-c2cfe33cf946 - OnError(System.MissingMethodException: Method not found: 'System.Collections.Generic.IEnumerator`1 Newtonsoft.Json.Linq.JArray.GetEnumerator()'. at Microsoft.AspNet.SignalR.Client.Transports.TransportHelper.ProcessResponse(IConnection connection, String response, Boolean& shouldReconnect, Boolean& disconnected, Action onInitialized) at Microsoft.AspNet.SignalR.Client.Transports.WebSocketTransport.OnMessage(String message) at Microsoft.AspNet.SignalR.WebSockets.WebSocketHandler.d__e.MoveNext())
18:43:55.6110864 - b8fd1874-483d-4d89-8a69-c2cfe33cf946 - WS: OnClose()

It says something like: I can’t find a method in the Newtonsoft.Json assembly. That’s totally weird, as this kind of error should be caught at compile time.

The third step was to look into the Newtonsoft.Json assembly referenced by SignalR 2.0. It correctly references version 5.0.8, and the assembly in the packages folder does have the method reported as missing. Mystery. One thing that caught my eye while inspecting the assembly with a .net reflector was its version: while the file version is 5.0.8, the assembly version is, oddly, 4.5.0.0. How come?
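
If you want to verify this yourself, a quick sketch along these lines prints both numbers side by side (the package path is an assumption; point it at the DLL in your packages folder):

// Compare the assembly version (what the loader binds against) with the file version.
string path = @"packages\Newtonsoft.Json.5.0.8\lib\net45\Newtonsoft.Json.dll"; // assumed path
Version assemblyVersion = System.Reflection.AssemblyName.GetAssemblyName(path).Version;   // 4.5.0.0
string fileVersion = System.Diagnostics.FileVersionInfo.GetVersionInfo(path).FileVersion; // 5.0.8.x
Console.WriteLine("Assembly version: {0}, file version: {1}", assemblyVersion, fileVersion);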

The fourth step was to look into the modules loaded by my application. They are listed in Visual Studio under Debug/Windows/Modules. Amazingly, Visual Studio was listing Newtonsoft.Json v4.05.8.15203 from … the GAC. Not from my package. And by looking at its date it was clear that it was an older version (4.8.2012). Heck, I didn’t even know it was in the GAC. I didn’t put it there; some other app must have.

 

From previous Googling I remembered an explanation somewhere that the missing GetEnumerator method was introduced in version 5.0.5. So this was probably it – .net was loading an older version from the GAC, one that really was missing that method. At this point I knew what was wrong.

Just to make sure, I checked whether the assembly really was in the GAC:

C:\Windows\system32>gacutil /l newtonsoft.json
Microsoft (R) .NET Global Assembly Cache Utility.  Version 4.0.30319.18020
Copyright (c) Microsoft Corporation.  All rights reserved.

The Global Assembly Cache contains the following assemblies:
  newtonsoft.json, Version=4.5.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed, processorArchi
tecture=MSIL

Number of items = 1

It was.

Remedy

The proper way would be to have a properly versioned Newtonsoft.Json assembly (they are strong-name signed as well). That’s what versioning is for, and .net wouldn’t get confused. I really wonder why James uses the same assembly version for assemblies that should really have different identities. Hopefully he will fix this problem in the next versions. But until then there is …

… the next best way, which is to force .net not to load it from the GAC. If the assembly is removed from the GAC, .net won’t find the wrong version anymore. See, .net always tries the GAC first, and only if it doesn’t find a suitable assembly there will it go looking elsewhere, such as the folder where your application is. Perhaps I could have configured it to load the proper assembly through the .config file, but this solution is better – it’ll solve the problem unless an older Newtonsoft.Json is installed into the GAC again. Hm.
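
Related to that, a quick way to see which physical copy the runtime actually loaded (a sketch; any Newtonsoft.Json type will do) is to print the assembly’s location – a GAC path means the GAC copy won:

// Prints where the loaded Newtonsoft.Json lives and which assembly version it carries.
var jsonAssembly = typeof(Newtonsoft.Json.Linq.JArray).Assembly;
Console.WriteLine(jsonAssembly.Location);          // a path under the GAC vs. your application's bin folder
Console.WriteLine(jsonAssembly.GetName().Version); // 4.5.0.0 either way, which is exactly the problem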

Removing an assembly from the GAC should be a rather simple process. Just fire

gacutil /u Newtonsoft.Json

and it should be removed. But not this time. Instead I got this warning:

C:\Windows\system32>gacutil /u newtonsoft.json
Microsoft (R) .NET Global Assembly Cache Utility.  Version 4.0.30319.18020
Copyright (c) Microsoft Corporation.  All rights reserved.


Assembly: newtonsoft.json, Version=4.5.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed, proces
sorArchitecture=MSIL
Unable to uninstall: assembly is required by one or more applications
Pending references:
              SCHEME: <WINDOWS_INSTALLER>  ID: <MSI>  DESCRIPTION : <Windows Installer>
Number of assemblies uninstalled = 0
Number of failures = 0

Basically it is saying that some installed application put it into the GAC and is now preventing me from removing it. Ugh, nasty. To make it worse, it doesn’t say which application. This article lists the way to “unreference” it so it can be removed from the GAC: you have to remove its entry from the registry and then execute the gacutil removal command again.

 

After the registry modification it succeeded:

C:\Windows\system32>gacutil /u newtonsoft.json
Microsoft (R) .NET Global Assembly Cache Utility.  Version 4.0.30319.18020
Copyright (c) Microsoft Corporation.  All rights reserved.


Assembly: newtonsoft.json, Version=4.5.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed, proces
sorArchitecture=MSIL
Uninstalled: newtonsoft.json, Version=4.5.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed, pro
cessorArchitecture=MSIL
Number of assemblies uninstalled = 1
Number of failures = 0

The aftermath

Removing that old Newtonsoft.Json from the registry and then from the GAC did the trick. SignalR works like a charm again. However, I had to manually remove a component of an unidentified application, and this removal might cause problems with that application in the future.

I really hope that Newtonsoft.Json gets proper versioning to avoid situations like this. This should be a lesson to every library developer out there – do use proper version numbering!

SignalRTest.zip (4.69 mb)

UPDATES:

12.11.2013: James, the author, is looking at a possible solution. See https://json.codeplex.com/discussions/465332#post1122165.

Android activity life cycle and IoC

I was recently working on an Android (Xamarin, .net) application based on the MvvmCross framework. Actually not just an Android app, since it could be ported quite easily to other platforms such as Windows Phone 8 or iPhone. Anyway, I was using the inversion of control principle in a slightly incorrect way and thus made a mistake which revealed itself only a while after deployment.

Here are the symptoms: the application would cold start fine and would resume fine when the resume happened within a reasonable time (no idea how long exactly, it depends on the device; in my case it could be hours and it would work perfectly), but it wouldn’t resume well if the timespan was too long (i.e. a day or several hours). Instead of displaying the activity content there was just an action bar with a title, without any content or menu items.

By looking at Android’s log on the device itself it was clear that there were problems with the IoC Resolve method. It yielded an “Object reference null” type of exception. I was resolving the reference within the activity’s constructor. An odd error, since MvvmCross is supposed to trigger IoC registration before the first activity is run. But somehow it wasn’t. I mentioned that to Stuart (@slodge, the man behind MvvmCross) and he instantly pointed to the mistake I had made – I had neglected the Android activity lifecycle. I immediately understood my mistake: I am used to putting IoC references into constructors as arguments, which is mostly fine. Except when it comes to Android activities. The thing is that the entry point of a suspended application (with the activity destroyed) is the activity itself, more precisely its constructor. MvvmCross does trigger the IoC registration correctly, but, of course, only after the activity is created, and hence I was getting errors in this particular situation.

The solution is fairly simple – move IoC resolving into the OnCreate method, not sooner; that’s the point where you can be certain that MvvmCross initialization is done.

Before:

public abstract class SomeClass<T> : Activity
{
    protected readonly IFragmentPresenter fragmentPresenter;

    public SomeClass(): this(Mvx.Resolve<IFragmentPresenter>())
    {}

    public SomeClass(IFragmentPresenter fragmentPresenter)
    {
        this.fragmentPresenter = fragmentPresenter;
    }
    ...
}

After:

public abstract class SomeClass<T> : Activity
{
    protected IFragmentPresenter fragmentPresenter;

    protected override void OnCreate (Bundle bundle)
    {
        base.OnCreate (bundle);
        fragmentPresenter = Mvx.Resolve<IFragmentPresenter>();
    }
    ...
}

This change did the trick. It takes a bit more work to inject a mock reference when testing, but it’s not a big deal.

Now, if you have followed the post you might be asking: why the heck does the application cold start just fine, since it should hit the same error? The explanation is rather simple – I am using a splash screen activity when the application cold starts. This one doesn’t involve any IoC resolving at all, but it does initialize MvvmCross. So by the time it gets to my problematic activity, the IoC is already in place.
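
For illustration, an MvvmCross splash screen activity is roughly declared like this (a sketch; the layout resource name is an assumption) – the base class performs the framework setup before any regular activity is shown:

[Activity(MainLauncher = true, NoHistory = true)]
public class SplashScreen : MvxSplashScreenActivity
{
    public SplashScreen()
        : base(Resource.Layout.SplashScreen) // assumed layout resource
    {
    }
}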

So, there you have it. Respect the Android activity lifecycle, otherwise it will bite you.

Visual Studio 2013 brings IntelliSense for Data Binding to XAML editor

One of the better improvements in Visual Studio 2013 is IntelliSense for data binding in the XAML editor. The improvement is described in this blog post. Very nice. But what the article fails to mention is how one gets that mystical d namespace. In fact, it is not exactly easy to find the proper declaration.

After some googling around I found this declaration:

xmlns:d="http://schemas.microsoft.com/expression/blend/2008"

Then I tried applying a d:DataContext to the first element of the default window template – the Grid – like this:

<Grid d:DataContext="{d:DesignInstance Type=local:Tubo}">
    <TextBlock Text="{Binding Xul}" />
</Grid>

That is supposed to work, but it doesn't (I have a simple class Tubo with a single property Xul in the local namespace). Instead of compiling, it threw this error at me:

The property 'DataContext' must be in the default namespace or in the element namespace 'http://schemas.microsoft.com/winfx/2006/xaml/presentation'.

Yeah, right. After some more googling I found that I had to add another namespace and an Ignorable attribute to the mix:

xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
mc:Ignorable="d"

These two lines did the trick: the error was gone, the project compiled and IntelliSense was alive.

Here is my whole XAML:

<Window x:Class="WpfApplication73.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
        xmlns:local="clr-namespace:WpfApplication73"
        xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
        mc:Ignorable="d"
        Title="MainWindow" Height="350" Width="525">
    <Grid d:DataContext="{d:DesignInstance Type=local:Tubo}">
        <TextBlock Text="{Binding Xul}" />
    </Grid>
</Window>
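
For reference, the Tubo class behind the design-time DataContext is as trivial as it gets – something like this (the property type is my assumption):

namespace WpfApplication73
{
    public class Tubo
    {
        public string Xul { get; set; }
    }
}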

Hopefully this post will save some time for others trying to get the IntelliSense working…

Righthand’s DataSet Debugger Visualizer supports VS2013

Highlights of the new version, 1.0.11:

  • added VS2013 version
  • added "separate assemblies" versions. Until now I was using Red Gate's SmartAssembly to pack all referenced assemblies into a single DLL file for easier management and distribution. However, this black magic might cause problems in certain situations (Visual Studio add-ins scream for problems). Thus I've added another set that features assemblies in separate files. The bottom line: if you have problems or you want to be on the safe side, use the latter set.

Go nuts!

Poor man’s performance profiling

If you have been developing with Xamarin on Android you have probably noticed the total absence of any profiler; there isn’t even a decent method to get high resolution time. Sure, there is Environment.TickCount, but its resolution is at best in milliseconds. Pretty useless for performance profiling.

People rarely use performance profilers on the desktop, mostly because their applications don’t do anything complicated and modern CPUs hide most of the non-performing code. However, mobile devices are far more delicate: less CPU power, less RAM, fewer resources in general. Thus slow code is much more noticeable and visible to end users. There is no better way to profile your code than with a performance profiler – just take a look at Red Gate’s ANTS Performance Profiler. Unfortunately there are no such tools for the Xamarin Android platform, even though mobile code should be really optimized.

So I present my solution for getting some usable metrics out of your code (the code you have sources for). The code I’ve written was used for MvvmCross performance profiling, hence it relies on it slightly (logging, IoC); however, it can easily be made standalone if necessary.

Getting high resolution time

Java has the java.lang.System.nanoTime() method which returns the highest resolution time available on Android. Xamarin doesn’t expose it through .net; you have to write a wrapper yourself using JNI.

public static class JavaLangSystem
{
    static IntPtr class_ref;
    static IntPtr id_nanoTime;
    public readonly static long Avg;

    static JavaLangSystem()
    {
        class_ref = JNIEnv.FindClass("java/lang/System");
        id_nanoTime = JNIEnv.GetStaticMethodID(class_ref, "nanoTime", "()J");

        for (int i = 0; i < 10; i++)
        {
            NanoTime();
        }

        long start = NanoTime();
        for (int i = 0; i < 1000; i++)
        {
            NanoTime();
        }
        Avg = (NanoTime() - start) / 1000;
        MvxTrace.TaggedTrace("metrics", "Avg NanoTime reflection is {0}", Avg);
    }

    [Register("nanoTime", "()J", "")]
    public static long NanoTime()
    {
        return JNIEnv.CallStaticLongMethod(class_ref, id_nanoTime);
    }
}

Note that I do some calibration in the static constructor because, due to the JNI call overhead, the call to nanoTime() itself consumes some time. I store the average call time into the static field Avg that I'll use later.

Accessing high resolution timer through IoC

The first part is to create an interface:

public interface IHighResolution
{
    long Ticks { get; }
    long Avg { get; }
}

then implement it

public class HighResolution: IHighResolution
{

    #region IHighResolution Members

    public long Ticks
    {
        get { return JavaLangSystem.NanoTime(); }
    }

    public long Avg
    {
        get { return JavaLangSystem.Avg; }
    }

    #endregion
}

and finally register it with the IoC provider:

protected override IMvxApplication CreateApp()
{
    Mvx.RegisterType<IHighResolution, HighResolution>();
    return new App();
}

I register it in an MvxAndroidSetup-derived class (MvvmCross initialization), but if you are not using MvvmCross feel free to register it with the IoC provider of your choice in some other initialization place.

This separation comes in especially handy if you want to measure code in PCL assemblies, where nanoTime() isn’t accessible.
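
To illustrate, code in a PCL would depend only on the interface and get the implementation from the IoC container – a sketch (the class and member names are made up):

// Lives in a PCL project: no JNIEnv, no Android references, just the interface.
public class SomePortableService
{
    private readonly IHighResolution highResolution;

    public SomePortableService(IHighResolution highResolution)
    {
        this.highResolution = highResolution;
    }

    public long CurrentTicks
    {
        get { return highResolution.Ticks; }
    }
}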

 

Define a class that will store a single measurement

[DebuggerDisplay("{Name} in {Ticks}")]
public class PerfMeasurement
{
    public string Name { get; set; }
    public List<PerfMeasurement> Children;
    public long Ticks { get; set; }

    public int AllChildren
    {
        get
        {
            if (Children == null || Children.Count == 0)
                return 0;
            else
            {
                int sum =0;
                foreach (PerfMeasurement m in Children)
                {
                    sum += 1 + m.AllChildren;
                }
                return sum;
            }
        }
    }
}

There are two interesting features here. The first is the Children field, which holds nested measurements. The other is the AllChildren property, which recursively counts all children.

The main measurement class

public static class Metrics
{
    private static readonly IHighResolution highresolution;
    private static readonly long twoAvg;
    
    public static List<PerfMeasurement> RootData { get; private set; }
    private static Stack<PerfMeasurement> stack;
    
    static Metrics()
    {
        RootData = new List<PerfMeasurement>(100000);
        stack = new Stack<PerfMeasurement>();
        highresolution = Mvx.Resolve<IHighResolution>();
        twoAvg = highresolution.Avg * 2;
    }
    
    public static PerfMeasurement Push(string name)
    {
        PerfMeasurement head;
        if (stack.Count == 0)
        {
            head = new PerfMeasurement { Name = name };
            RootData.Add(head);
        }
        else
        {
            PerfMeasurement previous = stack.Peek();
            head = new PerfMeasurement { Name = name };
            if (previous.Children == null)
                previous.Children = new List<PerfMeasurement>();
            previous.Children.Add(head);
        }
        stack.Push(head);
        return head;
    }
    
    public static void Pull()
    {
        stack.Pop();
        if (stack.Count == 0)
        {
            PerfMeasurement m = RootData[RootData.Count - 1];
            OffsetParents(m);
            Print(m);
        }
    }
    
    private static void OffsetParents(PerfMeasurement m)
    {
        m.Ticks -= m.AllChildren * twoAvg;
        if (m.Children != null)
        {
            foreach (PerfMeasurement child in m.Children)
                OffsetParents(child);
        }
    }
    
    public static void Print(PerfMeasurement metric)
    {
        StringBuilder sb = new StringBuilder();
        Print(metric, sb, 0);
        MvxTrace.TaggedTrace("metrics", sb.ToString());
    }
    
    public static void Print(PerfMeasurement metric, StringBuilder sb, int depth)
    {
        string prepend = new string(' ', depth * 4);
        sb.AppendLine(string.Format("{0}{1} took {2:#,##0}", prepend, metric.Name, metric.Ticks));
        if (metric.Children != null && metric.Children.Count > 0)
        {
            foreach (PerfMeasurement child in metric.Children)
                Print(child, sb, depth + 1);
        }
    }
}

This class acts like a stack for the measurement items defined earlier. When an item at the root level is pulled, it outputs the results of the whole hierarchy linked to it into the (MvvmCross) logging mechanism (the Print methods). But before doing that it recursively subtracts the nanoTime() invocation cost for all children (OffsetParents – for each child it assumes nanoTime() took two average invocations, hence the static field twoAvg).

That’s all we need. However, there is one more step that will make measuring a bit easier and cleaner.

 

Making it easier to use Metrics with a helper class

public class MeasurePerf : IDisposable
{
    private static readonly IHighResolution highresolution;

    private long start;
    private PerfMeasurement m;

    static MeasurePerf()
    {
        highresolution = Mvx.Resolve<IHighResolution>();
    }

    public MeasurePerf(string name)
    {
        m = Metrics.Push(name);
        this.start = highresolution.Ticks;
        
    }

    public void Dispose()
    {
        m.Ticks += highresolution.Ticks - start - highresolution.Avg;
        Metrics.Pull();
    }
}

This class is intended as a wrapper around measured code using the using keyword. A measurement looks like this:

using (new MeasurePerf("Your comment here"))
{
    // your code here
    // nesting measuring is perfectly fine
}
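
For instance, a nested measurement (the names here are just placeholders) produces a hierarchy like the one in the output below:

using (new MeasurePerf("LoadStandings"))        // root item
{
    using (new MeasurePerf("DownloadJson"))     // child measurement
    {
        // download code here
    }
    using (new MeasurePerf("ParseAndBind"))     // another child
    {
        // parsing and binding code here
    }
}   // when the root item is pulled, the whole hierarchy is printed via MvxTrace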

Here is a sample output I’ve been looking at:

(screenshot: profiler results)

I highlighted the root item. All items show their total time (children included). Note the negative numbers: they happen because this kind of measurement isn’t accurate at all (remember, I am subtracting twoAvg per child). However, it gives you at least some insight into which code is slow and which child is the culprit.

So, that’s it. Until Xamarin gives us profilers there is no better way, AFAIK. Unfortunately. There is one thing you can do, though: tell Xamarin that we need profilers, perhaps through my UserVoice suggestion.

A week of impressions of developing for Android using Xamarin & MvvmCross

Occasionally I watch road cycling, and it happens that I started following the Vuelta 2013. Of course I downloaded the official Android application as well – to keep an eye on the standings. The poor quality of the application bothered me a bit: both the lack of data and the poor UI. Most notably, I found it really hard to see where my favorite riders were in the standings.

This triggered a well-known feeling in me – I have to create my own Android application instead. For quite some time I had been eyeing Android development but so far I hadn't had a proper motivation. Armed with Xamarin and MvvmCross I dived in. I had almost no experience with Android OS development, very little with Xamarin and none with MvvmCross. The goal was to create an MVVM-based application that could share common code across different mobile platforms, though the current state is an Android-only application. I could create a version for iOS and Windows Phone 8 as well, I guess, but I don't have a Mac (required to build for iOS), nor do I want to lose a ton of time setting up a Windows Phone development environment for a free application (which is a shame, because WP8 development doesn't look bad at all). Perhaps sometime.

The application itself is available for free on Google Play. Go get it while it is still relevant.

About Xamarin

What they have done is simply amazing: porting the Mono (.net) stack to mobile platforms, including the latest, crazy useful async/await. This gives a .net developer the option of a common code base for Windows RT, desktop Windows, Windows Phone, Android and iOS, and unleashes .net on platforms MS didn't want you on. The community is there. But it comes with some quirks. Here are some major ones I found while developing with Visual Studio 2012:

  • the debugger is often slow. Sometimes it takes many tenths of a second before variables can be inspected at a breakpoint
  • with async/await, debugger breakpoints stop at random lines of code and often "Step Over" or "Step Into" means "Continue running"
  • Portable Class Library support is still a problem (most notably, VS Android projects can't reference PCL assemblies, while Xamarin IDE IntelliSense goes berserk on PCLs)
  • no profilers available (performance, memory) for Android. These are insanely important for mobile devices. Vote here.
  • price for Indie developers (small shops) might be high

I am not exactly sure whether some of these problems should be attributed to VS or to the Xamarin integration. Anyway, none of them is a showstopper and I am sure they'll be addressed.

About MvvmCross

This framework is an excellent open source MVVM starting point for cross-platform .net mobile development. @slodge, the author, has done and is still doing an insane job – not just creating the framework, but being all over the Internet answering questions and writing (well, mostly recording) documentation and samples. That said, there are some problems with MvvmCross:

  • huge framework, takes time to grasp
  • I miss a better description of the fundamentals (e.g. how DataContext is transferred to children)
  • sometimes slow (e.g. startup time, ListView bindings). NOTE: since there is no performance profiler available I couldn't determine exactly why it acts slowly at times. Could be Xamarin, could be MvvmCross, Android or my code.
  • some error messages could be better (e.g. suggesting a solution instead of just reporting the error)

But hey, it is open source and I plan to contribute if time permits.

About Android

It is not hard to learn the basics, but there are plenty of small issues and traps one has to be aware of. The fragmentation doesn't help, nor does the Java stack, but overall it isn't that big of an issue.

Then there are the Intel Android x86 images, which you'll use if you want to run the emulator (forget about the ARM ones due to their slowness). The Intel Android x86 v17 image has a quirk that logs messages like crazy, making the log window pretty much useless. Then there is the v18 image, which fixed the over-logging issue but introduced a new one: it throws "can't resolve host" 99% of the time when using HttpClient. This issue might be attributable to Xamarin, though, as the image's browser doesn't exhibit the same problem.

Google Play store publishing

Straightforward, no issues. Wait, there is one potential problem: if you want to publish an app on the Google Play store you have to comply with US laws; most notably they warn you about the use of encryption. I wonder why (rhetorical).

So, these are my impressions after a week of development. Note that I mostly listed the problems I've encountered rather than describing the (bright sides of the) products in detail.

Feel free to send me your feedback.

Tips for Setting Git on Windows for SSH

Last weekend I took some time to configure Git (and Gitolite, for easier repository access management) on my Ubuntu server. It was a fun experience for a Linux newbie. Anyway, here are a bunch of things to watch out for when configuring a Git client on Windows.

If you want to use OpenSSH (SSH that comes with Git)

  1. Always generate the public/private key pair using ssh-keygen -t rsa, even when using PuTTY. (I initially made a mistake because I used PuTTYgen.)
  2. Set the environment variable HOME=C:\Users\[USERNAME]; this variable isn't set by default. SSH looks for private keys in the C:\Users\[USERNAME]\.ssh folder. I am not sure whether there are options to instruct it to look elsewhere. (See the sketch after this list.)
  3. Use the default key name, which is id_rsa. From what I can tell, SSH looks for the identity and id_rsa names (no suffixes).
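
Putting steps 1 and 2 together, the commands look roughly like this (assuming ssh-keygen from the Git distribution is on your PATH; setx is just one way to set the variable permanently, the System dialog works as well):

C:\>ssh-keygen -t rsa
C:\>setx HOME "C:\Users\[USERNAME]"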

When using PuTTY (an alternative to OpenSSH)

  1. Set the environment variable GIT_SSH=C:\Program Files (x86)\PuTTY\plink.exe (path valid for an x64 system). This variable tells Git which program to use for SSH communication.
  2. Run Pageant (part of the PuTTY distribution), which will handle your private keys. An added bonus is that you don't have to retype the private key's passphrase each time it is accessed (granted you set a passphrase when you created the key – always a good practice).

One other tip: try the SSH connection to the server first using verbose output: ssh -vvv [USERONSERVER]@[SERVER]. This output helps to a certain extent.

Making JSON data strong typed with TypeScript and CodeSmith

The situation

Imagine a scenario where you have a JSON service that returns data and you'd like to have it strongly typed on the JavaScript-powered client side.

Let's say there is an ASP.NET MVC application that goes by the name MvcApplication33 (yes, the 33rd in a row of test applications). There are two models in there, in two files under the Models folder:

namespace MvcApplication33.Models
{
    public class Customer
    {
        public string Name { get; set; }
        public string Surname { get; set; }
        public int Age { get; set; }
        public Order[] Orders { get; set; }
    }
}

namespace MvcApplication33.Models
{
    public class Order
    {
        public double Amount { get; set; }
        public string Category { get; set; }
        public bool IsActive { get; set; }
    }
}

There is also an ApiController-derived CustomersController like this:

namespace MvcApplication33.Controllers
{
    public class CustomersController: ApiController
    {
        public IEnumerable<Customer> GetCustomers()
        {
            return new Customer[]{ new Customer
            {
                Name = "Tubo",
                Age = 22,
                Orders = new[]{ new Order { Amount = 54, IsActive = true, Category = "waste"} }
            }};
        }
    }
}

It simply returns a Customer array consisting of a single customer with a single order. The JavaScript – well, jQuery – code that gets this data would look like:

$(function () {
    $.getJSON("/api/customers", null, function (d) {
        var jsonCustomer = d;
    });
});

The problem

While the code above works, there is a drawback (mostly for people coming from strongly typed languages): on the client side there is no metadata whatsoever at design time. One is left with dynamic constructs. TypeScript addresses this with interfaces. One could write matching TypeScript interfaces for the C# types like:

interface ICustomer {
    Name: string;
    Surname: string;
    Age: number;
    Orders: IOrder[];
}
interface IOrder {
    Amount: number;
    Category: string;
    IsActive: bool;
}

and then rewrite the retrieval like this:

$(function () {
    $.getJSON("/api/customers", null,
        d =>
        {
            var customer = <ICustomer[]>d;
        });
});

This way customer becomes a typed array of objects implementing ICustomer, which means the properties are now strongly typed and IntelliSense works. Of course this is only TypeScript design-time sugar which leaves no trace in the generated JavaScript code, but that's enough to prevent a zillion typos and other "easy to catch at design time" errors.
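
As a tiny illustration of what this buys you, a hypothetical typo is now flagged at design time instead of silently yielding undefined at runtime:

$.getJSON("/api/customers", null,
    d =>
    {
        var customers = <ICustomer[]>d;
        var surname = customers[0].Surnname; // design-time error: 'Surnname' does not exist on ICustomer
    });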

There is one problem, though. Typing interfaces by hand is boring and error-prone, and there are synchronization issues between the manually written ones and their C# originals.

The solution

All the metadata for our interfaces is already there in the C# code. Hence I created a CodeSmith template that automatically creates TypeScript interfaces from their C# originals by parsing the C# code. When the C# code changes, the template can be re-run to recreate the interfaces. An almost one-click, error-free operation.

Here is how it works: the template parses all C# files in a given folder and its subfolders and generates matching TypeScript interfaces with corresponding namespaces (using TypeScript modules). The template output for the given problem looks like this:

// Autogenerated using CodeSmith and JsonGenerator
// © Righthand, 2013
module MvcApplication33.Models {
export interface ICustomer {
    Name: string;
    Surname: string;
    Age: number;
    Orders: IOrder[];
}
export interface IOrder {
    Amount: number;
    Category: string;
    IsActive: bool;
}

    export module Subform {
        export interface ISubclass {
            Tubo: bool;
        }

    }
}

Just for testing nested interfaces, there is also the interface ISubclass, whose original file is located in a Models subfolder named Subform.

 

The TypeScript file that uses the autogenerated interfaces should reference the generated file using a <reference> directive. Here is a sample TypeScript test:

/// <reference path="typings/jquery/jquery.d.ts" />
/// <reference path="../CodeSmith/JsonInterfaces.ts" />
module KnockoutTest {
    $(function () {
        $.getJSON("/api/customers", null,
            d =>
            {
                var customers = <MvcApplication33.Models.ICustomer[]>d;
                alert(customers[0].Name);
            });
    });
}

How to

The template comes in two parts: the actual CodeSmith template (JsonInterfaces.cst) and a .net assembly (KnockoutGenerator.Extractor.Parser.dll – the name should give you a hint of where all this is going in the next blog post) which is used to extract metadata from the C# sources. The assembly code could be part of the CodeSmith template, but it is easier to develop (read: debug) code within Visual Studio. The assembly in turn uses NRefactory (a free C# parser and much more), which (v4.x) is part of CodeSmith, so no additional files are required.

Perhaps the easiest way to use the template is to include these two files in a project and run CodeSmith from within Visual Studio. The template requires a single property, FolderWithModels, which is rather self-explanatory. Excerpts from the attached sample project:

The CodeSmith project content located in Project1.csp:

<?xml version="1.0" encoding="utf-8"?>
<codeSmith xmlns="http://www.codesmithtools.com/schema/csp.xsd">
  <propertySets>
    <propertySet name="JsonInterfaces" output="JsonInterfaces.ts" template="JsonInterfaces.cst">
      <property name="FolderWithModels">..\Models</property>
    </propertySet>
  </propertySets>
</codeSmith>

The project structure: I tend to put CodeSmith-related stuff in a CodeSmith folder. Feel free to arrange it otherwise.

(screenshot: folder structure)

Right-clicking Project1.csp and choosing Generate Outputs should (re)generate JsonInterfaces.ts. Open the output file, and if Web Essentials and TypeScript are installed it should (re)generate the final JavaScript file.

The project itself won't show any meaningful output (for now), but you can use the JSON output in a strongly typed way.

Have fun, and read the next post, which will talk about further enhancements for knockoutjs.

You can download my sample project (without NuGet packages, just sources), which includes everything you need (in the CodeSmith subfolder).

MvcApplication33.zip (566.78 kb)

NOTE: The CodeSmith template could be rewritten as a T4 template (using the same support assembly); however, one should make sure the NRefactory assemblies are present, otherwise the parser won't work.