Poor man’s performance profiling

If you have been doing development with Xamarin on Android, you have probably noticed the total absence of any profiler; there isn't even a decent method for getting high-resolution time. Sure, there is Environment.TickCount, but its resolution is at best in milliseconds, which is pretty useless for performance profiling.

People rarely use performance profilers on the desktop, mostly because their applications aren't doing anything complicated and modern CPUs hide most of the non-performing code. Mobile devices, however, are far more delicate: less CPU power, less RAM, fewer resources in general. Slow code is thus much more noticeable to end users. There is no better way to profile your code than with a performance profiler; just take a look at RedGate's ANTS Performance Profiler. Unfortunately there are no such tools for the Xamarin.Android platform, even though mobile code really should be optimized.

Here I present my solution for getting some usable metrics from your code (the code you have sources for). The code I've written was used for MvvmCross performance profiling, hence it slightly relies on it (logging, IoC), but it can easily be made standalone if necessary.

Getting high resolution time

Java has the java.lang.System.nanoTime() method, which returns the highest resolution time available on Android. Xamarin doesn't expose it through .net, so you have to write a wrapper yourself using JNI.

public static class JavaLangSystem
{
    static IntPtr class_ref;
    static IntPtr id_nanoTime;
    public readonly static long Avg;

    static JavaLangSystem()
    {
        class_ref = JNIEnv.FindClass("java/lang/System");
        id_nanoTime = JNIEnv.GetStaticMethodID(class_ref, "nanoTime", "()J");

        // warm up the JNI call
        for (int i = 0; i < 10; i++)
        {
            NanoTime();
        }

        // measure the average cost of a single NanoTime() invocation
        long start = NanoTime();
        for (int i = 0; i < 1000; i++)
        {
            NanoTime();
        }
        Avg = (NanoTime() - start) / 1000;
        MvxTrace.TaggedTrace("metrics", "Avg NanoTime reflection is {0}", Avg);
    }

    [Register("nanoTime", "()J", "")]
    public static long NanoTime()
    {
        return JNIEnv.CallStaticLongMethod(class_ref, id_nanoTime);
    }
}

Note that I do some calibration in the static constructor: because the call goes through JNI, invoking nanoTime() itself consumes some time. I store the average call time in the static field Avg, which I'll use later.
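As a quick sanity check, the wrapper can be used directly; here is a minimal sketch (DoSomething is just a placeholder for whatever you want to time):

long start = JavaLangSystem.NanoTime();
DoSomething(); // the code being measured
// subtract the average overhead of one NanoTime() call itself
long elapsedNs = JavaLangSystem.NanoTime() - start - JavaLangSystem.Avg;
MvxTrace.TaggedTrace("metrics", "DoSomething took {0:#,##0} ns", elapsedNs);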

Accessing high resolution timer through IoC

The first step is to create an interface

public interface IHighResolution
{
    long Ticks { get; }
    long Avg { get; }
}

then implement it

public class HighResolution : IHighResolution
{

    #region IHighResolution Members

    public long Ticks
    {
        get { return JavaLangSystem.NanoTime(); }
    }

    public long Avg
    {
        get { return JavaLangSystem.Avg; }
    }

    #endregion
}

and finally register it with the IoC provider.

protected override IMvxApplication CreateApp()
{
    Mvx.RegisterType<IHighResolution, HighResolution>();
    return new App();
}

I register it in a MvxAndroidSetup-derived class (MvvmCross initialization), but if you are not using MvvmCross, feel free to register it with the IoC provider of your choice at some other initialization point.

This separation comes in especially handy when you want to measure PCL assemblies, where nanoTime() isn't accessible.
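For example, measuring code that lives in a PCL only requires resolving the interface; a minimal sketch (MyViewModel and LoadData are hypothetical names):

public class MyViewModel : MvxViewModel
{
    public void LoadData()
    {
        // resolved through IoC, so no Android-specific API leaks into the PCL
        IHighResolution clock = Mvx.Resolve<IHighResolution>();
        long start = clock.Ticks;
        // ... platform-independent work here ...
        long elapsedNs = clock.Ticks - start - clock.Avg;
        MvxTrace.TaggedTrace("metrics", "LoadData took {0:#,##0} ns", elapsedNs);
    }
}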

 

Define a class that will store a single measurement

[DebuggerDisplay("{Name} in {Ticks}")]
public class PerfMeasurement
{
    public string Name { get; set; }
    public List<PerfMeasurement> Children;
    public long Ticks { get; set; }

    public int AllChildren
    {
        get
        {
            if (Children == null || Children.Count == 0)
                return 0;
            else
            {
                int sum = 0;
                foreach (PerfMeasurement m in Children)
                {
                    sum += 1 + m.AllChildren;
                }
                return sum;
            }
        }
    }
}

There are two interesting features here. The first is the Children field, which holds nested measurements. The second is the AllChildren property, which recursively counts all descendants.
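To illustrate, a root measurement with one child that itself has two children reports AllChildren = 3 (a contrived sketch):

PerfMeasurement root = new PerfMeasurement { Name = "root" };
PerfMeasurement child = new PerfMeasurement { Name = "child" };
child.Children = new List<PerfMeasurement>
{
    new PerfMeasurement { Name = "grandchild1" },
    new PerfMeasurement { Name = "grandchild2" }
};
root.Children = new List<PerfMeasurement> { child };
// root.AllChildren evaluates to 3: the child plus both grandchildren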

The main measurement class

public static class Metrics
{
    private static readonly IHighResolution highresolution;
    private static readonly long twoAvg;
    
    public static List<PerfMeasurement> RootData { get; private set; }
    private static Stack<PerfMeasurement> stack;
    
    static Metrics()
    {
        RootData = new List<PerfMeasurement>(100000);
        stack = new Stack<PerfMeasurement>();
        highresolution = Mvx.Resolve<IHighResolution>();
        twoAvg = highresolution.Avg * 2;
    }
    
    public static PerfMeasurement Push(string name)
    {
        PerfMeasurement head;
        if (stack.Count == 0)
        {
            head = new PerfMeasurement { Name = name };
            RootData.Add(head);
        }
        else
        {
            PerfMeasurement previous = stack.Peek();
            head = new PerfMeasurement { Name = name };
            if (previous.Children == null)
                previous.Children = new List<PerfMeasurement>();
            previous.Children.Add(head);
        }
        stack.Push(head);
        return head;
    }
    
    public static void Pull()
    {
        stack.Pop();
        if (stack.Count == 0)
        {
            PerfMeasurement m = RootData[RootData.Count - 1];
            OffsetParents(m);
            Print(m);
        }
    }
    
    private static void OffsetParents(PerfMeasurement m)
    {
        m.Ticks -= m.AllChildren * twoAvg;
        if (m.Children != null)
        {
            foreach (PerfMeasurement child in m.Children)
                OffsetParents(child);
        }
    }
    
    public static void Print(PerfMeasurement metric)
    {
        StringBuilder sb = new StringBuilder();
        Print(metric, sb, 0);
        MvxTrace.TaggedTrace("metrics", sb.ToString());
    }
    
    public static void Print(PerfMeasurement metric, StringBuilder sb, int depth)
    {
        string prepend = new string(' ', depth * 4);
        sb.AppendLine(string.Format("{0}{1} took {2:#,##0}", prepend, metric.Name, metric.Ticks));
        if (metric.Children != null && metric.Children.Count > 0)
        {
            foreach (PerfMeasurement child in metric.Children)
                Print(child, sb, depth + 1);
        }
    }
}

This class acts like a stack for the measurement items defined in the previous class. When the item at the root level is pulled, it outputs the results of the whole hierarchy linked to it to the (MvvmCross) logging mechanism (the Print methods). But before doing that, it recursively subtracts the cost of the nanoTime() invocations of all children (OffsetParents; for each child it assumes nanoTime() took two average invocations, hence the static field twoAvg).
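If you'd rather not use the helper class described below, you can drive Metrics by hand. Here is a sketch of a nested measurement (the work being measured is a placeholder), mirroring what the helper does:

IHighResolution clock = Mvx.Resolve<IHighResolution>();

PerfMeasurement outer = Metrics.Push("outer");
long outerStart = clock.Ticks;

PerfMeasurement inner = Metrics.Push("inner");
long innerStart = clock.Ticks;
// ... inner work here ...
inner.Ticks += clock.Ticks - innerStart - clock.Avg;
Metrics.Pull();

// ... more outer work here ...
outer.Ticks += clock.Ticks - outerStart - clock.Avg;
Metrics.Pull(); // root-level pull triggers OffsetParents and Print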

That's all we need. However, there is one more step that will make measuring a bit easier and cleaner.

 

Making it easier to use Metrics with a helper class

public class MeasurePerf : IDisposable
{
    private static readonly IHighResolution highresolution;

    private long start;
    private PerfMeasurement m;

    static MeasurePerf()
    {
        highresolution = Mvx.Resolve<IHighResolution>();
    }

    public MeasurePerf(string name)
    {
        m = Metrics.Push(name);
        this.start = highresolution.Ticks;
    }

    public void Dispose()
    {
        m.Ticks += highresolution.Ticks - start - highresolution.Avg;
        Metrics.Pull();
    }
}

This class is intended as a wrapper around measured code via the using keyword. A measurement looks like this:

using (new MeasurePerf("Your comment here"))
{
    // your code here
    // nesting measuring is perfectly fine
}
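Nesting simply means stacking usings; each nested block becomes a child of the enclosing measurement in the printed hierarchy. A sketch with hypothetical method names:

using (new MeasurePerf("LoadEverything"))
{
    using (new MeasurePerf("LoadStandings"))
    {
        LoadStandings();
    }
    using (new MeasurePerf("LoadRiders"))
    {
        LoadRiders();
    }
}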

Here is a sample output I’ve been looking at:

[Screenshot: profiler results]

I highlighted the root item. All items show their total time (children included). Note the negative numbers; they appear because this kind of measurement isn't accurate at all (remember, I am subtracting twoAvg per child). However, it gives you at least some insight into which code is slow and which child is the culprit.

So, that's it. Until Xamarin gives us profilers, there is no better way AFAIK, unfortunately. There is one thing you can do, though: you can tell Xamarin that we need profilers, perhaps through my UserVoice suggestion.

A week of impressions of developing for Android using Xamarin & MvvmCross

Occasionally I watch road bicycling, and it so happened that I started following the Vuelta 2013. Of course I downloaded the official application for Android as well, to keep an eye on the standings. The poor quality of the Android application bothered me a bit: both the lack of data and the poor UI. Most notably, I found it really hard to see where my favorite bicyclists stood.

This triggered a well-known feeling in me: I have to create my own Android application instead. For quite some time I had been eyeing Android development, but so far I hadn't had a proper motivation. Armed with Xamarin and MvvmCross, I dived in. I had almost no experience with Android OS development, very little with Xamarin and none with MvvmCross. The goal was to create an MVVM-based application that could leverage common code across different mobile platforms, though the current state is an Android-only application. I could create a version for iOS and Windows Phone 8 as well, I guess, but I don't have a Mac (required to build for iOS), nor do I want to lose a ton of time setting up a Windows Phone development environment for a free application (which is a shame, because WP8 development doesn't look bad at all). Perhaps sometime.

The application itself is available for free through Google Play. Go get it while it is still relevant.

About Xamarin

What they have done is simply amazing: porting the Mono (.net) stack to mobile platforms, including the latest, crazy useful async/await. This gives a .net developer the option of a common code base for Windows RT, desktop Windows, Windows Phone, Android and iOS, unleashing .net on the platforms MS didn't want you on. The community is there. But it comes with some quirks. Here are some major ones I found while developing with Visual Studio 2012:

  • the debugger is often slow; sometimes it takes several seconds before you can inspect variables at a breakpoint
  • with async/await, debugger breakpoints stop at random lines of code, and "Step Over" or "Step Into" often means "Continue running"
  • Portable Class Library support is still a problem (most notably, VS Android projects can't reference PCL assemblies, while Xamarin IDE intellisense goes berserk on PCLs)
  • no profilers (performance or memory) available for Android; these are insanely important for mobile devices. Vote here.
  • the price for indie developers (small shops) might be high

I am not exactly sure whether some of these problems should be attributed to VS or to the Xamarin integration. Anyway, none of them is a showstopper, and I am sure they'll be addressed.

About MvvmCross

This framework is an excellent open source MVVM starting point for cross-platform .net mobile development. @slodge, the author, has done and is still doing an insane job, not just by creating the framework: he is all over the Internet answering questions and writing (well, mostly recording) documentation and samples. That said, there are some problems with MvvmCross:

  • it is a huge framework and takes time to grasp
  • I miss a better description of the fundamentals (e.g. how the DataContext is transferred to children)
  • it is sometimes slow (e.g. startup time, ListView bindings). NOTE: since there is no performance profiler available, I couldn't determine exactly why it sometimes acts slowly. It could be Xamarin, MvvmCross, Android or my code.
  • some error messages could be better (e.g. suggesting a solution instead of just reporting the error)

But hey, it is open source and I plan to contribute if time permits.

About Android

It is not hard to learn the basics, but there are plenty of small issues and traps one has to be aware of. The fragmentation doesn't help either, nor does the Java stack, but overall it isn't that big of an issue.

Then there are the Intel Android x86 images, which you'll want to use if you run the emulator (forget about the ARM ones due to their slowness). The Intel Android x86 v17 image has a quirk: it logs messages like crazy, making the log window pretty much useless. The v18 image fixed the over-logging issue but introduced a new one: 99% of the time it throws "can't resolve host" when using HttpClient. This issue might be attributable to Xamarin, though, as the emulator's browser doesn't exhibit the same problem.

Google Play store publishing

Straightforward, no issues. Wait, there is one potential problem: if you want to publish an app on the Google Play store, you have to comply with US laws; most notably, they warn you about using encryption. I wonder why (rhetorical).

So, these are my impressions after a week of development. Note that I mostly listed the problems I've encountered rather than describing the (bright points of the) products in detail.

Feel free to send me feedback.