Fixing the combination of NuGet and Team Foundation Server in a workgroup configuration: 401 Unauthorized

The problem

A lot of users of Visual Studio 2010 (SP1), Team Foundation Server in workgroup configuration, and NuGet have faced a very annoying problem: a 401 Unauthorized error when installing, uninstalling, or updating a NuGet package. Apparently it happens only with this combination (I'm not sure whether my host OS, Windows 7, plays any role in it) and not consistently. But once it starts, the only way to get rid of the errors is to restart Visual Studio.

The only workaround so far was to:

  1. Go Offline with TFS
  2. Manually make the files writable (csproj, packages configuration, etc.) or undo their checkout before step 1
  3. Close Visual Studio
  4. Open Visual Studio
  5. Do NuGet
  6. Close Visual Studio
  7. Open Visual Studio
  8. Go Online with TFS

The steps above were mandatory for every batch of NuGet operations, which is a huge pain and an absurdly great annoyance with the otherwise excellent NuGet. Needless to say, I was among the people facing this issue. And I got so annoyed that I decided to make a choice at that point: either ditch NuGet or fix it myself (NuGet is an open source project).

Being a developer, I opted for the second choice, of course. Was there really a choice? Anyway, here is how my 24 hours of debugging and 15 seconds of fixing went. If you just want to see the solution, feel free to skip to the Solution below.


1. I downloaded NuGet sources.

2. When opening the NuGet solution I quickly found out that I was missing the Visual Studio 2010 SDK (because NuGet is a Visual Studio extension), so I downloaded the one I found on the Internet. It didn't install, saying something about prerequisites not being installed. Ah, one needs the Visual Studio 2010 SP1 SDK. Get it here. Somebody, please let the Visual Studio Extensibility Developer Center know that they are listing the outdated SDK.

3. I set NuGet.VsExtension as the startup project and fired up the debugger, which opens another instance of Visual Studio 2010, where I crafted a sample solution for reproducing the dreadful 401. I was able to reproduce it often, but not always.

4. It took me some time to get familiar with the NuGet sources. After that I invested some time in speeding up the problem detection (the sooner it shows up, the better) by modifying pieces of the NuGet sources, and after many trials and errors I found that I had to dig deeper, into the bowels of the Team Foundation Server client code.

5. I fired up my preferred tool for debugging assemblies I don't have sources for: .NET Reflector. It works better than using Microsoft's source symbols and it works for every assembly. It isn't perfect, due to assembly optimizations and other black-magic issues, but it works well enough. Armed with the decompiled TFS client assemblies I dug deeper and deeper, but couldn't find an obvious fault.

6. I brought up a new weapon, Microsoft Network Monitor, to analyse the network traffic; after all, TFS communication goes over HTTP/SOAP. There I found the first clue to the root of the problem. Normally the TFS client sends a request that the server refuses with a response saying that NTLM authentication is required. The client then re-sends the request with NTLM authentication and everything works. But when the problem occurs, the client just doesn't respond to the NTLM challenge; instead it throws a 401 Unauthorized exception without even trying to authenticate against the server. I had no idea why it sometimes works and sometimes doesn't.

[Screenshot: successful communication]

[Screenshot: unsuccessful communication]

7. At this point I was thinking of enabling System.Net tracing to get more useful information, if possible. I immediately faced a problem: the only supported way to enable System.Net tracing is through the app.config file, not in code. See, I couldn't use an app.config file because I was debugging a library, and a library's app.config file is simply ignored. I looked for a way to enable tracing programmatically, which is possible for user tracing scenarios but not for System.Net. Bad luck, but there is nothing that can't be fixed with a bit of reflection, like this:

private static void InitLogging()
{
    // a listener that writes the trace output to a file
    TextWriterTraceListener listener = new TextWriterTraceListener(@"D:\temp\ts.log");
    Type type = typeof(HttpWebRequest).Assembly.GetType("System.Net.Logging");
    MethodInfo initl = type.GetMethod("InitializeLogging", BindingFlags.Static | BindingFlags.NonPublic);
    initl.Invoke(null, null);

    // attach the listener to System.Net's private trace sources and crank them up to Verbose
    foreach (string s in new string[] { "s_WebTraceSource", "s_HttpListenerTraceSource", "s_SocketsTraceSource", "s_CacheTraceSource" })
    {
        FieldInfo webTsFi = type.GetField(s, BindingFlags.Static | BindingFlags.NonPublic);
        TraceSource webTs = (TraceSource)webTsFi.GetValue(null);
        webTs.Listeners.Add(listener);
        webTs.Switch.Level = SourceLevels.Verbose;
    }

    // flip the private "logging enabled" switch
    FieldInfo le = type.GetField("s_LoggingEnabled", BindingFlags.Static | BindingFlags.NonPublic);
    le.SetValue(null, true);
}

And voilà, the thing started to spit a ton of information into the file D:\temp\ts.log. But again, it only showed the symptom, not the cause. Here are the trace parts right after the first request; note that the unsuccessful one doesn't even try to authenticate via NTLM:

System.Net Information: 0 : [10488] Associating HttpWebRequest#51488348 with ConnectStream#13361802
System.Net Information: 0 : [10488] Associating HttpWebRequest#51488348 with HttpWebResponse#7364733
System.Net Information: 0 : [10488] AcquireDefaultCredential(package = NTLM, intent  = Outbound)
System.Net Information: 0 : [10488] InitializeSecurityContext(credential = System.Net.SafeFreeCredential_SECURITY, context = (null), targetName = HTTP/TFS, inFlags = Delegate, MutualAuth, Connection)
System.Net Information: 0 : [10488] InitializeSecurityContext(In-Buffers count=0, Out-Buffer length=40, returned code=ContinueNeeded).
System.Net Warning: 0 : [10488] HttpWebRequest#51488348::() - Resubmitting request.

Successful communication

System.Net Information: 0 : [10488] Associating HttpWebRequest#51488348 with ConnectStream#13361802
System.Net Information: 0 : [10488] Associating HttpWebRequest#51488348 with HttpWebResponse#7364733

Unsuccessful communication
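Incidentally, if you are debugging an executable rather than a library (so app.config is honored), the same tracing can be switched on without any code. A sketch of the documented System.Net tracing configuration, trimmed down to a single trace source and a file listener:

<configuration>
  <system.diagnostics>
    <sources>
      <!-- route the System.Net trace source to the shared file listener -->
      <source name="System.Net">
        <listeners>
          <add name="TraceFile" />
        </listeners>
      </source>
    </sources>
    <switches>
      <add name="System.Net" value="Verbose" />
    </switches>
    <sharedListeners>
      <add name="TraceFile" type="System.Diagnostics.TextWriterTraceListener" initializeData="net.trace.log" />
    </sharedListeners>
    <trace autoflush="true" />
  </system.diagnostics>
</configuration>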

8. At this point I concentrated on debugging the System.Net.HttpWebRequest class, since the re-submitting is not done at the TFS client level. After even more trial and error I was finally able to pinpoint the root of the evil.

The root of the problem

The decision whether or not to try NTLM authentication is based on which Internet zone the OS thinks the request target is in. In other words, if the OS says that your TFS server is outside the intranet, then HttpWebRequest won't bother with NTLM authentication at all. It is that simple. The decision lies within PresentationCore's (!) internal CustomCredentialPolicy.InternetSecurityManager class, which delegates the question about the Internet zone to the OS and returns the result to HttpWebRequest. For some reason, at some point it starts returning Internet instead of Intranet. I am not sure exactly why, but I have a remedy: a dramatically simple one that doesn't even involve modifications to NuGet (no need to wait for a NuGet fix!).
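By the way, you can ask the OS yourself which zone it maps a URL to; here is a minimal sketch using the managed Zone class (which, as far as I know, goes through the same zone-mapping machinery; the TFS URL is a placeholder, use your own):

using System;
using System.Security.Policy;

class ZoneCheck
{
    static void Main()
    {
        // When the bug bites, this reports Internet instead of Intranet for the TFS URL.
        Zone zone = Zone.CreateFromUrl("http://tfs:8080/tfs");
        Console.WriteLine(zone.SecurityZone); // expected: Intranet
    }
}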

The solution

Open Internet Explorer, go to Internet Options/Security, select the Local intranet icon, and click the Sites button


In the Local intranet dialog click Advanced


and add your TFS server to the Websites list, like I did with mine (replace TFS with the name of your server)


Restart Visual Studio and enjoy NuGet from a new perspective!

This solution apparently solves all of the issues I had with the dreaded 401. Let me know if it works for you as well.
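If you have to apply the fix on several machines, the same mapping can be scripted instead of clicked through. To my knowledge Internet Explorer persists Local intranet entries under the ZoneMap registry key, so a sketch like this should be equivalent (the key layout and the host name tfs are my assumptions; verify before relying on it):

using Microsoft.Win32;

class MapTfsToIntranet
{
    static void Main()
    {
        // Assign the host "tfs" to the Local intranet zone (zone id 1) for the current user.
        using (RegistryKey key = Registry.CurrentUser.CreateSubKey(
            @"Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains\tfs"))
        {
            key.SetValue("http", 1, RegistryValueKind.DWord); // 1 = Local intranet
        }
    }
}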


The problem might not be related to NuGet at all, but rather to PresentationCore (NuGet's UI is WPF), which gets confusing results from the OS through some interop. NuGet plus Visual Studio is just a combination that triggers the otherwise sleeping problem.

Two Windows 8 feature requests

Microsoft started blogging about Windows 8 (there is a Twitter account, @BuildWindows8, as well) and I started thinking about what I'd like to see in Windows 8. I can think of two features right off the top of my head:

  1. Make .NET a first-class development tool. You might say that it is, but in reality it isn't. Not all APIs are accessible through .NET. Just look at the (abandoned) Windows® API Code Pack for Microsoft® .NET Framework. Furthermore, it is riddled with bugs. There are other APIs requiring black magic to use them from .NET. Why are we, .NET developers, supposed to mess with that?
  2. Give us a chance to store non-OS-essential files on a separate drive. Starting with the hibernation file, which is more or less as large as your RAM. If that is 12 GB, it means you'll have to give up 12 GB of OS disk space. You might say who cares, 12 GB is nothing at current disk prices. Sure, if you don't look at the obscene prices of non-mainstream disks (i.e. SSDs), where 12 GB matters. A lot. Then there are temporary files, user documents, etc. A lot of stuff I'd be happy to offload to a cheaper and larger disk. Some of these can be redirected already, but mostly in obscure ways.

That's it for now. What do you think?

SAZAS, the state and computing, part one, v1.1 (translated from Slovene)

A few new facts regarding 'SAZAS, the state and computing, part one', mostly thanks to @Bekstejdz.

1. SAZAS does not receive the whole amount of the levies but 'only' 32-40%. The rest is split between Zavod IPF and reserves (whatever that means). See Zavod IPF's annual report for 2009.

2. It appears that the levy is not collected by the state, as I originally thought, but by an authorized agent (the authorization is granted by the Slovenian Intellectual Property Office). Until the end of 2009 that agent was Zavod IPF, but then its temporary license expired, and after the expiry the Office issued no further temporary or permanent collection permit. Both SAZAS and Zavod IPF wanted to obtain the permit, but:

In the court's opinion (judgments no. I U 1080/2010 and I U 1111/2010), to be granted a permanent permit the applicant must demonstrate that it unites all the different kinds of beneficiaries of the levy. For a permanent permit it is also essential that the rules on distributing the levy (which must be contained in the collective organization's statute) lay down more precise criteria for dividing the levy among the individual beneficiaries.

In short, the problem is not that the state has had second thoughts or anything like that; the problem is merely the distribution key and the fact that neither of them alone represents all the beneficiaries. Apparently there is only enough money 'for one', and things are fine for us as long as they keep fighting each other. Interestingly, though, these two facts didn't bother the Office when it issued the first temporary permit to Zavod IPF – did it represent everyone and have a clear distribution key back then?

3. The consequence of point 2 is that since the beginning of 2010 nobody has been executing the decree and collecting the levies. Or so it seems.

4. The gist of the original post of course still stands: that they temporarily can't agree on who gets to fleece us changes practically nothing, and neither does the fact that SAZAS doesn't receive the whole amount (to find out what actually happens with the levies one would have to ask the importers – does anyone know one?).

SAZAS, the state and computing, part one (translated from Slovene)

We all like to complain about the level of every possible tax, don't we? Yet few people know that there is one more evil charge, far better hidden, technically called a 'levy': the levy for private and other internal reproduction, paid to SAZAS on (computer) hardware and media.

Let's look at an extreme yet practical example right away. For a 500 GB hard disk, bought either off the shelf or pre-installed, the importer pays SAZAS 50% (fifty percent, in other words HALF). It sounds unbelievable, but sadly it's true: a 500 GB disk costs somewhere around €40; subtract VAT and we get €33.3. Of those €33.3 the importer sets aside €16.7 (half of €33.3) for the levy, and SAZAS receives a part of that. And just so we're clear: this item is in no way listed on the invoice the buyer receives.



Not possible? Oh, but it is. And it has been since 6 October 2006, when the government of the European lighthouse, or rather its ruler, signed the decree 'on the amounts of levies for private and other internal reproduction'. Point 2(b) of Article 2 reads:

The levy for audio or visual recording of protected works, payable on the first sale or import of new blank audio or video carriers, amounts, for each blank audio and/or video carrier which according to the manufacturer's declaration enables:

2. digital recording of audio and/or visual and written works, namely:

b) for a carrier not intended exclusively for reproducing audio and/or visual works:

– data CD,
– data DVD,
– computer hard disk,
– memory card (for example: CF, SD, SDHC),
– carrier with an integrated memory unit and player that is not intended exclusively for reproducing digital audio and/or visual works (for example: mobile phone, PDA), and
– other similar carriers

8 SIT for each started 1 GB of capacity, but no more than 4,000 SIT.

4,000 former SIT is exactly €16.691704223001168419295610081789, or €16.7 for short. Since the amount is (luckily) capped, the effect is most pronounced when buying the 500 GB disk mentioned above. But the decree is by no means limited to hard disks; everything under the sun is charged, including the memory cards in your cameras. As far as I can tell, part of this money flows to SAZAS (see the table on the distribution of proceeds below).
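To spell out the arithmetic for the 500 GB disk from the example above (the tolar was fixed at 239.64 SIT per euro, and VAT was 20%):

levy       = min(500 GB × 8 SIT/GB, 4,000 SIT) = 4,000 SIT ≈ 4,000 / 239.64 ≈ €16.69
net price  = €40 / 1.20 ≈ €33.33
levy share = 16.69 / 33.33 ≈ 50% of the pre-VAT price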

Update: it appears the decree is currently not being executed, due to the squabbles over who gets to collect the money. More in update 1.1.




We are all guilty

This decree was surely written in the spirit of the donkey's trial in Višnja Gora. It assumes that we are all guilty of copying SAZAS's and others' content – every single one of us who buys any device or medium from the decree – and that we will copy all illegal content onto all purchased media using all purchased devices. Without exception. When you buy, you are guilty. And since you are guilty, you pay a flat fee. You don't need to know that you are guilty and that you are paying, though, which is why the item is never listed anywhere; the payment is handled by the devil (the importer) who sells such things. With the buyers' money, of course. One naturally wonders: if I am automatically guilty and I've already paid for it (albeit unknowingly), may I then actually copy this content legally? No. If you copy protected content and a policeman catches you at work, you pay a fine. You pay for something that was already paid for when the criminal media and devices were bought. And you get a record on top of it. Only this time you knowingly pay for a committed crime (rather than unknowingly for a potential one). So according to the decree we are all guilty in advance and pay for it; and if we really are guilty, we pay once more.

But the circle of payments doesn't end there. The European association of SAZAS-like organizations (our beloved one among them) now demands that the flat fee be charged at Internet providers as well (SIOL, T-2, etc.), again as a flat rate, simply because illegal content flows through them. Like it or not, all providers would have to pay the fee, because illegal content can flow through them.


Bad for development

Remember the megalomaniac sentences about the lighthouse of Europe (by the same government and ruler that signed the decree), about how oh-so-advanced we were going to become? Of course; there is no better way than to hang leeches onto computer devices and media (don't forget: 50% on a 500 GB disk), divert the money away from the technology sector, and technological progress is guaranteed.

Money flow

Where exactly does the money collected in the way described above go? Apparently a third goes to SAZAS (see the table on the distribution of proceeds below), which is a 'non-profit' organization. And this 'non-profit' organization receives tons of money from the state (that is, from buyers of hardware and media), if all of this holds. How much money? Who knows. Who exactly does this money go to, and by what key? Who knows; SAZAS doesn't publish such data, and the government apparently isn't interested either. And this benefits Slovenia's technological breakthrough how, exactly?

Conclusion of part one

To summarize briefly: our state fleeces buyers of hardware and media for crimes they might commit, apparently conceals it (the buyer is unaware of it), and hands the collected money to some non-profit organization, where it sinks without a trace. The film Minority Report was at least based on specific premonitions by clairvoyants; here the government is clairvoyant wholesale.

I allow for the possibility that I am wrong, but this is what an amateur reading of the decree suggests. Corrections are desired and welcome.

The story continues.

Update: the tables on the distribution of the decree's proceeds for 2009. SAZAS gets 'only' 32%; the rest is divided among the others. It is no clearer who gets how much money. I have adjusted the text above to reflect this fact. So it's not just SAZAS that receives the money, others do too, but the gist of the story remains the same.

Update – v1.1

Integrating MvcMiniProfiler and LLBLGenPro

MvcMiniProfiler is a lean and mean mini profiler for MVC 3 that shows the profiling results on each page rendered at runtime. Besides the custom steps you can seed everywhere in your code, it supports database profiling as well. Out of the box it supports LINQ to SQL and Entity Framework. LLBLGenPro, my favorite ORM, isn't supported though, and it won't work just like that.

Luckily, as it turns out, it takes just a little effort to integrate MvcMiniProfiler with LLBLGenPro.

How MvcMiniProfiler database profiling works

It works by wrapping DbConnection, DbCommand, and the other Db[Stuff] classes, recording execution times by tracking their inner workings. Here is an example from the MvcMiniProfiler documentation showing how to start:

public static DbConnection GetOpenConnection()
{
    var cnn = CreateRealConnection(); // A SqlConnection, SqliteConnection ... or whatever

    // wrap the connection with a profiling connection that tracks timings
    return MvcMiniProfiler.Data.ProfiledDbConnection.Get(cnn, MiniProfiler.Current);
}

If a client calls DbConnection.CreateCommand on a ProfiledDbConnection instance returned from the method above, it gets a wrapped version of whatever command the original connection returns, and so on. There is also a way to create a DbCommand manually, through the ProfiledDbCommand constructor.
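Manual creation would look something like this (a sketch; the realCommand and profiledConnection parameters stand in for objects you already have around):

using System.Data.Common;
using MvcMiniProfiler;
using MvcMiniProfiler.Data;

static DbCommand WrapCommand(DbCommand realCommand, DbConnection profiledConnection)
{
    // wrap an existing command by hand (the post-1.9 constructor)
    return new ProfiledDbCommand(realCommand, profiledConnection, MiniProfiler.Current);
}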

The support for Linq To SQL and Entity Framework is done in a similar manner.

Which brings us to the point: why can't I just use the same approach with LLBLGenPro?

Integrating MvcMiniProfiler with LLBLGenPro – why the same approach doesn't work

The major problem with the LLBLGenPro and MvcMiniProfiler integration is that LLBLGenPro doesn't use the DbConnection.CreateCommand method to create commands from an existing connection. Instead it creates an instance of the proper DbCommand-derived class and assigns a connection to it. The wrapping approach therefore fails, because LLBLGenPro would try to assign a ProfiledDbConnection to, e.g., a SqlCommand instance.
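To illustrate, this is roughly the pattern that breaks (a sketch, not LLBLGenPro's actual code):

using System.Data.Common;
using System.Data.SqlClient;
using MvcMiniProfiler;
using MvcMiniProfiler.Data;

static void Illustrate(string connectionString)
{
    DbConnection profiledConnection =
        new ProfiledDbConnection(new SqlConnection(connectionString), MiniProfiler.Current);

    // roughly what the ORM does: create the concrete command itself...
    SqlCommand cmd = new SqlCommand("SELECT 1");
    // ...and then assign the connection to it; this blows up, because a
    // ProfiledDbConnection is not a SqlConnection
    cmd.Connection = (SqlConnection)profiledConnection; // InvalidCastException at runtime
}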

So a bit more work is required to match them.

The code for adapter scenario

1. Create a DynamicQueryEngine-derived class. Note: this class is database specific, so if you work with, e.g., SQL Server, you'll find it in the SD.LLBLGen.Pro.DQE.SqlServer.NET20.dll assembly.

public class ProfilingDynamicQueryEngine : DynamicQueryEngine
{
    protected override DbCommand CreateCommand()
    {
        DbCommand cmd = base.CreateCommand();
        ProfiledDbCommand pCmd = new ProfiledDbCommand(cmd, null, MiniProfiler.Current);
        return pCmd;
    }
}

Here the DbCommand creation is overridden. Note that I wrap the original cmd and pass the current MiniProfiler instance as arguments to the ProfiledDbCommand constructor, while I pass null for the connection instance because it will be assigned later.

2. Derive from the DataAccessAdapter class. Note: this class is generated from a template and you'll find it in the DBSpecificLayer project generated by LLBLGenPro.

public class DataAccessAdapterEx : DataAccessAdapter
{
    protected override System.Data.Common.DbConnection CreateNewPhysicalConnection(string connectionString)
    {
        DbConnection conn = base.CreateNewPhysicalConnection(connectionString);
        // return ProfiledDbConnection.Get(conn); // pre MvcMiniProfiler 1.9
        return new ProfiledDbConnection(conn, MiniProfiler.Current);
    }

    protected override DynamicQueryEngineBase CreateDynamicQueryEngine()
    {
        return PostProcessNewDynamicQueryEngine(new ProfilingDynamicQueryEngine());
    }
}

Within CreateDynamicQueryEngine I plug in the class created in step #1, while CreateNewPhysicalConnection returns a wrapped connection.

Instead of DataAccessAdapter you should now use the class created in step #2, DataAccessAdapterEx. That's it.
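Usage then stays the same as with the stock adapter; a quick sketch (CustomerEntity stands in for one of your generated entities):

using (DataAccessAdapterEx adapter = new DataAccessAdapterEx())
{
    CustomerEntity customer = new CustomerEntity(1);
    adapter.FetchEntity(customer); // the generated SQL now shows up in MiniProfiler's timings
}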


As it turns out, integrating MvcMiniProfiler with LLBLGenPro is quite easy. The required code could even be added to the LLBLGenPro templates by modifying them, so you wouldn't have to add the same code manually every time.

Let me know if you have feedback.

Update 19.9.2011: updated the code because MvcMiniProfiler introduced a breaking change in v1.9 (instead of the ProfiledDbConnection.Get static method, a constructor has to be used). Thanks to David from the LLBLGenPro support team for the swift response.

A workaround to a problem when upgrading BlogEngine from 2.0 to 2.5

During the BlogEngine upgrade from 2.0 to 2.5 (the instance that hosts this blog) I came across a problem. It might happen during execution of the SQL Server upgrade scripts that come with BlogEngine 2.5.

After running the scripts I got an error saying that the constraint FK_be_PostComment_be_Posts can't be enforced. Huh? After some experimenting I saw that the 2.0 database isn't exactly well guarded by constraints, and I had some leftover comments that don't belong to any post (I guess I had deleted the posts, but the comments were still there because the database didn't enforce the constraints and BlogEngine didn't delete them).

Here is what I’ve did to upgrade.

1. In the upgrade script, comment out this statement:

       ALTER TABLE dbo.be_PostComment
         ADD CONSTRAINT FK_be_PostComment_be_Posts FOREIGN KEY (BlogID, PostID) REFERENCES dbo.be_Posts (BlogID, PostID)
       GO

2. Run the upgrade script.

3. (Optional) To find out whether any comments are orphaned, execute this SELECT statement:

   SELECT * FROM dbo.be_PostComment WHERE postid NOT IN (SELECT postid FROM be_posts)

4. Delete the orphaned comments

   DELETE FROM dbo.be_PostComment WHERE postid NOT IN (SELECT postid FROM be_posts)

5. Run the statement from the upgrade script that you commented out in step 1.

That's it. I guess you could delete the orphaned comments even before running the upgrade script and thus avoid the first and last steps.

Observation: it looks like the database in version 2.0, at least, wasn't very well protected; hopefully 2.5 rectifies this problem (it adds constraints here and there). Don't forget: the database is the last defense tier against bad data and should be protected as much as it can be.