A workaround to a problem when upgrading BlogEngine from 2.0 to 2.5

While upgrading the BlogEngine instance that hosts this blog from 2.0 to 2.5, I came across a problem that can occur during execution of the SQL Server upgrade scripts that come with BlogEngine 2.5.

After running the scripts I got an error saying that the constraint FK_be_PostComment_be_Posts can’t be enforced. Huh? After some experimenting I saw that the 2.0 database isn’t exactly well protected with constraints, and I had some comments left over that don’t belong to any post (I guess I had deleted the posts, but the comments remained because the database didn’t enforce the constraints and BlogEngine didn’t delete them).

Here is what I did to upgrade.

1. In the upgrade script, comment out this statement:

       ALTER TABLE dbo.be_PostComment
         ADD CONSTRAINT FK_be_PostComment_be_Posts FOREIGN KEY (BlogID, PostID) REFERENCES dbo.be_Posts (BlogID, PostID)
       GO

2. Run the upgrade script.

3. (Optional) To find out whether any comments are orphaned, execute this SELECT statement:

   SELECT * FROM dbo.be_PostComment WHERE PostID NOT IN (SELECT PostID FROM dbo.be_Posts)

4. Delete the orphaned comments:

   DELETE FROM dbo.be_PostComment WHERE PostID NOT IN (SELECT PostID FROM dbo.be_Posts)

5. Run the statement from the upgrade script that you commented out in step 1.

That’s it. I guess you could delete the orphaned comments even before running the upgrade script and thus avoid the first and last steps.
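If you take that pre-emptive route, the whole cleanup can be sketched as a single script run against the 2.0 database before the upgrade (a sketch only – the count query is just there so you can see what you are about to delete):

```sql
-- Sketch: remove comments whose parent post no longer exists, so that
-- FK_be_PostComment_be_Posts can be created by the 2.5 upgrade script.
BEGIN TRANSACTION;

-- How many orphaned comments are there?
SELECT COUNT(*) AS OrphanedComments
FROM dbo.be_PostComment
WHERE PostID NOT IN (SELECT PostID FROM dbo.be_Posts);

-- Delete them.
DELETE FROM dbo.be_PostComment
WHERE PostID NOT IN (SELECT PostID FROM dbo.be_Posts);

COMMIT TRANSACTION;
```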

Observation: it looks like the database in version 2.0, at least, wasn’t very well protected; hopefully 2.5 rectifies this (it does add constraints here and there). Don’t forget: the database is the last line of defense against bad data and should be protected as much as it can be.

Windows Home Server 2011 restore is just horribly flawed

I am a long-time user of Windows Home Server (the first version) and it saved me quite a few times. Its restore had only one annoying flaw: it didn’t properly recognize my integrated Realtek network card (instead it tried to use drivers for another version and didn’t let me select the proper ones). So in order to restore I had to put in a recognizable network card, and only then it worked. I guess a bit of hardware work here and there isn’t that bad for my strength after all.

Here comes Windows Home Server 2011 and my first restore experience. Initially it started very well: the network card was properly recognized and I was happy. But then, oh well. I successfully connected to the server and all went well until I had to select the partitions to restore. For a start, the partition letters were mixed up. Luckily I recognized the partitions by their names and sizes. I picked only the system partition to restore. When it should have started restoring, it worked for a minute and then yielded “an unknown error has occurred”. Gulp. The end of its log file said that it can’t lock the volume for reason 5. How come it can’t lock an unused partition? After googling I discovered that I am not the only one with this locking issue (https://connect.microsoft.com/WindowsHomeServer/feedback/details/665345/unable-to-restore-windows-7-client-with-raid-0). The workaround: delete the partition, create it again and don’t format it. So I did. However, previously the partition was exactly 150,000 MB and now it shrunk to 149,999 MB. One MB shouldn’t make a difference, should it – the partition wasn’t full to the last byte? It turns out that WHS restore is so dumb that it will refuse to restore to this new partition due to the missing MB, even though the partition was only about 60% used. And there is no way to make it bigger, because next to this partition is another one and I don’t want to delete it as well. Very stupid, and worse than WHS v1 was. Much worse.
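The delete-and-recreate part of the workaround boils down to a short diskpart script. The disk/partition numbers and the size below are examples from my layout – verify yours with list disk and list partition first:

```
rem Example diskpart script - numbers are illustrative, check your own layout!
select disk 0
select partition 2
delete partition
create partition primary size=150000
rem Deliberately no "format" - the restore expects an unformatted partition.
```

Save it to a file and run diskpart /s script.txt, or just type the commands interactively at the diskpart prompt.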

So here is my plan now: attach an external disk, restore there, boot from external disk, manually copy the system partition files to original shrunk partition, make it bootable and run as usual. Will I succeed? I certainly hope so.

I am very disappointed in WHS 2011. One half of its key feature pair (backup and restore) is seriously flawed.

Update: Even a non-formatted partition can’t be locked for some reason. I guess it has something to do with Intel Rapid Storage, since the partition in question is a RAID 1 one. Don’t forget, it worked just fine with the previous WHS.


Here is the final workaround, after a day of trial and error:

  1. Restore the system partition using the WHS recovery CD to another (external) disk (I mounted a spare disk in a Sharkoon dock).
  2. (Optional) Restore the system partition again, to the disk above or some other disk, just to have an untouched copy.
  3. Boot from the other disk – the system is now bootable, just not from the original disk.
  4. Delete the original boot partition (the one WHS can’t restore to).
  5. Install Acronis Disk Director 11 Home (it isn’t free, but I bought it just for this purpose and it is well worth it) and copy the partition from step 2 (or the one from step 1) to the original location (to replace the one you deleted in step 4). Some software is required to copy partitions – I guess any partition copy tool would do.
  6. Reset and boot the original system.

In case either the other disk or the original disk isn’t bootable anymore, you can try one of these:

  1. Boot from a Windows 7 CD and try “Startup Repair”.
  2. Boot from a Windows 7 CD and do a “Custom Installation” on that partition. That will make the partition bootable. Once it is, repeat the process described above.

I also think I know why WHS can’t lock my original partition: most probably because it sits on an Intel Rapid Storage RAID 1 volume. Note that WHS v1 didn’t have any problem whatsoever with the same configuration. Bad, bad WHS 2011.

I lost a working day and 24€ but at least I restored my workstation. Next time it will be much faster.

Let me know if it works for you.

Update: Here is the issue on connect (thanks to Henk Panneman for re-finding it).

Update v2: The issue disappeared again. Oh well, I’m tired of fixing the link again and again…

The long path of installing Windows Home Server 2011 under Hyper-V R2

Here is my experience of installing Windows Home Server 2011, aka Vail, under Hyper-V R2 Server (Core2 Duo E7600, 8GB RAM, 500GB RAID 1 and 3TB RAID 5). Everything that could go wrong actually did, but let’s take it step by step.

  1. During the file copy at the very beginning I kept getting “couldn’t copy file XY” errors. Problem: the corrupt ISO file I was using. Solution: re-download the file.
  2. The setup made it further, but I started experiencing random reboots and BSODs during install; this time it was happening soon after the initial file copy finished. On the bright side, I caught a few of them and they were mostly mentioning memory corruption. Time for memtest86+ and a RAM test. It turned out that one of the four Patriot DDR2 2GB CAS6 memory sticks was bad. Solution: throw out the problematic stick and run the server with three sticks. Also donated to memtest86+ – well deserved.
  3. Memory issues were still present during setup. Argh. Solution: throw out the third memory stick (they like to work in pairs, it seems, and a pair plus a third wheel obviously isn’t something one should use) and replace it with two older 1GB sticks I had collecting dust.
  4. I made it to the step where setup says “Waiting for installation to continue” and shows a marquee progress bar. Except it didn’t finish. Ever. Now what? After peeking into the log files located at C:\Users\All Users\Microsoft\Windows Server\Logs I more or less understood that it had something to do with “waiting for a web page to show”. After more digging I found out that there were problems connecting to the internet and the internal web page wasn’t showing. Problem: the network card didn’t get an address from my DHCP for some reason. Solution: I set a fixed IP and DNS records.
    Hint: Press Ctrl+Alt+End to simulate Ctrl+Alt+Del from the Hyper-V client. Pick Start Task Manager, then File/Run. Run explorer.exe and you can browse around the file system.
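For the fixed IP and DNS fix in step 4, the same trick gets you a shell; a netsh sketch (the adapter name and addresses are examples – substitute your own):

```
netsh interface ipv4 set address name="Local Area Connection" static 192.168.1.50 255.255.255.0 192.168.1.1
netsh interface ipv4 set dnsservers name="Local Area Connection" static 192.168.1.1 primary
```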

It took me a couple of days to make it through, but at least I did it. Hurrah. It took that long because I was installing on a 1TB VHD on a RAID 5 array, and each retry of setup takes time.

So, that’s it. Now it looks fine.

Automating the Righthand Dataset Visualizer build process and a refresh build

Since the last released version I’ve become aware of an issue in the visualizer. If you were working on a project that referenced newer DevExpress assemblies, the visualizer might have reported an exception due to binary incompatibility with its references to the DevExpress assemblies – an assembly binding issue.

(If you want just the binaries you can skip to the end of the article and download them.)

The solution is to use a custom build of DevExpress assemblies. If you have their sources you can build your custom DevExpress assemblies using these useful scripts.

Then I had to use those custom-built assemblies with my visualizer, but only for the release build, not for debug. So I created a folder named CustomAssemblies under the visualizer solution and added a reference path to this folder to all of the visualizer projects. This means MSBuild will use the assemblies in this folder when they are present, or the ones from the GAC (the originals) when the folder is empty. Unfortunately, reference paths are global to the project – you can’t have two different sets for two different configurations.

So building the release version looks like this: populate the CustomAssemblies folder with the custom DevExpress assemblies, run MSBuild on the Release configuration, and at the end clear the CustomAssemblies folder so that the debug version works with the original DevExpress assemblies again. But there is one more obstacle. The license.licx file lists the public key of the DevExpress assemblies, and it doesn’t match the one in the custom version. So I have to replace all occurrences of the original public key with my custom version’s public key before the build and restore the originals after the build. Problems solved.
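Stripped of the FinalBuilder specifics, the sequence can be sketched as a plain batch file (every path and key token below is a placeholder for illustration, not a real value):

```bat
rem Sketch of the release build sequence; paths and key tokens are placeholders.
set ORIG_KEY=0000000000000000
set CUSTOM_KEY=1111111111111111

rem 1. Put the custom-built DevExpress assemblies in place.
xcopy /y CustomDevExpress\*.dll CustomAssemblies\

rem 2. Swap the public key token in license.licx (keep a backup to restore).
copy /y license.licx license.licx.bak
powershell -Command "(Get-Content license.licx) -replace '%ORIG_KEY%','%CUSTOM_KEY%' | Set-Content license.licx"

rem 3. Build the Release configuration.
msbuild Visualizer.sln /p:Configuration=Release

rem 4. Restore the originals so Debug keeps using the GAC assemblies.
copy /y license.licx.bak license.licx
del /q CustomAssemblies\*.dll
```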

The actual release process also involves {SmartAssembly}, which merges all assemblies into a single file, signing that file with a certificate plus a timestamp, and finally zipping the rather large result. Because I am not a masochist, I created a FinalBuilder project that does all of this automatically for me (except for building the custom DevExpress assemblies).

Let me know if there are still problems!

Righthand.DebuggerVisualizer.Dataset.2008_v1.0.1.zip (12.67 mb)

Righthand.DebuggerVisualizer.Dataset.2010_v1.0.1.zip (12.67 mb)

Read more about Righthand DataSet Visualizer here.

About strongly typed datasets, DBNulls, WPF and binding

The mix of strongly typed datasets, DBNull values, WPF and binding will likely yield invalid-cast exceptions when you bind nullable columns.

Imagine you have a table with a single nullable column, Description. You would bind it against a TextBox (or any other control) with this syntax: {Binding Description}. When binding is performed against a row that has a DBNull value in that column (remember, ADO.NET uses DBNull.Value for null values), you get an Unable to cast object of type 'System.DBNull' to type 'System.String' exception.

The reason for the exception is quite simple. Let’s take a look at how ADO.NET implements the strongly typed Description column:

   [global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
   [global::System.CodeDom.Compiler.GeneratedCodeAttribute("System.Data.Design.TypedDataSetGenerator", "")]
   public bool IsDescriptionNull() {
       return this.IsNull(this.tableXXX.DescriptionColumn);
   }

   [global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
   [global::System.CodeDom.Compiler.GeneratedCodeAttribute("System.Data.Design.TypedDataSetGenerator", "")]
   public string Description {
       get {
           try {
               return ((string)(this[this.tableXXX.DescriptionColumn]));
           }
           catch (global::System.InvalidCastException e) {
               throw new global::System.Data.StrongTypingException("The value for column \'Description\' in table \'XXX\' is DBNull.", e);
           }
       }
       set {
           this[this.tableXXX.DescriptionColumn] = value;
       }
   }

The obvious problem is that WPF binding goes straight to the Description property without first checking the IsDescriptionNull() method (as your own code should), and the property getter then fails to convert DBNull.Value to a string. And no, at this point even a converter won’t help you, because the value is read before the converter is even invoked.
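For comparison, the guard your own code should use looks like this (row stands for an illustrative strongly typed DataRow; the generator also emits a SetDescriptionNull method alongside IsDescriptionNull):

```csharp
// Read the nullable column safely instead of hitting the throwing getter.
string description = row.IsDescriptionNull() ? null : row.Description;

// Writing null back goes through the generated method,
// not through the Description property.
if (description == null)
    row.SetDescriptionNull();
else
    row.Description = description;
```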

But fear not, the solution is very simple: just add square brackets around Description in the binding syntax, like this: {Binding [Description]}. It looks similar, but it is different. The former syntax uses the strongly typed property to access the data, while the latter uses the DataRow’s this[string] indexer instead, which returns an object – so no exception is raised for the DBNull.Value-to-string conversion. Furthermore, a DBNullConverter value converter can come in handy as well:

   public class DBNullConverter : IValueConverter
   {
       public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
       {
           if (value == DBNull.Value)
               return DependencyProperty.UnsetValue;
           else
               return value;
       }

       public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
       {
           if (value == null)
               return DBNull.Value;
           else
               return value;
       }
   }
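Wiring the converter into the binding might look like this in XAML (the resource key and the local namespace prefix are illustrative):

```xml
<Window.Resources>
    <local:DBNullConverter x:Key="dbNullConverter" />
</Window.Resources>
<!-- ... -->
<TextBox Text="{Binding [Description], Converter={StaticResource dbNullConverter}}" />
```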

Migrating a Windows Home Server guest machine from VMWare Virtual Server 2.x to Hyper-V

I was running a Windows Home Server under VMWare Virtual Server as a guest machine. It had a dedicated 750GB hard disk hosting a fixed-size virtual disk that spanned the entire drive.

These days I am migrating this and other virtual machines to the free Hyper-V Server 2008 R2, and here is how I migrated this one.

  1. Uninstall Virtual Server Tools from the guest machine (this is important at this point, because later it can’t easily be done through Add/Remove Programs).
  2. Shut down the guest.
  3. Copy all VMDK files to a spare (new) 1.5 TB Seagate disk. This step isn’t strictly necessary, but it was for me because the source disk had trouble reading some sectors – if I wanted to proceed I had to have all the files on a good disk. It took something like 4 hours over 1Gb LAN.
  4. Download and run VMDK to VHD Converter.
  5. Convert the VMDK files (as input, select the file without numbers in its name if your virtual disk is split across many files, i.e. SomeDisk.vmdk). I converted to the same hard disk (it barely fit) and it took something like 7 hours.
  6. Copy the resulting VHD to the Hyper-V server (I could have picked the server as the target location in step 5, but I felt more comfortable doing the conversion locally). This step again took something like 4 hours.
  7. Create a new virtual machine on the Hyper-V server and attach the resulting VHD file as its disk.
  8. Run the machine and activate the OS (it will detect a “huge” hardware change and require activation).
  9. Install Integration Services (Action/Insert Integration Services Setup Disk in the connection window) and that’s it.

Lessons learned:

  • Hard disks are growing fast in size but network speed isn’t. Such transfers will therefore get slower and slower due to the sheer amount of data moved between disks.
  • Such an operation might take a whole day.
  • If you use an external disk like I do, you should really stick with eSATA instead of USB 2.0 or FireWire (it is up to 4× faster).
  • Have enough free space on your hard disks.