
The Savvy-Tech’s Hardware-Independent-Restore

Ok, so I thought I’d share a real-world support scenario that happened to me today:

I have a new contract customer I signed a couple of weeks ago, and they went live as of 9/1.  I was doing various maintenance tasks on their network over the weekend: removing unnecessary apps from PCs to improve performance, getting patches installed, etc.  About the only thing left last night was patching their Windows 2003 terminal server.  I pushed the patches out via Kaseya, the patches installed successfully, the server initiated a reboot – but it never came back.  Now, I’ve been doing remote patching and reboots for years, and this has only ever happened a handful of times.

I logged in to their SBS and attempted to ping the TS – no response.  The TS is a whitebox server about 4 years old, without a remote access card or IP-KVM connected.  The client was in bed, and not having the TS wasn’t going to be an issue until their approximately half-dozen remote users tried to access Great Plains in the morning.  So I didn’t bother calling to wake anyone up – instead, I more or less surprised the VP when I walked in at 7:30 this morning to address a problem they didn’t know they had yet.

Short story was that the server was pretty much on its deathbed – the alarm LED on the case was coming on whenever the processor tried to do anything.  It took 5 attempts before I was able to get the server to boot, and when I did get logged in, the CPU was grinding constantly with the error LED lit – but looking at Task Manager, I didn’t see anything out of the ordinary, other than the fact that the system was so slow it was virtually unusable for one user at the console, let alone a half-dozen-plus remote TS users trying to use Great Plains.  Quick diagnosis & gut instinct told me this was a hardware issue.  Being a 4-year-old whitebox, it was long out of warranty.  I knew the server needed to be replaced, but the remote users couldn’t wait a week to 10 days for me to get approval, get a box ordered from Dell, and get it installed.  Additionally, given the state of the machine, it would probably take days to get an image-based backup using ShadowProtect – and since this is a new customer, the TS wasn’t being backed up anyway, as there’s no data on it.

SO – I ran back to my office and grabbed a spare PC I use for random stuff on my bench (Acer – about 3 yrs old, but has a dual-core Pentium CPU @ 2.8 GHz & has been upgraded to 2GB RAM).  I also grabbed my old Adaptec 1205SA PCI SATA host controller off the shelf and returned to the client.

The TS in question was running a RAID 1 array on an on-board SATA RAID controller.  I shut down the TS and installed the Adaptec SATA controller in an open PCI slot; after 4 tries, the server finally booted again.  I logged in, the OS found the Adaptec SATA controller, & I installed drivers from my thumb drive.  Once the driver installation completed successfully, I shut down the TS again.

I removed the Adaptec SATA controller & drive 0 from the TS.  I installed the Adaptec SATA controller in the Acer PC, inserted & connected drive 0 from the TS to the Adaptec controller, then disconnected the PC’s existing SATA HDD.  I powered on the PC, and since drive 0 was connected to the Adaptec SATA controller, AND the Win2k3 OS on drive 0 already had drivers for that controller installed, the Win2k3 TS OS booted successfully on (almost) completely different hardware.  On the first login, the OS detected the various new hardware (on-board boot controller, DVD drive, on-board NIC, etc.).  Once the drivers for the new hardware were installed & the onboard NIC configured, I powered down the Acer PC, removed the Adaptec SATA controller card, & connected drive 0 to the on-board primary SATA port.  I powered on the PC, the Win2k3 OS again booted successfully, and we verified that remote users were able to log in and launch Great Plains.

Obviously, using a 3-yr-old desktop PC as a terminal server is not a long-term solution.  But this minimized downtime for the remote users (they were all online before noon), and provided both me & the customer with valuable breathing room and time to resolve the root issue and get the ball rolling on replacing this server.  And given the small number of users and basic Dynamics GP use, the performance of this temporary hardware is more than sufficient for the remote users (and it beats the alternative).

And yes, there is more than one way to skin a cat – and multiple ways this problem could have been addressed.  In this particular situation, I felt this was the best approach to get to a working system in the least amount of time possible, considering the severe instability of the original hardware, the lack of an existing image backup of the TS, and the fact that I could easily break the mirror to run off a single HDD from the server.

Migrating your SharePoint blog

As some of you may know, I assist Susan with administering & maintaining the blogs here at msmvps.com.  For various reasons, over the past few months I have become familiar with various approaches to blog migrations – most notably the BlogML project.  As a result, I’ve sort of become the neighborhood go-to guy for moving blogs, including assisting Steve Riley with his move from msinfluentials.com to wordpress.com.

A couple of weeks ago I was presented with an intriguing request / challenge.  My friend Wayne Small over at sbsfaq.com had been running his site and his blog on SharePoint for several years, but was now in the process of moving everything over to a single WordPress site, and wanted to migrate the content from his SharePoint blog.  The challenge wasn’t so much getting the content into WordPress, since there are several importers available, including a BlogML importer.  The problem was getting the content out of SharePoint 3.0.  BlogML exporters for most platforms are web-based, allowing you to initiate the export from within the blog platform and download the resulting export file.  While I have coding experience, I don’t have any experience building add-ins for SharePoint and wasn’t about to open that can of worms, so I decided on a different approach.

For those of you who don’t know, there is rather impressive integration between SharePoint 3.0 & Access 2007, so I opted to use Access to extract the information out of Wayne’s old SharePoint blog.  This approach actually gave me more flexibility in meeting the various requirements:

  1. Where the SharePoint blog used categories, Wayne wanted to use tags in WordPress.
  2. We wanted to migrate all content – posts, comments, & embedded content (images in posts, etc.).

Moving categories to tags seemed simple enough; however, I discovered that the current 2.0 iteration of BlogML doesn’t support tags (which admittedly surprised me).  As a result, Aaron Lerch’s BlogML import class for WordPress did not support tags either.  Scouring the web, I found that Wayne John had updated Aaron’s BlogML import class to allow for importing tags from BlogEngine exports.  So I did a quick & dirty install of BlogEngine on my sandbox server to examine its default BlogML export and see how it tagged its XML to identify post tags.
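
Tags aren’t part of the BlogML 2.0 spec, so the exact markup has to come from a real BlogEngine export – which is exactly why I installed one.  As a hedged illustration only, assuming a per-post node along the lines of <tags><tag ref="..." /></tags>, a fragment like this generates XML of that shape:

```python
# Illustration only: builds the kind of per-post <tags> node a BlogEngine
# BlogML export appears to contain. The element and attribute names here are
# assumptions; check them against an actual export before relying on them.
import xml.etree.ElementTree as ET

post = ET.Element("post")
tags = ET.SubElement(post, "tags")
for name in ("SBS", "SharePoint"):
    ET.SubElement(tags, "tag", ref=name)

print(ET.tostring(post, encoding="unicode"))
# prints something like:
# <post><tags><tag ref="SBS" /><tag ref="SharePoint" /></tags></post>
```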

One of the major behind-the-scenes differences between SharePoint blogs & WordPress is how embedded content is stored.  When you compose posts in an offline editor such as Windows Live Writer, inserted images are stored differently in each platform when the post is uploaded & published: WordPress stores the images in the file system of the site, whereas SharePoint stores the images in the database as attachments to the post record.  Luckily for us, Access 2007 can handle attachments on SharePoint lists natively.  In early test runs, I found that there was some duplication in image names between various posts in Wayne’s blog (especially capture.png).  As a result, I decided to save the attachments for each SharePoint post to a different folder to avoid name collisions.
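
The folder-per-post scheme is simple enough to sketch on its own.  This helper is hypothetical (the real logic lived in the Access database), but it shows the layout the export uses:

```python
# Collision-proof save path: one subfolder per post, keyed on SharePoint's
# numeric post ID, so two different posts can each have their own
# capture.png. The helper name is mine, not from the actual database.
import os

def attachment_path(export_root: str, post_id: int, filename: str) -> str:
    folder = os.path.join(export_root, str(post_id))
    os.makedirs(folder, exist_ok=True)  # create <export_root>\<post_id> on demand
    return os.path.join(folder, filename)

print(attachment_path(r"C:\export", 1234, "capture.png"))
# -> C:\export\1234\capture.png
```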

So – how does the final solution look? 

  1. I created a new Access 2007 database, and used the External Data functionality to link to the Posts, Comments, & Categories lists on the SharePoint blog site.
  2. I created a simple form that let me enter the path & filename I wanted for the resulting BlogML export file, as well as the path where I wanted the embedded images from the SharePoint blog stored.  Obviously, the form contained a Start button as well...
  3. When the Start button is clicked, the code behind the button does all the heavy lifting (sketched in code after this list):
    1. It creates the BlogML export file using the path / filename listed on the form, and writes the various header information.
    2. We open a new recordset containing the Posts table.
      1. We call a helper function to format the post published date how the XML file wants it.
      2. We open a second recordset that contains the attachments for the current post we are processing.
      3. If the current post has attachments, then:
        1. We save each attachment to the local file system, using the path specified on the main form.  To prevent filename collisions, we create a new subfolder for each post, using SharePoint’s numeric post ID as the folder name.  (So if we listed C:\export as the folder for embedded images on the main form, we would end up with something like C:\export\<post_id>\capture.png.)
        2. For each attachment, we populate the <attachment /> node of the BlogML output.
        3. We parse the body of the current post and replace every path we find pointing to the old attachment location with the new one.  E.g., links to embedded content in the SharePoint blog are referenced via “/Lists/Posts/Attachments/<post_ID>/<filename>”, whereas once we import the content into WordPress, the path will be something like “/wp-content/uploads/<post_ID>/<filename>”.  By updating the relative paths to our embedded content during the export, we can better ensure that our embedded content will transfer seamlessly.  This isn’t the case with most BlogML exports, because they simply export the content without updating embedded links to match the target platform.
        4. We cycle through and repeat for each attachment for the current post.
      4. We write the post content to the output file.
      5. For each category listed in the Posts recordset, we write a <Tag /> to the output file (to match Wayne’s requirements). 
      6. We open a third recordset which contains all of the comments for the current post.  If the current post has comments, then:
        1. We call a helper function to format the comment date how the XML file wants it.
        2. We write the current comment to the output file.
        3. We cycle through & repeat for each remaining comment for the current post.
      7. We cycle through and repeat for each post
    3. We write the closing tags to the output file and complete the process.
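
The actual heavy lifting was VBA behind the Access form, walking recordsets over the linked SharePoint lists; I haven’t reproduced that code here.  Below is a minimal Python sketch of the same loop, with stand-in post dictionaries in place of the recordsets and deliberately simplified element names – this is not the full BlogML 2.0 schema, and every name in it is illustrative:

```python
# Minimal sketch of the export loop described above. Stand-in dictionaries
# replace the Access recordsets, and the XML is a simplified BlogML-like
# structure, not the real BlogML 2.0 schema.
import os
import xml.etree.ElementTree as ET

EXPORT_DIR = r"C:\export"  # image folder, as entered on the main form

def export_blog(posts, xml_path):
    """Write a simplified BlogML-style export for a list of post dicts."""
    blog = ET.Element("blog")
    posts_el = ET.SubElement(blog, "posts")
    for post in posts:
        post_el = ET.SubElement(posts_el, "post", {
            "id": str(post["id"]),
            "date-created": post["date"],  # assume already formatted
        })
        ET.SubElement(post_el, "title").text = post["title"]
        body = post["content"]

        # Save each attachment into a per-post subfolder (collision-proof)
        # and rewrite the embedded links in the post body to the new path.
        attachments_el = ET.SubElement(post_el, "attachments")
        folder = os.path.join(EXPORT_DIR, str(post["id"]))
        for name, data in post["attachments"].items():
            os.makedirs(folder, exist_ok=True)
            with open(os.path.join(folder, name), "wb") as fh:
                fh.write(data)
            old = "/Lists/Posts/Attachments/%s/%s" % (post["id"], name)
            new = "/wp-content/uploads/%s/%s" % (post["id"], name)
            ET.SubElement(attachments_el, "attachment", url=new)
            body = body.replace(old, new)

        ET.SubElement(post_el, "content").text = body

        # Write each SharePoint category out as a tag (Wayne's requirement).
        tags_el = ET.SubElement(post_el, "tags")
        for category in post["categories"]:
            ET.SubElement(tags_el, "tag", ref=category)

        # Write the comments for the current post.
        comments_el = ET.SubElement(post_el, "comments")
        for comment in post["comments"]:
            c_el = ET.SubElement(comments_el, "comment",
                                 {"date-created": comment["date"]})
            c_el.text = comment["text"]

    ET.ElementTree(blog).write(xml_path, encoding="utf-8",
                               xml_declaration=True)
```

A hypothetical single-post run, just to show the shape of the input:

```python
posts = [{
    "id": 1234,
    "date": "2009-09-01T12:00:00",
    "title": "Hello world",
    "content": '<img src="/Lists/Posts/Attachments/1234/capture.png" />',
    "attachments": {"capture.png": b"\x89PNG..."},  # raw image bytes
    "categories": ["SBS"],
    "comments": [],
}]
export_blog(posts, "BlogML.xml")
```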

So in the end, we have one BlogML.xml file with all of the blog content (posts, comments, categories, tags, etc.), and a folder that contains all of the exported embedded images from the SharePoint blog posts (in separate sub-folders by post ID).

At this point, Wayne simply had to copy the embedded images subfolders to his /wp-content/uploads folder for his site, then run the BlogML import process to import the content generated.  Voila! 

For example:

Post on SharePoint blog:  SBS 2008 R2 I want it now!

Migrated post on WordPress blog:  SBS 2008 R2 I want it now!

Notice the embedded image is displayed as expected – and if you look at the image properties on the WordPress post, you’ll see it references the relative path to the image on the WordPress site.  In addition, the original post date & author info have been maintained, as have all comments, with their original comment dates & author info.

One important caveat to share: when I first started testing the import process, the BlogML import in WordPress was failing with an error that invalid characters were encountered on line X at column Y.  However, opening the BlogML export file in Notepad, WordPad, or IE, I couldn’t see anything that appeared to be an invalid character.  Finally, opening the BlogML output file in Visual Studio 2008 allowed me to see the invalid characters, and I was able to remove them with a simple find/replace in Visual Studio, after which the import process in WordPress completed successfully.  I’m not sure what caused these characters to be present in the first place – perhaps the authoring tool Wayne used, or perhaps something in the export process, pulling from SharePoint to Access to text.
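
If you hit the same error, one way to scrub the file is to strip everything outside the XML 1.0 valid character ranges – effectively what my manual find/replace in Visual Studio did.  A minimal sketch (the file names are just examples):

```python
# Remove characters that are illegal in XML 1.0 (anything other than tab,
# LF, CR, and the normal Unicode ranges). This automates the manual
# find/replace described above; the file names are placeholders.
import re

INVALID_XML_CHARS = re.compile("[^\x09\x0A\x0D\x20-\uD7FF\uE000-\uFFFD]")

with open("BlogML.xml", "r", encoding="utf-8", errors="replace") as fh:
    cleaned = INVALID_XML_CHARS.sub("", fh.read())

with open("BlogML.clean.xml", "w", encoding="utf-8") as fh:
    fh.write(cleaned)
```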

But anyway, that was one of my recent side-projects.  Let me know what you think – the Access database I used was rather rough around the edges and relied on a number of assumptions I was able to make which resulted in certain options being hard-coded.  If there is enough interest, I’ll polish it up a bit and post it for others to use to export their SharePoint blogs to BlogML.