Category Archives: Uncategorized

Updated blog server to BlogEngine.NET 3.1

Last night I upgraded this blog server to BlogEngine.NET 3.1. I used the new built-in automated update tool, on an offline backup copy of course.

It did most of the job without any issues. The only extra things I needed to do were:

  • Remove a <add name="XmlRoleProvider" …> entry in the web.config. I have had to do this before on every install.
  • Run the SortOrderUpdate.sql script to add the missing column and index (see issue 12543)

Once done and tested locally I uploaded the tested site to my production server. One point to note: the upgrade creates some backup ZIPs of your site before it runs; you don't need to copy these around as they are large.


Source: Rfennell

Swapping the Word template in a VSTO project

We have recently swapped the Word template we use to make sure all our proposals and other documents are consistent. The changes are all cosmetic (fonts, footers etc.) to match our new website; it still makes use of the same VSTO automation to do much of the work. The problem was that I needed to swap the .DOTX file within the VSTO Word add-in project: we had not been editing the old .DOTX template in the project, but had created a new one, based on a copy, outside of Visual Studio.

To swap in the new .DOTX file for the VSTO project I had to…

  • Copied the new TEMPLATE2014.DOTX file to the project folder
  • Opened the VSTO Word add-in .CSPROJ file in a text editor and replaced all occurrences of the old template name with the new one, e.g. TEMPLATE.DOTX for TEMPLATE2014.DOTX
  • Reloaded the project in Visual Studio 2013; there should be no errors, and the new template is listed in place of the old
  • However, when I tried to compile the project I got a DocumentAlreadyCustomizedException. I did not know this, but the template in a VSTO project needs to be a copy with no association with any VSTO automation. The automation links are applied during the build process, which makes sense when you think about it. As we had edited a copy of our old template outside of Visual Studio, our copy already had the old automation links embedded. These needed to be removed; the fix was to
    • Open the .DOTX file in Word
    • On the File menu > Info > Right click on Properties (top right) to get the advanced list of properties

    • Delete the _AssemblyName and _AssemblyLocation custom properties
    • Save the template
    • Open the VSTO project in Visual Studio and you should be able to build the project
  • The only other thing I had to do was make sure my VSTO project was the start-up project for the solution. Once this was done I could F5/debug the template and VSTO combination

Source: Rfennell

Using MSDEPLOY from Release Management to deploy Azure web sites

Whilst developing our new set of websites we have been using MSDeploy to package up the websites for deployment to test and production Azure accounts. These deployments were triggered directly from Visual Studio. Now we know this is not best practice (you don't want developers shipping to production from their development PCs), so I have been getting around to migrating these projects to Release Management.

I wanted to minimise change, as we like MSDeploy; I just wanted to pass the extra parameters to allow a remote deployment, as opposed to a local one, using the built-in WebDeploy component in Release Management.

To do this I created a new component based on the WebDeploy tool. I then altered the arguments to

__WebAppName__.deploy.cmd /y /m:"__PublishUrl__" -allowUntrusted /u:"__PublishUser__" /p:"__PublishPassword__" /a:Basic



With these three extra publish parameters I can target the deployment to an Azure instance, assuming WebDeploy is installed on the VM running the Release Management deployment client.


The required values for these parameters can be obtained from the .PublishSettings file you download from your Azure web site's management page. If you open this file in a text editor you can read the values needed (the publishUrl, userName and userPWD attributes of the MSDeploy profile).


<publishData>
  <publishProfile profileName="SomeSite - Web Deploy" publishMethod="MSDeploy" publishUrl="somesite.scm.azurewebsites.net:443" msdeploySite="SomeSite" userName="$SomeSite" userPWD="m1234567890abcdefghijklmnopqrstu" destinationAppUrl="http://somesite.azurewebsites.net" SQLServerDBConnectionString="" mySQLDBConnectionString="" hostingProviderForumLink="" controlPanelLink="http://windows.azure.com">
    <databases/>
  </publishProfile>
  <publishProfile profileName="SomeSite - FTP" publishMethod="FTP" publishUrl="ftp://site.ftp.azurewebsites.windows.net/site/wwwroot" ftpPassiveMode="True" userName="SomeSite\$SomeSite" userPWD="m1234567890abcdefghijklmnopqrstu" destinationAppUrl="http://somesite.azurewebsites.net" SQLServerDBConnectionString="" mySQLDBConnectionString="" hostingProviderForumLink="" controlPanelLink="http://windows.azure.com">
    <databases/>
  </publishProfile>
</publishData>


These values are used as follows


  • WebAppName – the name of the MSDeploy package; this is exactly the same as for a standard WebDeploy component.
  • PublishUrl – we need to add https:// to the start and /MsDeploy.axd to the end of the URL, e.g. https://somesite.scm.azurewebsites.net:443/MsDeploy.axd
  • PublishUser – e.g. $SomeSite
  • PublishPassword – this is set as an encrypted parameter so it cannot be viewed in the Release Management client, e.g. m1234567890abcdefghijklmnopqrstu

On top of these parameters, we can still pass in extra parameters to transform the web.config using the setparameters.xml file, as detailed in this other post, allowing us to complete the configuration for the various environments in our pipeline.
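If you script your release setup, pulling these values out of the .PublishSettings file saves hand-editing. A minimal sketch in Python (the helper name is mine, and it assumes the MSDeploy profile format shown above):

```python
import xml.etree.ElementTree as ET

# Hypothetical helper: pull the three Release Management parameter values
# out of a .PublishSettings file downloaded from the Azure portal.
def msdeploy_values(publish_settings_xml):
    root = ET.fromstring(publish_settings_xml)
    for profile in root.iter("publishProfile"):
        if profile.get("publishMethod") == "MSDeploy":
            return {
                # publishUrl needs https:// and /MsDeploy.axd adding
                "PublishUrl": "https://%s/MsDeploy.axd" % profile.get("publishUrl"),
                "PublishUser": profile.get("userName"),
                "PublishPassword": profile.get("userPWD"),
            }
    raise ValueError("no MSDeploy profile found")
```

Feeding it the file contents returns the three values ready to paste into the Release Management component.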


Source: Rfennell

Moving our BlogEngine.NET server to Azure

As part of our IT refresh we have decided to move this BlogEngine.NET server from a Hyper-V VM in our office to an Azure website.

BlogEngine.NET is now a gallery item for Azure websites, so a few clicks and you should be up and running.


However, if you want to use SQL as opposed to XML as the datastore you need to do a bit more work. This process is well documented in the video ‘Set BlogEngine.NET to use SQL provider in Azure’, but we found we needed to perform some extra steps due to where our DB was coming from.

Database Fixes

The main issue was that our on premises installation of BlogEngine.NET used a SQL 2012 availability group. This, amongst other things, adds some extra settings that stop the ‘Deploy Database to Azure’ feature in SQL Management Studio from working. To address these issues I did the following:

I took a SQL backup of the DB from our production server and restored it to a local SQL 2012 Standard edition. I then tried the Deploy to Azure option


But got the errors I was expecting


There were three types of error:

Error SQL71564: Element User: [BLACKMARBLE\AUser] has an unsupported property AuthenticationType set and is not supported when used as part of a data package.
Error SQL71564: Element Column: [dbo].[be_Categories].[CategoryID] has an unsupported property IsRowGuidColumn set and is not supported when used as part of a data package.
Error SQL71564: Table Table: [dbo].[be_CustomFields] does not have a clustered index.  Clustered indexes are required for inserting data in this version of SQL Server.

The first was fixed by simply deleting the listed users in SQL Management Studio, or via the query

DROP USER [BLACKMARBLE\AUser]

The second was addressed by removing the ‘IsRowGuidColumn’ property in Management Studio


or via the query

ALTER TABLE dbo.be_Categories ALTER COLUMN CategoryID DROP ROWGUIDCOL

Finally I had to replace the non-clustered index with a clustered one. I got the required definition from the setup folder of our BlogEngine.NET installation, and ran the commands

DROP INDEX [idx_be_CustomType_ObjectId_BlogId_Key] ON [dbo].[be_CustomFields]

CREATE CLUSTERED INDEX [idx_be_CustomType_ObjectId_BlogId_Key] ON [dbo].[be_CustomFields]
(
    [CustomType] ASC,
    [ObjectId] ASC,
    [BlogId] ASC,
    [Key] ASC
)

Once all this was done in Management Studio I could Deploy DB to Azure, and after a minute or two I had a BlogEngine.NET DB on Azure

Azure SQL Login

The new DB did not have any user accounts associated with it, so I had to create one.

On the SQL server's master DB I ran

CREATE LOGIN usrBlog WITH password='a_password';

And then on the new DB I ran

CREATE USER usrBlog FROM LOGIN usrBlog;
EXEC sp_addrolemember N'db_owner', usrBlog

Azure Website

At this point we could have created a new Azure website using the BlogEngine.NET template in the gallery. However, I chose to create an empty site as our version of BlogEngine.NET (3.x) is newer than the version in the Azure gallery (2.9).

Due to the history of our blog server we have a non-default structure; the BlogEngine.NET code is not in the root. We retain some folders with redirection to allow old URLs to still work. So via an FTP client we created the following structure, copying up the content from our on premises server

  • site\wwwroot – the root site, we have a redirect here to the blogs folder
  • site\wwwroot\bm-bloggers – again a redirect to the blogs folder, dating back to our first shared blog
  • site\wwwroot\blogs – our actual server, this needs to be a virtual application

    Next I set the virtual application in the Configure section for the new website, right at the bottom of the page


    At this point I was back in line with the video, so I needed to link our web site to the DB. This is done using the Link button on the Azure web site's management page. I entered the credentials for the new SQL DB and the DB and web site were linked. I could then get the connection string for the DB and enter it into the web.config.


  • Unlike in the video, the only edit I needed to make was to the connection string, as all the other edits had already been made for the on premises SQL


    Once the revised web.config was uploaded the site started up, and you should be seeing it now


    Source: Rfennell

    Publishing more than one Azure Cloud Service as part of a TFS build

    Using the process in my previous post you can get a TFS build to create the .CSCFG and .CSPKG files needed to publish a cloud service. However, you hit a problem if your solution contains more than one cloud service project (as opposed to a single cloud service project with multiple roles, which is not a problem).

    The method outlined in the previous post drops the two files into a Packages folder under the drops location. The .CSPKG files are fine, as they have unique names. However, there is only ever one ServiceConfiguration.cscfg: whichever one was created last.

    Looking in the cloud service projects I could find no way to rename the ServiceConfiguration file. It seems to be like an app.config or web.config file, i.e. its name is hard coded.

    The only solution I could find was to add a custom target that is set to run after the Publish target. This was added to the end of each .CCPROJ file using a text editor, just before the closing </Project>

     <Target Name="CustomPostPublishActions" AfterTargets="Publish">
        <Exec Command="IF '$(BuildingInsideVisualStudio)'=='true' exit 0
        echo Post-PUBLISH event: Active configuration is: $(ConfigurationName) renaming the .cscfg file to avoid name clashes
        echo Renaming the .CSCFG file to match the project name $(ProjectName).cscfg
        ren $(OutDir)Packages\ServiceConfiguration.*.cscfg $(ProjectName).cscfg
        " />
      </Target>
       <PropertyGroup>
        <PostBuildEvent>echo NOTE: This project has a post publish event</PostBuildEvent>
      </PropertyGroup>

     


    Using this I now get unique names for the .CSCFG files as well as for the .CSPKG files in my drops location, all ready for Release Management to pick up.
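For illustration, the effect of the ren command in the custom target can be sketched outside MSBuild; this Python function (name hypothetical) mirrors the rename logic:

```python
import glob
import os

# Illustration only: mimic the 'ren' in the custom target, renaming the
# single ServiceConfiguration.*.cscfg in a package folder to <project>.cscfg.
def rename_cscfg(package_dir, project_name):
    matches = glob.glob(os.path.join(package_dir, "ServiceConfiguration.*.cscfg"))
    if len(matches) != 1:
        raise RuntimeError("expected exactly one .cscfg, found %d" % len(matches))
    target = os.path.join(package_dir, project_name + ".cscfg")
    os.rename(matches[0], target)
    return target
```

Running this once per cloud service project gives each drop a .cscfg named after its project, which is exactly what the custom target achieves per build.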


    Notes:


    • I echo out a message in the post build event too, just as a reminder that I have added a custom target that cannot be seen in Visual Studio, so is hard to discover
    • I use an IF test to make sure the commands are only run on the TFS build box, not on a local build. The main reason for this is that the path names are different for local builds as opposed to TFS builds. If you do want a rename on a local build you need to change the $(OutDir)Packages path to $(OutDir)app.publish. However, it seemed more sensible to let the default behaviour occur when running locally

    Source: Rfennell

    Deploying a Windows service with Release Management

    I recently needed to deploy a Windows service as part of a Release Management pipeline. In the past, for our internal systems, I have only needed to deploy DBs (via SSDT DACPACs) and websites (via MSDeploy), so this was a new experience.

    WIX Contents

    The first step was to create an MSI installer for the service. This was done using WiX, with all the fun that usually entails. The key part was a component to do the actual registration and starting of the service

    <Component Id ="ModuleHostInstall" Guid="{3DF13451-6A04-4B62-AFCB-731A572C12C9}" Win64="yes">
       <CreateFolder />
       <Util:User Id="ModuleHostServiceUser" CreateUser="no" Name="[SERVICEUSER]" Password="[PASSWORD]" LogonAsService="yes" />
       <File Id="CandyModuleHostService" Name ="DataFeed.ModuleHost.exe" Source="$(var.ModuleHost.TargetDir)ModuleHost.exe" KeyPath="yes" Vital="yes"/>
       <ServiceInstall Id="CandyModuleHostService" Name ="ModuleHost" DisplayName="Candy Module Host" Start="auto" ErrorControl="normal" Type="ownProcess"  Account="[SERVICEUSER]" Password="[PASSWORD]" Description="Manages the deployment of Candy modules" />
       <ServiceControl Id="CandyModuleHostServiceControl" Name="ModuleHost" Start="install" Stop="both" Wait="yes" Remove="uninstall"/>
    </Component>

    So nothing that special here, but it is worth remembering that if you miss out the ServiceControl block the service will not be automatically started, or uninstalled with the MSI's uninstall


    You can see that we pass in the service account used to run the service as a property. This is an important technique for using WiX with Release Management: you will want to be able to pass in, as a parameter, anything you may want to change at installation time. This means we ended up with a good few properties, such as

      <Property Id="DBSERVER" Value=".\sqlexpress" />
      <Property Id="DBNAME" Value="CandyDB" />
      <Property Id="SERVICEUSER" Value="Domain\serviceuser" />
      <Property Id="PASSWORD" Value="Password1" />

    These tended to equate to app.config settings. In all cases I tried to set sensible default values so in most cases I could avoid passing in an override value.


    These property values were then used to re-write the app.config file after the copying of the files from the MSI onto the target server. This was done using the Util:XmlFile element and some XPath, e.g.

    <Util:XmlFile Id="CacheDatabaseName" 
    Action="setValue"
    Permanent="yes"
    File="[#ModuleHost.exe.config]"
    ElementPath="/configuration/applicationSettings/DataFeed.Properties.Settings/setting[\[]@name='CacheDatabaseName'[\]]/value" Value="[CACHEDATABASENAME]" Sequence="1" />
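The setValue transformation can be prototyped outside WiX to check the XPath first. A sketch using Python's ElementTree, with the setting name from the snippet above and everything else illustrative:

```python
import xml.etree.ElementTree as ET

# Sketch: the equivalent of the Util:XmlFile setValue above, rewriting the
# <value> of one applicationSettings entry in an app.config style document.
def set_app_setting(config_xml, setting_name, new_value):
    root = ET.fromstring(config_xml)  # root is <configuration>
    node = root.find(
        "applicationSettings/DataFeed.Properties.Settings"
        "/setting[@name='%s']/value" % setting_name)
    if node is None:
        raise KeyError(setting_name)
    node.text = new_value
    return ET.tostring(root, encoding="unicode")
```

Testing the path here first is much quicker than debugging a failed MSI install on the target server.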
     

    Command Line Testing


    Once the MSI was built it could be tested from the command line using the form

    msiexec /i Installer.msi /Lv msi.log SERVICEUSER="domain\svc_acc" PASSWORD="Password1" DBSERVER="dbserver" DBNAME="myDB" …..

    I soon spotted a problem. As I was equating properties with app.config settings I was passing in connection strings and URLs, so the command line got very long very quickly. It was really unwieldy to handle.


    A check of the log file I was creating, msi.log, showed that the command line seemed to be truncated. This seemed to occur at around 1000 characters. I am not sure if this was an artefact of the logging or of the command line itself, but either way it was a good reason to try to shorten the property list.
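Given that truncation, it is worth measuring the assembled command line before running it. A hedged sketch (the helper is hypothetical, and the 1000 character limit is what I observed rather than a documented figure):

```python
# Sketch: assemble an msiexec command line from a property dictionary and
# flag anything approaching the length at which truncation was observed.
def build_msiexec_cmd(msi, properties, limit=1000):
    parts = ["msiexec", "/i", msi, "/Lv", "msi.log"]
    parts += ["%s=%s" % (k, v) for k, v in sorted(properties.items())]
    cmd = " ".join(parts)
    return cmd, len(cmd) <= limit
```

A check like this in a deployment script gives an early warning before MSIEXEC silently loses properties off the end.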


    I therefore decided that I would not pass in whole connection strings, but just the parts that might change; this is especially effective for connection strings to things such as Entity Framework. This meant I did some string building in WiX during the transformation of the app.config file, e.g.

    <Util:XmlFile Id='CandyManagementEntities1'
       Action='setValue'
       ElementPath='/configuration/connectionStrings/add[\[]@name="MyManagementEntities"[\]]/@connectionString'
       File='[#ModuleHost.exe.config]' Value='metadata=res://*/MyEntities.csdl|res://*/MyEntities.ssdl|res://*/MyEntities.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=[DBSERVER];initial catalog=[DBNAME];integrated security=True;MultipleActiveResultSets=True;App=EntityFramework&quot;' />
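The string building done in the WiX fragment boils down to composing the full Entity Framework connection string from the two values that actually vary. A sketch (the model name is hypothetical) showing why only DBSERVER and DBNAME need passing in:

```python
# Sketch: compose the full Entity Framework connection string from the two
# values that actually vary between environments (server and database name).
def ef_connection_string(db_server, db_name, model="MyEntities"):
    inner = ("data source=%s;initial catalog=%s;integrated security=True;"
             "MultipleActiveResultSets=True;App=EntityFramework"
             % (db_server, db_name))
    return ("metadata=res://*/%(m)s.csdl|res://*/%(m)s.ssdl|res://*/%(m)s.msl;"
            "provider=System.Data.SqlClient;"
            "provider connection string=\"%(inner)s\""
            % {"m": model, "inner": inner})
```

Everything except the server and database name is boilerplate, so it can live in the installer rather than on the command line.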

    This technique had another couple of advantages


    • It meant I did not need to worry about spaces in strings, so I could lose the quotes in the command line. Turns out this is really important later.
    • As I was passing in just a ‘secret value’, as opposed to a whole URL, I could use the encryption features of Release Management to hide certain values

    It is at this point I was delayed for a long time. You have to be really careful when installing Windows services via an MSI that your service can actually start. If it cannot, you will get errors saying “… could not be installed. Verify that you have sufficient privileges to install system services”. This is probably not really a rights issue; more likely some configuration setting is wrong, so the service has failed to start. In my case it was down to an incorrect connection string (stray commas and quotes) and a missing DLL that should have been in the installer. You often end up working fairly blind at this point, as Windows services don't give much information when they fail to load. Persistence, the SysInternals tools and comparing against the settings/files on a working development PC are the best options


    Release Management Component


    Once I had a working command line I could create a component in Release Management. On the Configure Apps > Components page I already had an MSI Deployer, but this did not expose any properties. I therefore copied this component to create an MSI deployer specific to my new service installer and started to edit it.


    All the edits were on the deployment tab, adding the extra properties that could be configured.




    Note: it might be possible to do something with the pre/post deployment configuration variables, as we do with MSDeploy, allowing the MSI to run and then editing the app.config afterwards. However, given that MSI service installers tend to fail if they cannot start the new service, I think passing the correct properties into MSIEXEC is a better option. It also means it is consistent for anyone using the MSI via the command line.


    On the Deployment tab I changed the Arguments to

    -File ./msiexec.ps1 -MsiFileName "__Installer__" -MsiCustomArgs 'SERVICEUSER="__SERVICEUSER__" PASSWORD="__PASSWORD__" DBSERVER="__DBSERVER__" DBNAME="__DBNAME__" ….'

    I had initially assumed I needed the quotes around property values. Turns out I didn't, and due to the way Release Management runs the component they made matters much, much worse: MSIEXEC kept failing instantly. If I ran the command line by hand on the target machine it actually showed the help dialog, so I knew the command line was invalid.


    Turns out the issue is that Release Management calls PowerShell.exe to run the script, passing in the arguments. This in turn calls a PowerShell script which does some argument processing before starting a process to run MSIEXEC.exe with some parameters. You can see there are loads of places where the escaping and quoting of parameters could get confused.


    After much fiddling, swapping single quotes for double quotes, I realised I could just forget most of the quotes. I had already edited my WiX package to build complex strings, so the actual values were simple, with no spaces. Hence my command line became

    -File ./msiexec.ps1 -MsiFileName "__Installer__" -MsiCustomArgs "SERVICEUSER=__SERVICEUSER__ PASSWORD=__PASSWORD__ DBSERVER=__DBSERVER__ DBNAME=__DBNAME__ …."

    Once this was set my release pipeline worked, resulting in a system with DBs, web services and a Windows service all up and running.


    As is often the case it took a while to get this first MSI running, but I am sure the next one will be much easier.


    Source: Rfennell

    PowerShell Summit Europe 2014

    I find I am spending more time with PowerShell these days, as we aim to automate more of our releases, specifically with DSC in PowerShell 4, as I am sure many of us are.

    Given that fact, the PowerShell Summit Europe 2014 at the end of the month looks interesting. I only found out about it too late, and I have diary clashes, but it might be of interest to some of you. It looks like a really good hands-on event.


    Source: Rfennell

    Got around to updating my Nokia 820 to WP81 Update 1

    I had been suffering from the 0x80188308 error when I tried to update my Nokia 820 to WP81 Update 1, because I had the developer preview installed. I had been putting off what appeared to be the only solution, a reset as discussed in the forums, as it seemed a bit drastic; I thought I would wait for Microsoft to sort out the process. I got bored waiting…

    Turns out, as long as you do the backup first, it is fairly painless; it took about an hour of uploads and downloads over WiFi

    1. Created a manual backup of the phone: Settings>backup>apps+settings>backup now.
    2. Reset the phone to factory settings (DP 8.1), leaving any SD card alone: Settings>about>reset your phone.
    3. When prompted logged in with the same ID as used for the backup
    4. Restored the phone using the backup just created.  
    5. Reconnected to all of the other accounts and let the phone download all of the apps.
    6. Signed back into the Preview for Developers app – else you won’t see the updates!
    7. The update came down without a problem as one large package

    Let's have a go with a UK-aware version of Cortana…


    Source: Rfennell


    Getting ‘… is not a valid URL’ when using Git TF Clone

    I have been attempting to use the Git TF technique to migrate some content between TFS servers. I needed to move a folder structure that contains spaces in folder names from a TPC that also contains spaces in its name. So I thought my command line would be

    git tf clone "http://tfsserver1:8080/tfs/My Tpc" "$/My Folder" oldrepo --deep

    But this gave the error

    git-tf: "http://tfsserver1:8080/tfs/My Tpc" is not a valid URL

    At first I suspected it was the quotes I was using, as I had had problems here before, but swapping single quotes for double quotes made no difference.


    The answer was to use the URL-encoded form %20 for the space, so this version of the command worked

    git tf clone http://tfsserver1:8080/tfs/My%20Tpc "$/My Folder" oldrepo --deep

    Interestingly, you don't need to use %20 for the folder name.
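The %20 substitution is just standard URL percent-encoding, which you can reproduce with any URL library. A quick Python sketch (helper name mine):

```python
from urllib.parse import quote

# Percent-encode the collection name so spaces become %20; the folder path
# argument is handled by the shell quoting instead.
def tfs_collection_url(server, collection):
    return "http://%s:8080/tfs/%s" % (server, quote(collection))
```

This suggests git-tf validates the collection argument as a URL (where a space is illegal), while the server path argument is just a string.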


    Source: Rfennell