Monthly Archives: August 2009

Download IE8 and MSFT will donate $1.15 to Feeding America (charity)

It doesn’t matter which flavour you prefer when you browse the big interweb; today you can still contribute to helping out.


Microsoft is donating $1.15 for each download of IE8 from here – if you’re upgrading from IE6 they’ll double the donation.


The donation is given to Feeding America.


microsoft-logo logo-temp


 


So, contribute in your own way – and it costs you nothing – by clicking here to download IE8.

ADO.NET Data Services v.1.5 CTP2

The Astoria team has released the next CTP of ADO.NET Data Services 1.5. Compared with the previous release, CTP2 represents a substantial amount of work, and it is already clear that ADO.NET Data Services is becoming an even more powerful platform. Overall, Astoria’s development is clearly aimed at closer integration with Silverlight and .NET Framework 3.5 SP1.

What’s new in CTP2:

  • Projection. A new feature that appears for the first time in this CTP. It lets you build more complex projections with ADO.NET Data Services and is available both on the server and in the client library (see the example after this list).
  • Data Binding. Binding to WPF and Silverlight user interfaces was already present in the previous CTP; this release includes a small refactoring that makes binding even simpler.
  • Row count. No functional changes; bugs have been fixed.
  • Enhanced LOB Support. Support has been added to the client library.
  • Feed customization. The set of elements that can be customized has been expanded.
  • Server driven paging. Support has been added to the client library.
  • Request Pipeline. Requests can now be handled at points along the processing chain using an event model (similar to the ASP.NET event model), which opens up even more ways to extend a service’s functionality.
  • New “Service Provider” interface. The existing provider interface has been refactored.
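
As a rough illustration of projections (my own example, not from the announcement, using the familiar Northwind sample service): the new $select query option lets a request ask for just a subset of an entity’s properties, for instance

GET /Northwind.svc/Customers?$select=CompanyName,City

which returns Customer entries carrying only the CompanyName and City properties.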

In addition, a number of bugs have been fixed. With that, the final release of ADO.NET Data Services v1.5 is one CTP closer.

It has also been stated that all of these version 1.5 features will become part of .NET Framework 4.

Source: http://blogs.msdn.com/astoriateam/archive/2009/08/31/ado-net-data-services-v1-5-ctp2-now-available-for-download.aspx

Tech.Ed AU 2009 – Sara Ford fan club

A good friend of mine, Mike, is even more of a social butterfly than I am. So, since he’s totally excited about going to Tech.Ed this year (who isn’t???) and because he’s a geek, he kicked off the Sara Ford Fan Club at Tech.Ed AU 2009.


After talking things over a few times, a Geek Meet-Up has been organised for Tuesday evening, which Sara Ford will also be attending – check out the LinkedIn Tech.Ed Australia Group for details:


Tech.Ed Australia (Teched)


Lastly, we talked about getting some buttons made up – and Mike came through with them.


sffc


So, if you, like many many others, are a fan of Sara Ford, seek us out at Tech.Ed and get your own :)

KiXtart 2010 4.61 RC2 Released!

Available on myITforum.com, of course.

It’s available for download from the Scripting downloads page:

http://www.myitforum.com/downloads/default.asp?w=3&se=Scripting

 

Or, grab it with this direct download link:

KiXtart 2010 4.61 RC2

[VSTS 2010] Installing Team Foundation Server 2010

Note: this installation procedure was carried out with TFS 2010 Beta 1.

As promised back in June in the Visual Studio 2010 installation post – time and work pressures kept me away from my blog for a while, but all of that has been overcome and I am writing again. For those who have been using previous versions of Team Foundation Server (2005/2008), installing Team Foundation Server 2010 will come as a relief: configuring the different products is much more transparent, which can be considered one of TFS 2010’s new features, along with the ability to define several team project collections, each of which can contain several team projects, much the way sites are managed in SharePoint. This makes administrative tasks easier.

I did the installation in a virtualized environment, with a domain controller on Windows Server 2003 and a second server running Windows Server 2008, SQL Server 2008 and Microsoft Office SharePoint Server 2007.

Before installing, bring up the IIS service and install SQL Server 2008 and MOSS 2007.

Installing TFS

First, run the Team Foundation Server 2010 installer and accept the license terms.

DevSrv-2009-05-26-13-04-56

Select the components to install; in this case we select all of them:

DevSrv-2009-05-26-13-05-05

When the installation finishes, check the box to launch the TFS configuration tool.

DevSrv-2009-05-26-13-37-18

 

Installing the SharePoint extensions

 DevSrv-2009-05-26-20-21-11

 

 DevSrv-2009-05-26-20-22-43

 

DevSrv-2009-05-26-20-23-05

 

DevSrv-2009-05-26-20-26-30

Configuring Team Foundation Server

 

In the configuration window, select the custom configuration.

 DevSrv-2009-05-26-20-28-57

 

Start the wizard.

DevSrv-2009-05-26-20-10-51

 

Welcome screen of the custom configuration wizard.

DevSrv-2009-05-26-20-11-07

Configuration Database

Enter the name of the database server and its instance.

 DevSrv-2009-05-26-20-11-31

Application Tier

Configure the service account, and the web site, port and virtual directory for the TFS services.

DevSrv-2009-05-26-20-12-23

 

Reporting Configuration

Check the box to configure Reporting Services for use with TFS.

DevSrv-2009-05-26-20-12-38

 

Enter the name of the Reporting Services server and select the report server and report manager URLs.

 DevSrv-2009-05-26-20-12-49

 

Enter the name of the server that hosts Analysis Services for the TFS cube.

DevSrv-2009-05-26-20-12-59 

 

Enter the account that Reporting Services will use to generate the TFS reports.

DevSrv-2009-05-26-20-13-33

 

Enable the option to configure SharePoint at this point.

 DevSrv-2009-05-26-20-13-42

 

Configure the URL where the team sites will be located and the URL of the Central Administration tool.

 DevSrv-2009-05-26-20-56-30

 

Create a default team project collection.

DevSrv-2009-05-26-20-56-39

 

Summary view of the configuration entered in the previous steps.

DevSrv-2009-05-26-20-56-47

 

Start of the configuration process.

 DevSrv-2009-05-26-20-56-56

Completion of the configuration process.

 DevSrv-2009-05-26-21-01-57

    DevSrv-2009-05-26-21-02-20

    DevSrv-2009-05-27-14-25-36

 

Enabling the Single Sign-On (SSO) service

Note: the SSO service must be configured for the Excel Services reports and for the dashboard to display correctly.

In SQL Server, create a login for the SSO service account.

DevSrv-2009-05-27-12-19-21

 

Assign it the appropriate permissions.

DevSrv-2009-05-27-12-19-30

 

From the services administrator in Server Manager, configure the Microsoft Single Sign-On service.

DevSrv-2009-05-27-12-21-08

 

Once the service’s logon account is configured, start the service.

DevSrv-2009-05-27-12-21-22

Open the MOSS 2007 Central Administration tool.

DevSrv-2009-05-27-12-24-10

Verify that the SSO service account is a member of the MOSS 2007 farm administrators group (to reach the following window, select People and Groups from Site Actions).

DevSrv-2009-05-27-12-24-38

 

Go back to the Operations tab and select “Manage settings for single sign-on” in the Security Configuration section; in the next window, select the option to manage server settings.

DevSrv-2009-05-27-12-28-13

 

Configure the service account and the SSO users group (note: the SSO service logon account must belong to this group).

DevSrv-2009-05-27-12-29-19

Then back up the encryption key using the “Manage encryption key” option.

DevSrv-2009-05-27-12-34-23

 

Then create a new definition by clicking “Manage settings for enterprise application definitions”.

DevSrv-2009-05-27-14-09-53

To finish the SSO configuration, go to the Shared Services Provider that owns the web application that will host the team sites and add that URL to the Excel Services trusted file locations. Download the reference document.

 

Testing the creation of a Team Project

We connect from a client machine with VS 2010, create a Team Project (much as in previous versions) and open the team site:

           DevSrv-2009-05-28-12-22-30

Note: Team Foundation Web Access is integrated into the team site through SharePoint web parts.

Interview with Tim Mintner « The Realm of the Verbal Processor

 

As promised last week, here is the interview I had with Tim Mintner at TechEd North America in Los Angeles.

Interview with Tim Mintner « The Realm of the Verbal Processor

BlogJet 2.0.1.4 Released

 

We are happy to announce a small bug-fix release of BlogJet. It fixes connection issues with some Blogger blogs, and includes an updated extension for Firefox 3.x.

BlogJet 2.0.1.4 Released

Configuring Internet Facing Deployment for Microsoft CRM 4.0

After running the tool to configure the CRM website for an Internet Facing Deployment (see the articles here and here) we got the old familiar runtime error: Server Error in '/' Application. We turned on remote errors and saw that the underlying error message was even less helpful: “Object reference not set to an instance of an object.”

After some digging, it turned out there was a registry key on the CRM server that held the wrong identifier for the CRM website in IIS.

The ‘website’ registry key at HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSCRM had a value of ‘/LM/W3SVC/1’ when it should have been ‘/LM/W3SVC/5’.

You can see the identifier for the CRM website in this screen capture from IIS:




All we had to do was navigate to the registry key and change the value to match the identifier we saw in IIS.
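
As an illustration only (the key path and site identifier are the ones described above; adjust them to whatever IIS shows in your environment), the value can be checked and corrected from an elevated command prompt with reg.exe:

REM Check the current value
reg query "HKLM\SOFTWARE\Microsoft\MSCRM" /v website

REM Point CRM at the web site identifier shown in IIS (5 in our case)
reg add "HKLM\SOFTWARE\Microsoft\MSCRM" /v website /t REG_SZ /d "/LM/W3SVC/5" /f

An IIS reset afterwards is a sensible precaution so the CRM application picks up the change.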

NTBackup Script Procedure

1)      Copy robocopy.exe from the Resource Kit to something like c:\backup\tools. This is used in step 5 below.


2)      Create a backup selection list using the GUI version of NTBackup. Save a system-state-only selection as night.bks and save all the drives plus the system state as night-all.bks. Make sure you use the backup directory path listed below in the script, or modify it to a SAN location.


3)      Use the following for the required .bat file:


C:\backup\tools\nightlybackup.bat:

            *** BEGIN BAT FILE ***

@ECHO OFF
REM
REM  Build Date String yyyymmddhhmm
REM
for /F "usebackq tokens=2" %%i IN (`DATE /T`) DO Set TIMESTR=%%i
for /F "usebackq tokens=1,2,3 delims=/" %%i IN ('%TIMESTR%') DO SET MONTH=%%i
for /F "usebackq tokens=1,2,3 delims=/" %%i IN ('%TIMESTR%') DO SET DAY=%%j
for /F "usebackq tokens=1,2,3 delims=/" %%i IN ('%TIMESTR%') DO SET YEAR=%%k
for /F "usebackq" %%i IN (`Time /T`) DO Set TIMESTR=%%i
for /F "usebackq tokens=1,2* delims=:" %%i IN ('%TIMESTR%') DO SET HOUR=%%i
for /F "usebackq tokens=1,2* delims=:" %%i IN ('%TIMESTR%') DO SET MINUTE=%%j
SET DIRSTR=%YEAR%%MONTH%%DAY%%HOUR%%MINUTE%
SET TIMESTR=
SET MONTH=
SET DAY=
SET YEAR=
SET HOUR=
SET MINUTE=
@ECHO Directory String %DIRSTR%

IF NOT EXIST "c:\backup\tools\nightlybackup.bat" GOTO SKIP

C:\WINDOWS\system32\ntbackup.exe backup "@\\DOMAINCONTROLLERNAME\c$\Backup_DCNAME\night-all.bks" /n "DCNAME.bkf" /d "Nightly" /v:yes /r:no /rs:no /hc:off /m normal /j "Nightly" /l:s /f "\\DOMAINCONTROLLERNAME\C$\Backup_DCNAME\%DIRSTR%DCNAME.bkf"

EXIT

:SKIP

C:\WINDOWS\system32\ntbackup.exe backup "@\\DOMAINCONTROLLERNAME\c$\Backup_DCNAME\night.bks" /n "DCNAME.bkf" /d "SS_Nightly" /v:yes /r:no /rs:no /hc:off /m normal /j "SS_Nightly" /l:s /f "\\DOMAINCONTROLLERNAME\C$\Backup_DCNAME\%DIRSTR%DCNAME.bkf"

EXIT

            *** END BAT FILE ***


 


Comments and legend for the above batch file.


 


First I turn off echo.


The next section builds the current date string; we will use this to sort the backups and delete ones older than 7 days.


The IF NOT EXIST statement is useful if you store anything on a SAN drive. If the SAN is down the backup would normally fail, so I check first and, if the path is missing, just run night.bks to back up only the system state.


The first backup command backs up DOMAINCONTROLLERNAME (replace with a real name) and creates a new file called DCNAME.bkf (replace with the server name). I label the backup Nightly, turn verification on, and leave restricted access off – you may want it, in which case /r:yes turns it on. Hardware compression is turned off, the backup type is set to normal, the job name is set for backup reporting, logging is set to summary, and finally the file name is created with the date and time stamp at the front (again, for sorting and deleting older files later).


 


4)      To schedule a task to automatically back up a machine, go to Start -> Control Panel -> Scheduled Tasks -> Add Scheduled Task. During the configuration the following options are selected:


 


Scheduled Task Wizard: click Next.

Application: click Browse, select C:\backup\tools\nightlybackup.bat and click Open.

Type a name for this task: Nightly. Perform this task: Daily. Click Next.

Start Time: 3:15 AM. Perform this task: Every Day. Start date: (today). Click Next.

Enter the user name, enter the password and confirm the password. Click Next.

Leave “Open advanced…” unchecked and click Finish.


 


5)      Finally I use robocopy to clean up, so the backup won’t fill up the disk.


c:\backup\tools\robocopy c:\backup c:\backupt *.bkf /MOV /MINAGE:10
rd /q /s c:\backupt


I use Robocopy to look at c:\backup and create a c:\backupt folder (t for temporary). Anything that ends with .bkf and is at least 10 days old is moved to the temporary (c:\backupt) folder, and the rd command then quietly removes c:\backupt and its subdirectories.


 


Enjoy, let me know if you have any questions or issues.

And another nice feature for Windows 7

HomeGroup certainly makes home networking easier; especially when the PC is part of a domain as well, it just works. No more fiddling with rights between home accounts and domain users.

Update on Media Center on Windows 7

Since upgrading my Media Center PC to Windows 7 I have had a few problems with fast forward on recorded TV and DVDs (which I had not seen on Vista). It was as if the fast forward button on my remote got jammed on and I could not get back to standard playback easily; it took a few seconds for the command to get through, and then I ended up back where I started.

I decided it was probably a CPU processing speed issue, so I upgraded my 3-year-old AMD/ASUS single-core motherboard to a current entry-level dual-core system, an MSI motherboard with an Intel E5300 Dual Core (the brand choice came down to what was cheap and in stock at my local supplier). This fixed the issue completely, but it did require a reinstall of Windows 7, as the Intel dual core needed a different HAL to the AMD single core. The reinstall was not a major issue, however, as I run a dedicated PC as a Media Center and it is practically a default installation.

Recording of my SQLBits Session on Visual Studio 2008

A webcast recording of my SQLBits IV session ‘Making the SQL developer one of the family with Visual Studio Team System’ is now available on the SQLBits site. It discusses the features of the VS2008 Database GDR edition.

Unfortunately I will not be proposing a session for this year’s SQLBits community event on the 21st of November 2009 at Celtic Manor in Newport, as I will be travelling back from the Microsoft PDC in LA.

SQLBitsLogo

GetType and TypeOf confusion

Both VB and C# have an operator called TypeOf (typeof in C#), but the two do completely different things.

In VB there are also two kinds of GetType() calls: the Object.GetType() method, which is part of the .NET Framework, and the VB-specific GetType() operator.

Are you confused yet? Don’t worry – in this article I will try to explain the difference between these operators and the Object.GetType() method.

The TypeOf, typeof, GetType operators

The VB TypeOf operator is used together with the Is keyword to check whether an object is of a particular type.

result = TypeOf x Is String


If “x” above is a string then “result” will be True; otherwise it is set to False. This operator has existed in VB since long before .NET was born. The C# equivalent of the TypeOf operator is simply called the is operator.



result = x is string;



The typeof operator in C#, on the other hand, returns an instance of the System.Type class describing the type you pass to it.



Type t = typeof(string);


The VB equivalent of the C# typeof operator is the GetType operator.



Dim t As Type = GetType(String)


The reason for the different names is simply that TypeOf was already a reserved keyword in VB, and I guess the C# team didn’t have VB in mind when they designed C#. :)



The difference between Object.GetType and the GetType operator



On a trivial level, the Object.GetType() method operates on an object instance while the GetType (and C#’s typeof) operator operates on a type.



Dim s As String
Dim t As Type = s.GetType()
Dim t2 As Type = GetType(String)



There is no difference between “t” and “t2” in the above code. So why do we need both? Well, you might not know the actual type behind a given reference, in which case you obviously can’t use the GetType operator, since it requires that you pass the type itself. Have a look at the following example:



Public Class MyBaseClass
End Class

Public Class MyDerivedClass
  Inherits MyBaseClass
End Class

Module Test
  Public Sub ShowType(ByVal obj As MyBaseClass)
    Dim t As Type = obj.GetType()
    Dim t2 As Type = GetType(MyBaseClass)
    Console.WriteLine(t)
    Console.WriteLine(t2)
  End Sub

  Public Sub Main()
    Dim myObject As New MyDerivedClass
    ShowType(myObject)
    Console.ReadLine()
  End Sub
End Module


In this example the ShowType method takes a MyBaseClass parameter, and t2 uses the GetType operator to get the System.Type representation of MyBaseClass, so writing t2 to the console prints MyBaseClass. However, calling the GetType method on the obj parameter prints MyDerivedClass. It is legal to pass a MyDerivedClass reference to the ShowType method since it inherits from MyBaseClass, so that is a widening conversion. Inside the ShowType method we can never know whether the parameter holds the type we declared it as or some subclass of it; to find out we need the Object.GetType method.



Another difference between the method and the operator shows up when you’re dealing with value types.



Dim i As Integer
Dim t As Type = i.GetType()
Dim t2 As Type = GetType(Integer)



In order to call i.GetType() above, i must first be converted to an Object, which means a boxing conversion has to be done. With the operator no boxing conversion is made, which in this case makes that call faster.



I hope this removes some of the confusion about this subject; if not, please feel free to leave a comment.



Have fun!

VSLP 2.0 (Visual Studio Learning Pack 2.0)

Microsoft Visual Studio Learning Pack 2.0 is a software package created by Microsoft to help students learn programming. It consists of five components designed for learning software programming concepts.

For more information and the download, see Andy Gonzalez’s post: http://blogs.msdn.com/andygon/archive/2009/07/01/vslp-2-0-visual-studio-learning-pack-2-0.aspx

[MOSS 2007 – SSRS 2008] SQL Server Reporting Services integration components do not appear in Application Management in the SharePoint 2007 Central Administration tool

This past week I ran into the same integration problem between MOSS 2007 and SSRS 2008 again; the solution I published some time ago also applies to SSRS 2008.

The solution is at the following link: http://msmvps.com/blogs/ffagas/archive/2008/08/11/moss-2007-ssrs2005-componentes-de-integraci-243-n-de-sql-server-reporting-services-no-aparece-en-la-administraci-243-n-de-aplicaciones-de-la-herramienta-central-de-administraci-243-n-de-sharepoint-2007.aspx

Publishing TFS 2010 (Beta 2) to the internet

Note: this post does not apply to TFS 2010 Beta 1.

Aaron Block (TFS PM) has written a document on how to publish TFS 2010 to the internet, for the community to review and give feedback on.

You can find the document in the following post on his blog.

VSTS Rangers site

On August 22, Willy-Peter Schaub announced that an official MSDN site for the VSTS Rangers is now live. The VSTS Rangers are a group of VSTS experts (Microsoft employees, MVPs, community leaders) involved in countless projects (tools, guidance, documentation) around VSTS, all aimed at improving the experience of Visual Studio Team System users.

The site link is: http://msdn.microsoft.com/en-us/teamsystem/ee358786.aspx

[How to] MsiInstallProduct return code: 1603 – Restoring Team Foundation Server 2008 sites

Problem description


When working with Team Foundation Server 2008, the various products and components that make up the solution can be distributed across different servers. In some cases the sites for the team projects hosted in Team Foundation Server 2008 live on a MOSS 2007 server. Due to an upgrade and a lack of awareness on the part of that server’s administrator, all of the TFS 2008 sites on the MOSS 2007 server can end up deleted without having been backed up; in addition, the upgrade may be to a 64-bit server. From that point on, any attempt to upload a new process guidance, create a Team Project or add documents will produce an error.


Note: this scenario assumes that the SharePoint server name is kept and that a web application is created on the same port used before the sites were deleted.


Solution


The normal procedure would be to reinstall the TFS extensions for SharePoint, but because the server is now 64-bit you first need to download the 64-bit TFS extensions for SharePoint (WssExt64Bit.msi) from the download site; otherwise, when you install the extensions that ship with the TFS 2008 installer, you will get the following error message:


 


errorTFS


And the installer log will report the following error:


[08/31/09,10:59:49] Setup.exe: AddGlobalCustomProperty
[08/31/09,10:59:49] Microsoft Visual Studio 2008 Team Foundation Server SharePoint Extensions – ENU: ***ERRORLOG EVENT*** : ERROR:SharePoint must be installed to install this product.
[08/31/09,10:59:49] Setup.exe: AddGlobalCustomProperty
[08/31/09,10:59:50] Microsoft Visual Studio 2008 Team Foundation Server SharePoint Extensions – ENU: ***ERRORLOG EVENT*** : ACTION FAILURE:Action ended 10:59:49: CA_LaunchCondition_STSADM_Present. Return value 3.See MSI log for details.
[08/31/09,10:59:50] Setup.exe: AddGlobalCustomProperty
[08/31/09,10:59:50] Microsoft Visual Studio 2008 Team Foundation Server SharePoint Extensions – ENU: ***ERRORLOG EVENT*** : ACTION FAILURE:Action ended 10:59:50: INSTALL. Return value 3.See MSI log for details.
[08/31/09,10:59:50] Microsoft Visual Studio 2008 Team Foundation Server SharePoint Extensions – ENU: CRootComponent::Install(): Finished MSI Installation
[08/31/09,10:59:50] Microsoft Visual Studio 2008 Team Foundation Server SharePoint Extensions – ENU: MsiInstallProduct return code: 1603.


 


Note: the link points to the October 2008 release; make sure you get the latest release.


 


When the installation of the TFS extensions for SharePoint finishes, re-enter the previous values in the SharePoint extensions configuration, specifying the following URLs: TFS sites: http://servidorsharepoint:puerto/sites, report manager: http://servidorreportes/reports and report server: http://servidorreportes/reportserver.


After this configuration you will be able to use Team Foundation Server normally again.


For existing Team Projects, one solution is to create the sites from SharePoint Central Administration (note: for security reasons I have removed server addresses).


a. From the MOSS 2007 Central Administration tool, select the web application that will contain the project sites.


CrearTeamProject01


CrearTeamProject02


 


b. Once the application is selected, choose the option to create a site collection.


CrearTeamProject03


c. On the site creation screen, enter the same name as the team project in the Title and URL fields; in the template section, select the template for your process guidance; enter tfssetup as the administrator and click OK.

CrearTeamProject04

Boxing and Unboxing in .Net

In this article I will try to explain the concepts of Boxing and Unboxing. There are two types of objects within the .Net framework, value types and reference types. Value types are stored on the stack while reference types are always stored on the heap. Converting a value type into a reference type is called Boxing while converting it back from a reference type to a value type is called Unboxing.

Value Types

Value types are primitive types like System.Int32 (Integer in VB, int in C#) or System.Double. These types map directly to the Framework Class Library (FCL) and all derive from System.ValueType. Apart from the regular primitive types (integer, long, single, double, char, and so on), Enumerations and Structures also inherit from System.ValueType, so they are also stored on the stack.

Reference Types

All classes are reference types and they are stored on the heap. So when you create a new instance of a class using the new keyword, memory is allocated on the heap for the object and the memory address is returned. Strings are a special case: they are also reference types, even though they are often treated as if they were value types. The difference is that you don’t have to create an instance of a string in the same manner as you would with another class.

Reference types are also Nullable, meaning that you can set the reference to Null (Nothing in VB).

Widening and Narrowing conversions

Before we go into Boxing and Unboxing I would like to take a moment to explain the difference between a widening and a narrowing conversion. A widening conversion is always safe since there is never any risk of data loss. For example, converting an Integer into a Long (System.Int64) is a widening conversion: there is no integer value that doesn’t fit into a Long, so you can always do it. Doing the opposite, converting a Long into an Integer, is riskier, since there is a chance that you will lose data – a Long can hold a number that is too large (or too small) to fit into an Integer.

You can always use an implicit conversion when you do a widening conversion.

Dim i As Integer = 3
Dim l As Long
l = i 'Implicit conversion



However to do the opposite, a narrowing conversion, you need to do an explicit conversion.



i = CType(l, Integer) 'Explicit conversion
'or
i = CInt(l) 'Explicit conversion



If you don’t use an explicit conversion when performing a narrowing conversion you will get a compilation error, so your project will not compile. (Note: VB allows an implicit narrowing conversion if you have Option Strict set to Off, something I strongly advise against; C# will never allow it.) Even with an explicit conversion, if the Long value does not fit into the Integer you will still get a runtime error, since the system will throw a System.OverflowException. So when you do a narrowing conversion you must be sure that it can be done.



Doing Boxing/Unboxing



As mentioned earlier Boxing refers to the conversion of a value type to a reference type.



Dim i As Integer = 3
Dim o As Object
o = i 'Boxing the Integer into an object (reference type)



Above I do an implicit conversion of an integer to an object. This is possible since it’s a widening conversion so there is no risk of any data loss. However doing the opposite, converting the object to an integer is a narrowing conversion, where there is a risk that data will be lost.



i = o 'Implicitly doing a narrowing conversion (not allowed)
i = CInt(o) 'Explicitly doing a narrowing conversion (allowed)



In C#, if you have boxed a particular type you must unbox it to that same type. You cannot box an integer into an object and then unbox it to a long.



int i = 3;
long l;
object o;
o = i;
l = (long)o; //will throw a System.InvalidCastException


For the above to work, you must first unbox it to an integer and then cast it to a long.



int i = 3;
long l;
object o;
o = i;
l = (long)(int)o;



However, VB’s type conversion functions (CType and the more specific CInt, CLng, CDbl and so on) do allow this.



Dim i As Integer = 3
Dim l As Long
Dim o As Object
o = i
l = CLng(o) 'allowed



End words



I hope that by now you have a better understanding of Boxing and Unboxing. Let me end this article by saying that you should avoid it when possible. Type conversion is time consuming, and that is especially true for Boxing and Unboxing, since data has to be transferred back and forth between the stack and the heap. Using Generics, which have existed in .NET since version 2.0, is a much better approach than dealing with value type to reference type conversions – the sketch below illustrates the idea.
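
As a quick, minimal sketch (my own example, not from the article): a non-generic ArrayList stores Object references, so adding an Integer boxes it and reading it back requires an unboxing cast, whereas a List(Of Integer) keeps the values as Integers throughout and avoids both conversions.

Imports System.Collections
Imports System.Collections.Generic

Module BoxingVsGenerics
    Sub Main()
        ' Non-generic collection: every Add boxes the Integer, every read unboxes it.
        Dim boxedList As New ArrayList()
        boxedList.Add(42)                      ' boxing
        Dim a As Integer = CInt(boxedList(0))  ' unboxing

        ' Generic collection: the list is typed, so no boxing or unboxing occurs.
        Dim typedList As New List(Of Integer)()
        typedList.Add(42)
        Dim b As Integer = typedList(0)

        Console.WriteLine(a + b)
    End Sub
End Module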



I will cover Generics in a later article.



Have fun.

INSERT and UPDATE loading practice – The impact of forwarding pointers.

Ever since I’ve been working with databases I have come across the following programming pattern: an INSERT followed by LOTS of UPDATEs. The reason for the pattern is understandable, and is generally that the desired data set is complex to generate from the source database.


For this reason a base set of data is generated and then refined by a number of updates.


I’ve seen this in two main scenarios: integration and warehousing. The former arises when a target system wants something the source system holds in a different structure or at a different level; the latter generally wants calculated data like the number of children, or the name of the last dentist visited.


You might think these should easily be solved by one query, and if only one of these bits of information were required that would be easy. More often than not, however, there are many, and even worse they are dependent, i.e. given the last dentist someone visited, give me the number of people that have sued said dentist.


So, logically, people break this down into manageable chunks and do something like:


 


INSERT INTO CustomerDetails
SELECT somecolumns
  FROM Customers

ALTER TABLE CustomerDetails ADD NumberOfOrders int, NumberOfChildren int, NumberOfPeopleSued int, LastDentistVisited varchar(100)

UPDATE CustomerDetails
   SET NumberOfChildren = (SELECT COUNT(1) FROM ….)

UPDATE CustomerDetails
   SET LastDentistVisited = (SELECT TOP 1 Surname FROM ….)

UPDATE CustomerDetails
   SET NumberOfPeopleSued = (SELECT COUNT(1) FROM ….)
The reason for this post is that this approach is right for 1 reason and wrong for many more.


It is right because it is good practice to break small problems down into manageable understandable chunks as this aids understanding and means code is easier to support in the future.


It is wrong for many reasons of performance and one functional reason. I will cover the latter in a later post but will mention the performance reasons now.


1. Adding the columns afterwards means each row has to be adjusted to fit the new columns. With fixed-width columns this can be very expensive, as data has to be moved about. If you need the extra columns they should be added in the INSERT statement; to get the correct type, use a cast, i.e. CAST(NULL as int) or CAST(NULL as varchar(24)), as sketched below.
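
A minimal sketch of that idea (my illustration, reusing the hypothetical CustomerDetails/Customers example from above): the extra columns are created with the right types by the initial statement itself, so no ALTER TABLE is needed later.

-- somecolumns stands for the original column list from the example above
SELECT somecolumns,
       CAST(NULL AS int)          AS NumberOfOrders,
       CAST(NULL AS int)          AS NumberOfChildren,
       CAST(NULL AS int)          AS NumberOfPeopleSued,
       CAST(NULL AS varchar(100)) AS LastDentistVisited
  INTO CustomerDetails
  FROM Customers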


2. The second reason is that each UPDATE has to touch every row. If you have N updates, each page is touched at least N times, which is not good for performance. If you have to do this, the solution is to try and batch the updates up so they are done in one pass, using derived tables and CASE statements, i.e.


UPDATE <table>
   SET Col1 = CASE WHEN <some criteria>
                   THEN <a new value> ELSE Col1 END
     , Col2 = CASE WHEN <Some other criteria>
                   THEN <some other new value> ELSE Col2 END


3. The final reason is one of forward pointers. Even if you add your additional columns in the INSERT statement, if any of them are variable length they will result in FORWARDED RECORDS when you start doing updates. What’s a forwarded record? It occurs when a row has to move because it no longer fits on the page it was put on to start with. It’s just like when you move house and redirect your mail: SQL Server maintains that your record is still in its original location, but that location now holds a pointer to another location, on another page in your database. Why is this bad? Because SQL Server has to perform two IOs to read the data, and it also has to do more work to update the data. More work equals worse performance.


So let’s have a look at forward pointers.


In this example we are going to create a table, and put 1 row in it. We will then put another row in the table that fills the page.


use tempdb
go
drop table heap
go
create table heap (id int not null, v1 varchar(5000))
go
insert into heap(id,v1)
select 1, ''
go
select object_id, partition_number, alloc_unit_type_desc, record_count, avg_record_size_in_bytes, forwarded_record_count
from sys.dm_db_index_physical_stats(db_id('tempdb'),object_id('heap'),0,null,'detailed')
go
insert into heap(id,v1)
select 2, REPLICATE('#',4000)
go
select object_id, partition_number, alloc_unit_type_desc, record_count, avg_record_size_in_bytes, forwarded_record_count
from sys.dm_db_index_physical_stats(db_id('tempdb'),object_id('heap'),0,null,'detailed')
go
select * from heap
cross apply sys.fn_PhysLocCracker(%%physloc%%)


 


What you should see is that after the first insert the DMV dm_db_index_physical_stats reports 1 row and no forwarded records. After the second insert we get 2 rows and no forwarded records. Note that we are also using an undocumented SQL 2008 feature: %%physloc%% and fn_PhysLocCracker. Both are undocumented but very handy, as they show the physical location of a row – the file, page and slot. There is a scalar function as well. Thanks to Paul Randal for the pointer: http://www.sqlskills.com/blogs/paul/post/SQL-Server-2008-New-(undocumented)-physical-row-locator-function.aspx . What we will see is that this returns the original location and not any forwarded location.


Now, if we update the first row so it’s too big to fit on the page, what do you think happens?


update heap
set v1= REPLICATE('@',5000)
where id = 1
go
select * from heap
cross apply sys.fn_PhysLocCracker(%%physloc%%)
go
select object_id, partition_number, alloc_unit_type_desc, record_count, avg_record_size_in_bytes, forwarded_record_count
from sys.dm_db_index_physical_stats(db_id('tempdb'),object_id('heap'),0,null,'detailed')


 


Well, you will see that we now have a forwarded record, but %%physloc%% still says it is in the same place. If we use DBCC IND we can see that the table now has 3 pages: an allocation page and two data pages.


dbcc ind(tempdb, heap,1)


go


dbcc traceon (3604)
go
dbcc page (tempdb,1,???,3)
dbcc page (tempdb,1,???,3)


Using DBCC PAGE we can look at the original page, and you will see something like this (note the ???: change these to the page numbers returned from DBCC IND where PageType = 1).


Slot 0 Offset 0x101a Length 9


 


Record Type = FORWARDING_STUB        Record Attributes =                  Record Size = 9


 


Memory Dump @0x0000000011A1B01A


 


0000000000000000:   04000900 00010000 00†††††††††††††††††..     ……       


Forwarding to  =  file 1 page 2304 slot 0


This highlights that the row has moved to page 2304 (your page will probably be different)


If we look at that page (which will be the second DBCC PAGE above) you will find your row.


So what happens if we move house again – what does SQL Server do? To check it out we need to add two rows: one fills up the first page and one fills up the second page. If we then update the first row so that it’s too big for its new page, it should move.


 


insert into heap(id,v1) select 3, REPLICATE('#',4000)
insert into heap(id,v1) select 4, REPLICATE('#',1000)
go
select * from heap
cross apply sys.fn_PhysLocCracker(%%physloc%%)
go
select object_id, partition_number, alloc_unit_type_desc, record_count, avg_record_size_in_bytes, forwarded_record_count
from sys.dm_db_index_physical_stats(db_id('tempdb'),object_id('heap'),0,null,'detailed')
go
update heap
set v1= REPLICATE('@',8000)
where id = 1
go
select * from heap
cross apply sys.fn_PhysLocCracker(%%physloc%%)
go
select object_id, partition_number, alloc_unit_type_desc, record_count, avg_record_size_in_bytes, forwarded_record_count
from sys.dm_db_index_physical_stats(db_id('tempdb'),object_id('heap'),0,null,'detailed')
go
dbcc ind(tempdb, heap,1)


Doing the inserts we will see that the 4th row now appears on the second page while the other three appear to be on the first page. After the update nothing changes and the forwarded record count is still 1. So what’s happened?


Well, if we look at the first page again using DBCC PAGE, you will find that the page number the forward pointer points to has changed to a new page. If you run DBCC IND you will see you now have a third data page. Interestingly, if you look at the second page, you will see that slot 0, where the old forwarded record resided, is no longer shown; that’s because there is nothing in it. Slot 0 is now free.


So the bottom line is that if you are resorting to such an approach – an INSERT and then UPDATEs – forwarded records can cause you a problem, not just when reading but also when updating, because data has to be moved. If you have to use this approach then you are best to preallocate space in the table; this ensures that the row doesn’t move and performance will improve. I’ve seen 50% improvements in systems by allocating defaults to varchar columns using REPLICATE, for instance, along the lines of the sketch below.
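
A minimal sketch of that preallocation idea (my illustration, again using the hypothetical CustomerDetails example): the varchar column is seeded with padding at its full width, so a later update only ever writes a value no wider than the space already reserved and the row never has to move.

-- Preallocate the variable-length column at its maximum width up front
SELECT somecolumns,
       CAST(NULL AS int)                         AS NumberOfPeopleSued,
       CAST(REPLICATE(' ', 100) AS varchar(100)) AS LastDentistVisited
  INTO CustomerDetails
  FROM Customers

-- The later update replaces the padding with a value of at most 100 characters,
-- so no forwarding pointer is created
UPDATE CustomerDetails
   SET LastDentistVisited = (SELECT TOP 1 Surname FROM ….)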


So what should you do?


Well, whilst you might be able to write one query that achieves your goal, it is probably only supportable by the person that wrote it or someone from MENSA. So breaking the problem up is a good thing, but the use of updates is a bad thing. You will probably find that it is quicker to perform 5 insert statements and then join them all together rather than 1 INSERT and 4 UPDATEs. If you use SELECT INTO it will be a bulk-logged operation; you can then add a clustered index on your key, which should allow the final query to be an efficient MERGE JOIN.


If you can, you could use INSERT INTO with the CLUSTERED INDEX already in place and use trace flag 610 http://blogs.msdn.com/sqlserverstorageengine/archive/2008/03/23/minimal-logging-changes-in-sql-server-2008-part-2.aspx – this would normally be the quickest option. A rough sketch of the SELECT INTO shape is below.
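
A rough, hypothetical sketch of that shape (the Children and Lawsuits tables and the CustomerId key are my inventions for illustration): each calculated piece is produced by its own SELECT INTO, a clustered index is added on the shared key, and a single final query joins the pieces together.

-- Each piece is generated independently (SELECT INTO is bulk-logged)
SELECT CustomerId, COUNT(1) AS NumberOfChildren
  INTO #Children
  FROM Children
 GROUP BY CustomerId

CREATE UNIQUE CLUSTERED INDEX ix_Children ON #Children (CustomerId)

SELECT CustomerId, COUNT(1) AS NumberOfPeopleSued
  INTO #Sued
  FROM Lawsuits
 GROUP BY CustomerId

CREATE UNIQUE CLUSTERED INDEX ix_Sued ON #Sued (CustomerId)

-- One final pass joins the pieces; with the indexes in place the optimizer
-- can choose an efficient MERGE JOIN rather than updating rows in place
SELECT c.CustomerId, ch.NumberOfChildren, s.NumberOfPeopleSued
  INTO CustomerDetails
  FROM Customers c
  LEFT JOIN #Children ch ON ch.CustomerId = c.CustomerId
  LEFT JOIN #Sued s ON s.CustomerId = c.CustomerId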


PS. One nice feature in SQL 2008 is that ALTER TABLE <heap> REBUILD removes forward pointers.






