Thursday, March 31, 2011

Free WP7 ebook

Everyone likes getting things for free. So here is a free ebook, “Silverlight for Windows Phone”, written by Puja Pramudya from the Microsoft Innovation Center, Indonesia. The ebook is written for those who want to get to know, use, and develop applications for Windows Phone, Microsoft’s latest mobile platform.

The download link and more information here.

I haven’t had time to read it myself yet, but it’s on my list…


Wednesday, March 30, 2011

Problems using the .NET transactionscope with DB2 on a 64 bit machine

To solve the growing need for more memory, we decided to upgrade all our development machines to a 64-bit OS. Although this allows us to install and use more memory on our machines, it also introduced a whole list of new problems.

One of them concerned the use of the .NET TransactionScope in combination with DB2. With DB2, managing your transactions through a TransactionScope immediately involves the DTC coordinator. However, the moment the DTC tries to open a transaction, it fails with the following error message:

The XA Transaction Manager attempted to load the XA resource manager DLL. The call to LOADLIBRARY for the XA resource manager DLL failed: DLL=C:\Program Files\IBM\SQLLIB\BIN\DB2APP.DLL, HR=0x800700c1, File=d:\w7rtm\com\complus\dtc\dtc\xatm\src\xarmconn.cpp Line=2446.

After trying almost every possible solution, we finally found one that worked for us. When installing the DB2 client, the installer doesn’t correctly register the 64-bit DLLs in the registry.

The registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSDTC\XADLL incorrectly referenced db2app.dll instead of db2app64.dll. As a consequence MSDTC tries to load a 32-bit DLL on a 64-bit machine, which fails. After changing the reference to the 64-bit DLL, everything worked.
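As a sketch, the fix can be expressed as a .reg file (the value name and the install path are assumptions — check what your DB2 client actually registered under the XADLL key before merging anything):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSDTC\XADLL]
"DB2APP.DLL"="C:\\Program Files\\IBM\\SQLLIB\\BIN\\DB2APP64.DLL"
```

You may also need to restart the MSDTC service afterwards so it picks up the new mapping.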

More information in this thread on the DB2 forums.

Tuesday, March 29, 2011

TFS 2010: Cleaning test attachments

After we started using Visual Studio Test Professional for managing and executing our test scenarios, I noticed that our TFS databases grew a lot faster in size.

This is not completely unexpected when you consider the large amount of data you can capture while running your tests:

  • Videos
  • Screenshots
  • IntelliTrace files
  • System information
  • Event Log data
  • Test Impact data

I was looking for some possible solutions when I noticed that Microsoft already released a Visual Studio Power tool that tackles this problem.

With the Test Attachment Cleaner, TFS administrators can:

1. Determine which set of diagnostic captures is taking up how much space, and

2. Reclaim the space for runs which are no longer relevant from a business perspective.

Usage is simple through the command line, which makes it easy to run it as a scheduled task on your TFS server. Some sample scenarios are also included in the download package.
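As a sketch, a scheduled cleanup run might look something like this. The tool path, switch names and settings file below are assumptions — verify them against the documentation that ships with the Power Tool. The snippet only builds and echoes the command so you can inspect it before wiring it into a scheduled task:

```shell
# Hypothetical Test Attachment Cleaner invocation -- adjust the path and
# switches to match your TFS installation before scheduling it.
TCMPT="C:/Program Files/Microsoft Team Foundation Server 2010 Power Tools/tcmpt.exe"
COLLECTION="http://tfsserver:8080/tfs/DefaultCollection"

CMD="\"$TCMPT\" attachmentcleanup /collection:$COLLECTION /teamproject:MyProject /settingsfile:DeleteOldRuns.xml /mode:delete"

# A real scheduled task would execute the command; here we only print it.
echo "$CMD"
```

The settings file is where you describe which runs and attachment types are old enough to delete; the sample scenarios in the download package are a good starting point.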

More information about the Test Attachment Cleaner on Grant Holliday’s blog.

Monday, March 28, 2011

WPF Inspector

Debugging databinding and the visual tree of a WPF application can be hard. Last week I found WPF Inspector on CodePlex.

“WPF Inspector is a utility that attaches to a running WPF application to troubleshoot common problems with layouting, databinding or styling. WPF Inspector allows you to explore a live view of the logical- and visual tree, read and edit property values of elements, watch the data context, debug triggers, trace styles and much more.”

It allows you to attach to your running WPF applications and start browsing through a lot of available information like properties, the datacontext, triggers and so on… This simplifies debugging a lot.

A must have for every WPF developer out there.

Inspecting logical and visual tree

Saturday, March 26, 2011

Windows Azure Toolkit for Windows Phone 7

Last week Microsoft released the Windows Azure Toolkit for Windows Phone 7, downloadable here. The new toolkit helps ease building Windows Phone 7 applications that use services running on Windows Azure.

The toolkit provides you with:

  • Binaries for your Windows Phone 7 applications
  • Project templates to optimize new phone application creation
  • Sample applications in both C# and VB.NET
  • A dependency checker that checks the prerequisites required by the toolkit
  • Setup and configuration documentation, toolkit content review, a getting started walkthrough, and troubleshooting tips

Get started now!

Friday, March 25, 2011

Windows Azure TCO Calculator

One of the things most of my customers find challenging when moving to the cloud is the pricing model. They have a hard time figuring out what the cost of moving to a cloud platform like Windows Azure would be. Therefore Microsoft introduced the Windows Azure platform TCO calculator: in 10 minutes or less, you’ll see how Windows Azure compares to on-premises solutions, quantify migration costs, and get a pricing overview.


Thursday, March 24, 2011

Visual Studio 2010 Code Map extension

A colleague recommended the following Visual Studio extension to me: VS10x Code Map.

“VS10x Code Map is a Visual Studio 2010 extension that displays a graphical nested representation of the current code editor window (C# and VB.NET). It helps the developer visualize type nesting, implemented interfaces, regions, member type and scope, as well as quickly navigate to their respective positions in code.”

Some of the features it offers:

  • Code Visualization
  • Code Collapsing and Sync
  • Persistent History
  • Highlighting Favorite Items
  • Named Bookmarks
  • Filtering
  • Color Themes

Wednesday, March 23, 2011

Entity Framework 4: Disable change tracking

The Entity Framework ObjectContext implements the unit of work pattern and does all the change tracking for you. However, sometimes you just need to get some read-only data (for example, in a CQRS architecture). In that case change tracking is just an extra performance penalty.

How to Disable the Change Tracking Option

In Entity Framework, change tracking is done by the ObjectStateManager, which maintains object state and identity management for entity type instances and relationship instances. To disable change tracking you use the NoTracking value of the MergeOption enumeration. Using that option will retrieve entities in a detached state.

There are 2 places where you can specify this option:

One place to disable change tracking is when you retrieve data with the ObjectQuery object. One of the overloaded constructors of ObjectQuery takes the MergeOption enumeration as a parameter.

var query = new ObjectQuery<Employee>(
    "SELECT VALUE e FROM NorthwindEntities.Employee AS e",
    context,
    MergeOption.NoTracking);

Another place to disable the change tracking is on the EntitySet level.

context.Employee.MergeOption = MergeOption.NoTracking;

I still prefer the more explicit approach of the NHibernate stateless session. But that’s just me…

Tuesday, March 22, 2011

The last app in the Cloud…

Too funny to ignore (although my girlfriend didn’t get the joke):

Monday, March 21, 2011

Entity Framework 4: Unable to load the specified metadata resource

After some refactoring, my integration tests started to fail with the following error message:
“Unable to load the specified metadata resource.”
I found out that the error was raised when I initialized the Entity Framework data context. It looked like EF couldn’t find my metadata files (the .csdl, .msl and .ssdl files). So where did they go?

I opened up Reflector and browsed to the DLL where my data context was stored. I noticed that the names of the resource files were prefixed with an extra DB part. This was because I had moved the context file to a new subfolder inside the Visual Studio project. After I updated the connection string, all tests were green again.

<connectionStrings>
  <add name="EventsEntities" connectionString="metadata=res://*/DB.EventsModel.csdl|res://*/DB.EventsModel.ssdl|res://*/DB.EventsModel.msl;provider=..." />
</connectionStrings>

Sunday, March 20, 2011

The InvokeProcess workflow activity

With the new build engine based on Workflow 4 in Team Foundation Server 2010, creating build definitions and automatic builds became a lot easier. One of the most useful activities is the InvokeProcess activity. This activity offers the following important properties:

  • Arguments: the argument values you want to pass to the invoked process
  • FileName: name of the process you want to execute
  • WorkingDirectory: location from which the process should be invoked.


One feature that makes debugging a lot easier is that you can set up the activity for logging (by default nothing is logged other than the command line invoked). If you expand the activity in the designer, you can see that the standard and error output is captured in two variables (stdOutput and stdError). You can then add the WriteBuildMessage and WriteBuildError activities by dragging them from the toolbox into the handler areas of the InvokeProcess activity.


Saturday, March 19, 2011

Excluding Managed Paths in WSS 3.0

After upgrading to WSS 3.0 I noticed that I could no longer exclude managed paths for directories that you don't want 'managed' by SharePoint. So how do you do this now? The answer is simple: you don’t have to do anything. Just create your virtual directory in IIS and that's it. WSS will simply ignore it.

One extra thing to remember: don’t forget to perform an IISRESET, otherwise SharePoint isn’t aware of the new mapping yet.

Friday, March 18, 2011

Enterprise Library 5 Validation Application block now supports Silverlight

While browsing around, I noticed a new code drop on the EntLib codeplex site containing the Enterprise Library 5.0 Silverlight Integration Pack. Most important features are:

  • port of the Validation Application Block
  • new pieces of the infrastructure to support XAML-based configuration
  • async configuration loading

Note that there is also a conversion tool to facilitate reuse of your existing Enterprise Library configurations by converting them to XAML. It can be plugged into the config tool via a wizard, or you can integrate it into your build process. The conversion tool is still rough, and Microsoft is iterating on it.

Both a demo and a short screencast are available to get you started.

Thursday, March 17, 2011

Windows Azure SDK 1.4

The announcements of new releases continue. Microsoft has updated the Windows Azure SDK to version 1.4 and added some improvements to the Windows Azure Management Portal.

Bug Fixes:

  • Resolved an issue that caused full IIS to fail when the web.config file was set to read-only.
  • Resolved an issue that caused full IIS packages to double in size when packaged.
  • Resolved an issue that caused a full IIS web role to recycle when the diagnostics store was full.
  • Resolved an IIS log file permission issue which caused diagnostics to be unable to transfer IIS logs to Windows Azure storage.
  • Resolved an issue preventing csupload from running on x86 platforms.
  • User errors in the web.config are now more easily diagnosable.
  • Enhancements to improve the stability and robustness of Remote Desktop to Windows Azure roles.

New features:

  • Windows Azure Connect:
    • Multiple administrator support on the Admin UI.
    • An updated Client UI with improved status notifications and diagnostic capabilities.
    • The ability to install the Windows Azure Connect client on non-English versions of Windows.
  • Windows Azure CDN:
    • Windows Azure CDN for Hosted Services: Developers can now use Windows Azure web and VM roles as the "origin" for objects to be delivered at scale via the Windows Azure CDN. Static content in a website can be automatically edge-cached at locations throughout the United States, Europe, Asia, Australia and South America to provide maximum bandwidth and lower-latency delivery of website content to users.
    • Serve secure content from the Windows Azure CDN: A new checkbox option in the Windows Azure management portal enables delivery of secure content via HTTPS through any existing Windows Azure CDN account.

Download the new Windows Azure 1.4 SDK here.

Wednesday, March 16, 2011

New free Windows Azure trial offer

Microsoft has upgraded its Introductory Special offer. It now includes 750 free hours of the Windows Azure extra-small instance, 25 hours of the Windows Azure small instance and more, until June 30, 2011. This extended free trial allows developers to try out the Windows Azure platform without up-front investment costs*.


To see full details of the new Introductory Special and to sign up, please click here.

 

* You still need a credit card to sign up

Tuesday, March 15, 2011

JQuery 1.5.1 released

jQuery 1.5.1 is now out! This is the first minor release on top of jQuery 1.5 and lands a number of bug fixes.

You can download 2 versions of jQuery, one minified and one uncompressed (for debugging or reading).

Monday, March 14, 2011

Visual Studio 2010 SP1 and TFS 2010 SP1 available for download

Last week VS 2010 SP1 and TFS 2010 SP1 have shipped and the Team Foundation Server Project Server Integration Feature Pack is now available for MSDN subscribers. Both service packs are available here:

SP1

SP1 has a handful of nice new features (like IIS Express support and support for the Project Server Feature Pack) and hundreds of bug fixes. You’ll find the list of changes here:

TFS Project Server Integration Feature Pack

Integration between Project Server and Team Foundation Server is a considerable advancement for organizations that want to bridge the collaboration gap between the Project Management Office and software development teams. I blogged about this before.

Sunday, March 13, 2011

Team Foundation Power Tools March 2011 Release

Microsoft released an update to the Team Foundation Server Power Tools. You can download them here:

Backup/Restore Power Tool

The backup/restore Power Tool was already available in the previous Power Tools release, but due to a large number of bugs it was not usable. In this release Microsoft fixed about every bug that was reported.

Windows Shell Extension

Microsoft knows that not everyone uses source control from within Visual Studio. The shell extension allows you to manage your source-controlled files right from Windows Explorer. In this release a lot of new commands were added to the context menu:

  • Workspace
  • History
  • Compare
  • Shelve/Unshelve

Here’s a screen shot showing the new commands:


TFS Build Power Tools

This Power Tools release adds a new builddefinition command to the tfpt command-line Power Tool that allows you to easily script some build management commands.

For more information, consult Brian Harry’s blog.

Saturday, March 12, 2011

Unity InjectionFactory

Although Unity 2.0 has been out for a while, it took me until now to try some of its nice new features. One of the nicest is the InjectionFactory, which replaces the StaticFactoryExtension that was available in Unity 1.2.

A small sample:

using (var container = new UnityContainer())
{
    container.RegisterType<IService, ConcreteService>(
        new InjectionFactory(c => new ConcreteService()));

    var service = container.Resolve<IService>();
}

Friday, March 11, 2011

Cross domain handling on Azure blob storage

This week a customer wanted to access Azure blob storage from their Silverlight and Flash client applications. However, due to the cross-domain restrictions of both technologies, we couldn’t access blob storage out of the box.

Let’s see how we can solve this and enable full access to blob storage from Silverlight and Flash.

ClientAccessPolicy.xml

When a Silverlight application makes a cross-domain call (other than those that are allowed by default), it first fetches a file called ClientAccessPolicy.xml from the root of the target server. For blob storage this will be something like http://{namespace}.blob.core.windows.net/ .

Every blob in Windows Azure storage lives within a container, but there’s a special root container which lets us store blobs directly off the root of the domain. This is where we’ll put our ClientAccessPolicy.xml file. The following code creates a publicly readable root container and creates a blob named ClientAccessPolicy.xml within it:

private void CreateSilverlightPolicy()
{
    var account = new CloudStorageAccount(
        new StorageCredentialsAccountAndKey("account", "key"),
        new Uri("http://sample.blob.core.windows.net"),
        new Uri("http://sample.queue.core.windows.net"),
        new Uri("http://sample.table.core.windows.net"));

    var blobs = account.CreateCloudBlobClient();
    blobs.GetContainerReference("$root").CreateIfNotExist();
    blobs.GetContainerReference("$root").SetPermissions(
        new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });

    var blob = blobs.GetBlobReference("clientaccesspolicy.xml");
    blob.Properties.ContentType = "text/xml";
    blob.UploadText(@"<?xml version=""1.0"" encoding=""utf-8""?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-methods=""*"" http-request-headers=""*"">
        <domain uri=""*"" />
        <domain uri=""http://*"" />
      </allow-from>
      <grant-to>
        <resource path=""/"" include-subpaths=""true"" />
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>");
}

CrossDomain.xml

For Flash the story is almost the same, but instead of a ClientAccessPolicy.xml file we need to add a crossdomain.xml file.

private void CreateFlashPolicy()
{
    var account = new CloudStorageAccount(
        new StorageCredentialsAccountAndKey("account", "key"),
        new Uri("http://sample.blob.core.windows.net"),
        new Uri("http://sample.queue.core.windows.net"),
        new Uri("http://sample.table.core.windows.net"));

    var blobs = account.CreateCloudBlobClient();
    blobs.GetContainerReference("$root").CreateIfNotExist();
    blobs.GetContainerReference("$root").SetPermissions(
        new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });

    var blob = blobs.GetBlobReference("crossdomain.xml");
    blob.Properties.ContentType = "text/xml";
    blob.UploadText(@"<?xml version=""1.0"" encoding=""utf-8""?>
<cross-domain-policy>
  <allow-access-from domain=""*"" />
</cross-domain-policy>");
}

Thursday, March 10, 2011

Windows Azure Training Kit: February Update

Microsoft seems to be continuing its monthly update process. The February 2011 update of the Windows Azure Platform Training Kit is now available for download. This new version includes several updated HOLs with support for the Windows Azure AppFabric February CTP and the new AppFabric Caching, Access Control, and Service Bus portal experience.

Specific updates include:

  • HOL: Building Windows Azure Apps with the Caching service
  • HOL: Using the Access Control Service to Federate with Multiple Business Identity Providers
  • HOL: Introduction to the AppFabric Access Control Service V2
  • HOL: Introduction to the Windows Azure AppFabric Service Bus Futures
  • Improved PHP installer script for Advanced Web and Worker Roles
  • Demo Script - Rafiki PDC Keynote Demo

Wednesday, March 9, 2011

Unlimited load testing (for VS 2010 Ultimate users)

Yesterday Microsoft announced the release of the Load Testing Feature Pack. This gives Visual Studio Ultimate with MSDN users the ability to do unlimited load testing.

Before, you had to buy separate Load Test Packs. This has now all been included in the Ultimate edition (including the load agent software necessary to run a distributed load test simulation), enabling you to load test with as many virtual users as you want.

Thank you, Microsoft!

For more information regarding this new Visual Studio 2010 Ultimate with MSDN benefit, visit the Visual Studio Load Test Virtual User Pack 2010 page.

Friday, March 4, 2011

Run Visual Studio post-build events for debug only

I wanted to do some extra stuff when compiling a Visual Studio project, but only when working in Debug mode. You can easily do this by passing the configuration name to the post-build script and checking it there to decide whether the extra steps should run. You can pass the configuration name with $(ConfigurationName).

A sample:

if "$(ConfigurationName)" == "Debug" (
  copy "$(TargetDir)test.dll" "c:\testrun\" /y
)

Thursday, March 3, 2011

Is business logic a cross-cutting concern?

As a consultant and architect I get to see a lot of different applications and architectures. Most of these applications have some kind of so-called business logic layer: a single layer that claims to handle all business logic by itself. Yet although these architectures claim that all business logic “belongs” to this layer, I always find business logic in at least two places, and sometimes three. Is there something wrong with these architectures, or do we have to conclude that business logic really is a cross-cutting concern?

Let me explain this a little bit more…

A definition of business logic

Let’s state that business logic includes schema (types and constraints), derived values (timestamps, userstamps, calculations, histories), non-algorithmic compound operations (like batch billing) and algorithmic compound operations, those that require looping in their code. This encompasses everything we might do, from the simplest passive things like a constraint that prevents discounts from being over 100% to the most complex hours-long business process, with everything in between accounted for.

A sample application

Consider an admin interface to a database, where the user is entering or modifying prices for the price list. If the user could enter plain text as the price, that would be kind of silly, so of course the numeric inputs only allow numeric values. The same goes for dates.

Now consider the case where the user is typing in a discount rate, and a discount is not allowed to be over 100%. The UI really ought to enforce this, otherwise the user's time is wasted when she enters an invalid value, finishes the entire form, and only then gets an error when she tries to save. In the database world we call this a constraint, so the UI needs to know about constraints to better serve the user.

Is this user allowed to change a price? If not, the button should either be grayed out or not be there at all. The UI needs to know about and enforce some security.

So in fact the UI layer not only knows the logic but is enforcing it. It is enforcing it for two reasons: to improve the user experience with date pickers, lists, and so forth, and to prevent the user from entering invalid data and wasting round trips.

Do you see the business logic leaking in?

You Cannot Isolate What Must Be Duplicated

The UI layer is far less useful unless it also enforces as much logic as possible, and even when we leave the database server as the final enforcer of business logic (types, constraints, keys), it is still often good engineering to do some checks up front to prevent expensive wasted trips to the server.

This explains why a lot of people struggle with where to put the business logic. That struggle and its frustrations come from the mistake of imposing abstract conceptual responsibilities on each tier instead of using each tier for what it does best. Databases are wonderful for types, entity integrity (uniqueness), referential integrity, ACID compliance, and many other things. Use them! Code is often better when the problem at hand cannot be solved with a combination of keys and constraints, but even that code can live in the database or in the application.
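To make the “database as final enforcer” idea concrete, here is a minimal sketch using SQLite from Python (the table and column names are made up for illustration). Whatever the UI layer does or forgets to do, the CHECK constraint in the schema is the last line of defense against an invalid discount:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Schema-level business rule: a discount can never exceed 100%.
conn.execute("""
    CREATE TABLE price_list (
        item     TEXT NOT NULL,
        price    REAL NOT NULL CHECK (price >= 0),
        discount REAL NOT NULL DEFAULT 0
                 CHECK (discount BETWEEN 0 AND 100)
    )
""")

# A valid row goes through...
conn.execute("INSERT INTO price_list VALUES ('widget', 9.99, 25)")

# ...but an invalid discount is rejected by the database itself,
# even if every layer above forgot to validate it.
try:
    conn.execute("INSERT INTO price_list VALUES ('gadget', 4.99, 150)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

A well-behaved UI would still validate the discount before submitting the form; the point is that the two checks duplicate the same rule on purpose.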

Wednesday, March 2, 2011

Improve your HTML(5) websites with @font-face

For a long time web designers were limited to a very small set of “web safe” fonts. Anything beyond those fonts had to be done through images. Images for text not only meant creating and maintaining dozens (if not hundreds) of images, it also introduced accessibility issues.

@font-face to the rescue

With the introduction of @font-face in 1998(!) in the CSS 2 specification, web designers could link to actual font files via CSS:

@font-face
{
    font-family: "ChunkFiveRegular";
    src: url('Chunkfive-webfont.ttf') format("truetype");
}

And then use those fonts in style declarations:

h1
{
    font-family: "ChunkFiveRegular", serif;
}

However, while the specification was there, the browser support wasn’t. On top of that, each browser vendor decided to support different, rarely-used formats. Plus, there were the licensing issues: even if you had supported font formats, that didn’t mean you could legally use those fonts with @font-face.

Browser Support Today

Today all of the latest browsers support @font-face and many more file formats are supported, including the TrueType/OpenType TT (.ttf) and OpenType PS (.otf) formats. Unfortunately, “latest” doesn’t include Internet Explorer before version 9.


                    .ttf    .otf    .woff   .svg              .eot
Chrome              v4+     v4+     v6+
Firefox             v3.5+   v3.5+   v3.6+
Internet Explorer   v9              v9      v9                v4+
Opera               v10+    v10+            v10+
Safari              v3.1+   v3.1+           Mobile iOS 4.1+

Licensing Support Today

Before you can dive into the wonderful world of @font-face, you first need to choose a font. And not just any font, but a font that is licensed for web use.

There are many free and commercial fonts that have web licenses, making it easy for you. Some of the top sites when looking for a properly licensed font:

Some fonts you can download and host yourself; others you can only use as hosted versions. If you already have a font that is licensed for web use, you can save it to your web server and reference the files in your CSS.

Looking for more information and some sample code? Have a look at this post by Emily Lewis.

Tuesday, March 1, 2011

Enable search indexing on your TFS SharePoint team site

A few weeks ago one of my customers sent me the following screenshot, mentioning that they couldn’t search for information on their Team Foundation Server SharePoint site.


In order for search to work, you need to assign search to an indexer. This typically means specifying the name of the server where the indexing service is running. If you have a single-server installation, that is of course that single server.

  • Open the SharePoint Central Administration website.
  • Go to Operations.
    • Click on Services on Server under Topology and Services.
    • Click on Start next to Windows SharePoint Services Search.
      • Enter a service account.
      • Enter a content access account.
      • Leave the default search database selected (or select a new one).
      • Configure the indexing schedule.
      • Click Start.
  • Go to Application Management.
    • Click on Content databases under Web Application Management.
    • Click on the content database.
      • Select the server under Search Server.
      • Click OK to apply the changes.