Thursday, June 30, 2011

Exclude files from the JSLint validation process

JSLint is a must-have tool for every web developer. It’s a JavaScript program that looks for problems in your JavaScript code.

JSLint uses a professional subset of JavaScript, a stricter language than that defined by Third Edition of the ECMAScript Programming Language Standard. The subset is related to recommendations found in Code Conventions for the JavaScript Programming Language.

The first time I ran it on my brand new ASP.NET MVC application, I was confronted with a lot of errors, most of them related to external plugins and the jQuery library. As I am not planning to fix stuff in jQuery, I needed a way to tell JSLint to ignore some files.

Turns out that this is really easy:

(screenshot)

Wednesday, June 29, 2011

Free TFS 2010 and Visual Studio 2010 Webcasts

Too hot to go outside? For the upcoming weeks you don’t have to. Just stay inside and spend some time looking at the following webcasts:

Testing SharePoint - Functional and Performance Testing for SharePoint based applications using Visual Studio 2010

June 29, 2011

During this free one hour webcast we will demonstrate the SharePoint specific testing capabilities of Microsoft Visual Studio 2010 and Team Foundation Server.

More and more SharePoint is being leveraged as an application platform within organizations for both internal and external systems. Visual Studio 2010 and Team Foundation Server 2010 include features specifically designed for testing SharePoint based development efforts.

· Discover new Unit Testing features designed for use in SharePoint projects

· Witness the use of IntelliTrace for advanced debugging of SharePoint for both developers and testers to help find root cause and create actionable bugs

· See how Microsoft Test Manager can be used to manage testing efforts for SharePoint based applications

· Experience creation of Coded-UI automation for testing SharePoint based apps

· Learn how to create Performance Tests in Visual Studio 2010’s Web Test recorder with functionality specifically designed to deal with SharePoint GUIDs and other characteristics of the platform

· Learn best practices for testing SharePoint 2010 using Visual Studio

SCRUM and Microsoft ALM: Using Team Foundation Server 2010 and the Microsoft SCRUM Process Template

July 6, 2011

During this free one hour webcast we will demonstrate the capabilities and use of the new Microsoft SCRUM template for Team Foundation Server 2010.

In 2010 Microsoft released a template for Team Foundation Server 2010 Team Projects designed for use in the SCRUM development methodology. This free template enables organizations practicing SCRUM or a variant thereof to manage TFS Team Projects with minimal customization.

· Discover the features of the new Microsoft SCRUM template and how they could help your Application Lifecycle Management process

· Learn the new work item types included in the SCRUM template such as Sprint and Impediment

· Explore the differences between the SCRUM template and the traditional MSFT Agile template

· See the new workflow included in the template and how it fosters the use of this template for true SCRUM methodology

· See the new reports included in the template for SCRUM projects

· Witness the creation of a SCRUM Team Project and a scenario for use

Customizing Team Foundation Server 2010 Team Projects - Make TFS work for you

July 13, 2011

During this free one hour webcast we will demonstrate the customization capabilities for Team Foundation Server 2010. Team Foundation Server 2010 has a vast array of options for customizing templates, work item types, fields, workflow, reports, notifications, security, builds and more. To implement these options, several tools and techniques are applied. Customizing TFS allows an organization to make the tool work for their specific software development and testing process, rather than having to modify processes to use the tool.

· Imagine fully customizing Team Foundation Server to work with your Application Lifecycle Management process

· Discover the points of customization within TFS

· Learn how to create a custom process template and use this as the basis for your team projects

· Explore custom work item types and learn how to create and share them across projects

· See the creation of new fields on work item forms

· Witness the capabilities to make fields dynamic using workflow and the Process Explorer

· See custom build workflow in action

· Customize alerts to suit your needs for this collaborative environment

Team Foundation Server 2010 Power Tools: An Overview

July 20, 2011

During this free one hour webcast we will demonstrate and outline the features contained in the Team Foundation Server 2010 Power Tools package. Power Tools have traditionally been the way Microsoft adds incremental functionality to Team Foundation Server. Keeping up with the capabilities and implementing Power Tools can give your organization that critical piece of functionality you need to improve efficiency and accomplish your Application Lifecycle Management goals. This presentation outlines the new features included in the initial and September 2010 release of TFS Power Tools.

· See the new Alerts Explorer for customizing alerts based on TFS events

· Explore the new backup power tool and restore wizard for backing up the critical database components of TFS 2010 and restoring via a GUI

· Learn how to apply the Best Practices Analyzer to help create the most efficient TFS ALM process

· Imagine the ability to create additional Check-In Policies with the Check-In policy pack

· Use the Process Explorer to explore and customize project workflow

· See enhancements to Team Explorer including labels and wildcards

· Witness the capabilities of the new Team Foundation Server command line power tool in action

· Explore new PowerShell cmdlets, Windows Shell Extensions and Work Item types

Tuesday, June 28, 2011

Best practices to speed up your website

As mentioned by Jeff Atwood, ‘performance is a feature’. Having a fast and responsive web application really makes the difference. But how do you get there?

The best place to start is to have a look at the performance rules list created by the Exceptional Performance team at Yahoo. They have identified a number of best practices for making web pages fast. The list includes 35 best practices divided into 7 categories.

  • Content
    • Minimize HTTP requests
    • Reduce DNS lookups
    • Avoid redirects
    • Make Ajax cacheable
    • Post-load components
    • Preload components
    • Reduce the number of DOM elements
    • Split components across domains
    • Minimize the number of iframes
    • No 404s
  • Server
    • Use a Content Delivery Network
    • Add an Expires or a Cache-Control Header
    • Gzip components
    • Configure ETags
    • Flush the buffer early
    • Use GET for AJAX requests
    • Avoid empty image src
  • Cookie
    • Reduce cookie size
    • Use cookie-free domains for components
  • CSS
    • Put stylesheets at the top
    • Avoid CSS expressions
    • Make CSS external
    • Minify CSS
    • Choose <link> over @import
    • Avoid filters
  • JavaScript
    • Put scripts at the bottom
    • Make JavaScript external
    • Minify JavaScript
    • Remove duplicate scripts
    • Minimize DOM access
    • Develop Smart Event handlers
  • Images
    • Optimize images
    • Optimize CSS sprites
    • Don’t scale images in HTML
    • Make favicon.ico small and cacheable
  • Mobile
    • Keep components under 25K
    • Pack components into a multipart document

There’s also a tool available called YSlow, which helps you analyze web pages and suggests ways to improve their performance.
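To make one of the Server rules concrete, here is a minimal ASP.NET sketch that adds far-future Expires and Cache-Control headers to a response. The handler name and the 30-day window are just illustrative choices:

using System;
using System.Web;

// Minimal sketch: emit far-future Expires/Cache-Control headers for static-like content.
public class CachedContentHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Cache publicly (browser and proxies) for 30 days.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(30));
        context.Response.Cache.SetMaxAge(TimeSpan.FromDays(30));

        context.Response.ContentType = "text/plain";
        context.Response.Write("Hello from a cacheable response");
    }
}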

Monday, June 27, 2011

Windows Azure: Free Ingress for all Windows Azure Customers

Last week Microsoft announced a change in pricing for the Windows Azure platform. For billing periods that begin on or after July 1, 2011, all inbound data transfers for both peak and off-peak times will be free. This will provide significant cost savings for customers whose cloud applications experience substantial inbound traffic, and customers interested in migrating large quantities of existing data to the cloud.

Thank you, Microsoft.

Friday, June 24, 2011

Impress your colleagues with your knowledge about… Visual Studio wildcards support

Last week I discovered a useful feature in Visual Studio for when you have lots of resources. It’s possible to use wildcards to include files in a project. Imagine you have a lot of files you want to add to your project: you can add them all manually, or you can edit the project file once to pick up all of the files and never worry about it again.

How do you do this?

  • Right click on the project in Visual Studio and choose ‘Unload Project’.
  • Right click on the unloaded project and choose ‘Edit Project’
  • Add an <ItemGroup/> block with the following:
<ItemGroup>
  <Content Include="Content\Images\*.png">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
  • Save and reload the project. All matching files will be included automatically.

I know that you get the same behaviour by clicking “Show All Files” and selecting the files you want to add, but it’s still neat.

Thursday, June 23, 2011

TFS Build: Cannot add custom activity to build workflow

After creating a custom Workflow 4 activity, I was planning to add it to our existing build process template. However, after adding the activity to the toolbox in Visual Studio, I couldn’t drop the activity on the workflow designer.

The reason is that the workflow designer is not able to find the assembly containing the custom activity. There are multiple solutions out there to solve this problem. Probably the easiest one is adding the assembly to the GAC. Two other alternatives are:

  1. Adding the process template to a project and referencing the assembly from this project.
  2. Adding the assembly to a location where Visual Studio will find it (e.g.  ..\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PublicAssemblies).
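For reference, such a custom activity is typically just a CodeActivity decorated with the BuildActivity attribute. Here is a minimal sketch, not the actual activity from this post: the class name and its argument are made up, and it assumes the TrackBuildMessage helper from Microsoft.TeamFoundation.Build.Workflow.Activities:

using System.Activities;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Build.Workflow.Activities;

// Minimal sketch of a custom TFS 2010 build activity; the name and argument are illustrative.
[BuildActivity(HostEnvironmentOption.All)]
public sealed class WriteGreetingToBuildLog : CodeActivity
{
    // An input argument that can be bound from the build process template designer.
    public InArgument<string> Name { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        string name = context.GetValue(this.Name);

        // TrackBuildMessage writes an informational message to the build log.
        context.TrackBuildMessage("Hello, " + name, BuildMessageImportance.High);
    }
}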

If you want to know more about customizing the build process template, I can recommend the following blog series by Ewald Hofman:

  1. Part 1: Introduction
  2. Part 2: Add arguments and variables
  3. Part 3: Use more complex arguments
  4. Part 4: Create your own activity
  5. Part 5: Increase AssemblyVersion
  6. Part 6: Use custom type for an argument
  7. Part 7: How is the custom assembly found
  8. Part 8: Send information to the build log
  9. Part 9: Impersonate activities (run under other credentials)
  10. Part 10: Include Version Number in the Build Number
  11. Part 11: Speed up opening my build process template
  12. Part 12: How to debug my custom activities
  13. Part 13: Get control over the Build Output
  14. Part 14: Execute a PowerShell script
  15. Part 15: Fail a build based on the exit code of a console application

Wednesday, June 22, 2011

Automating your deployment process using Team Foundation Server

You already have a build server up and running? Your code gets compiled and tested in a continuous way? You now want to take this one step further and automate your deployment? Let me introduce you to TFS Deployer:

TFS Deployer is a free tool available on CodePlex. It must be installed as an agent in your test and production environments and supports the execution of PowerShell scripts when an event happens in TFS. In particular, it listens to build quality change notifications, which occur when you change the quality of a build under the Team Builds node of Team Explorer.

How does it work?

When TFS Deployer starts up, the service subscribes to the build quality change event notification (1). When the release manager updates the build quality indicator in the build store via Team Explorer (2), the build quality change event is fired (3). The TFS event service then looks up who is subscribed to the event (4) and notifies the subscribers, in this case TFS Deployer (5). You can have one or more TFS Deployer installations listening across multiple target machines. When TFS Deployer is notified, it doesn’t initially know whether it needs to do anything. To determine this, it grabs a deployment mapping file from a well-known location in the version control store (6).

The deployment mapping file lists each of the servers and what that server should do when a build is transitioned from one quality state to another. What happens is encapsulated in a PowerShell script file (*.ps1), and once that script file is identified, TFS Deployer instantiates a PowerShell host environment and executes the script (7).


(diagram: TFSDeployerDiagram.gif)

Tuesday, June 21, 2011

Integrating Microsoft Dynamics AX with Team Foundation Server

If you want to know how to set up version control for Microsoft Dynamics AX 2009 using Microsoft Visual Studio Team Foundation Server, I recommend having a look at the following white paper: http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=9915.

This white paper contains procedures to install and configure version control for Microsoft Dynamics AX 2009 using Microsoft Visual Studio Team Foundation Server. This paper is designed to be used in conjunction with the Team Server (ID Server) Setup white paper for Microsoft Dynamics AX 2009 and the Team Foundation Installation Guide for Visual Studio Team System.

Monday, June 20, 2011

Silverlight vs HTML 5

For the last few months, a discussion has been going on about which is the best technology for the future: Silverlight or HTML5. This discussion was further fueled by the recent announcements about Windows 8 focusing on HTML5 and JavaScript. I don’t want to start the discussion again, but if you have to make a decision today, this is a post that can help you compare both technologies:

(Silverlight vs. HTML5 comparison chart)

Friday, June 17, 2011

Generate an MDX query from Microsoft Excel

Writing your own MDX queries can be a painful experience. So most of the time I cheat.

I’ll use Excel to connect to the Analysis Services server and use PivotTables to drag and drop the expected result together. But how can you get from this PivotTable to the corresponding MDX query that is used underneath?

One way to do this is through the OLAP PivotTable Extensions:

OLAP PivotTable Extensions is an Excel 2007 and Excel 2010 add-in which extends the functionality of PivotTables on Analysis Services cubes. The Excel API has certain PivotTable functionality which is not exposed in the UI. OLAP PivotTable Extensions provides an interface for some of this functionality. It also adds some new features like searching cubes, configuring default settings, and filtering to a list in your clipboard.

A screenshot:

(screenshot: AvgTaxMDX.png)

Thursday, June 16, 2011

Run MDX queries from .NET

SQL Server Analysis Services offers a powerful query language called MDX on top of your data warehouse. But how do you use these MDX queries inside your .NET application?

You cannot use the normal ADO.NET classes; instead you have to use Microsoft.AnalysisServices.AdomdClient.dll.

You can download the file from here: http://www.microsoft.com/downloads/details.aspx?FamilyId=228DE03F-3B5A-428A-923F-58A033D316E1&displaylang=en

The code itself looks very similar to regular ADO.NET:

// Requires a reference to Microsoft.AnalysisServices.AdomdClient.dll and the
// namespaces Microsoft.AnalysisServices.AdomdClient, System.Data and System.Text.
using (AdomdConnection conn = new AdomdConnection("Data Source=tfsDB;Initial Catalog=Tfs_Analysis; MDX Compatibility=1;"))
{
    conn.Open();

    // Build the MDX query: active Requirements/Change Requests over the last 4 weeks.
    var mdxQuery = new StringBuilder();
    mdxQuery.Append("WITH ");
    mdxQuery.Append("SET [Last 4 weeks] as Filter([Date].[Date].[Date], [Date].[Date].CurrentMember.Member_Value < Now() AND [Date].[Date].CurrentMember.Member_Value >= DateAdd(\"d\", - 28, Now())) ");
    mdxQuery.Append("SELECT NON EMPTY Hierarchize(AddCalculatedMembers({DrilldownLevel({[Work Item].[System_WorkItemType].[All]})})) DIMENSION PROPERTIES PARENT_UNIQUE_NAME,HIERARCHY_UNIQUE_NAME ON COLUMNS , NON EMPTY {Hierarchize(Distinct({[Last 4 weeks]}))} DIMENSION PROPERTIES PARENT_UNIQUE_NAME,HIERARCHY_UNIQUE_NAME ON ROWS ");
    mdxQuery.Append("FROM (SELECT ({[Work Item].[System_WorkItemType].&[Requirement], [Work Item].[System_WorkItemType].&[Change Request]}) ");
    mdxQuery.Append("ON COLUMNS FROM [Team System]) WHERE ([Work Item].[Iteration Hierarchy].[All],[Test Case].[System_WorkItemType].[All],[Work Item].[System_State].&[Active],[Measures].[Work Item Count]) ");

    using (AdomdCommand cmd = new AdomdCommand(mdxQuery.ToString(), conn))
    {
        // Load the result into a DataTable; constraints are disabled because an MDX
        // cellset doesn't map onto a relational schema.
        DataSet ds = new DataSet();
        ds.EnforceConstraints = false;
        ds.Tables.Add();
        DataTable dt = ds.Tables[0];
        dt.Load(cmd.ExecuteReader());
        return dt;
    }
}

Wednesday, June 15, 2011

C:\fakepath in Microsoft Test Manager

After recording a test using Microsoft Test Manager, I noticed that the test failed when I ran it again. I could trace the root cause of the issue to a file upload I did during the test. It seems that when executing a test case with Microsoft Test Runner that includes a step where you have to select a file in a web application, you get “C:\fakepath” as the location of the file you selected.

In my case I had to upload a document from the local disk. When I replayed this action, I got the C:\fakepath folder instead. Luckily, a solution is available.

This fakepath comes from Internet Explorer and is a security feature that hides the real path of the selected file. The workaround to get the action recording up and running is to add the site to the Trusted Sites list in the Security tab of the Internet Explorer options.

Tuesday, June 14, 2011

Azure Throughput Analyzer

When hosting a solution on Windows Azure, you can choose between multiple data centers around the world. Do you want to get an indication of which data center gives you the highest throughput?

The Microsoft Research eXtreme Computing Group cloud-research engagement team released a desktop utility that measures the upload and download throughput achievable from your on-premise client machine to Azure cloud storage (blobs, tables and queues).

You simply install this tool on your on-premise machine, select a data center for the evaluation, and enter the account details of any storage service created within it. The utility will perform a series of data-upload and -download tests using sample data and collect measurements of throughput, which are displayed at the end of the test, along with other statistics.

Download the tool here: http://research.microsoft.com/en-us/downloads/5c8189b9-53aa-4d6a-a086-013d927e15a7/default.aspx

Monday, June 13, 2011

ALM Practices every developer should know about

After watching one of Dennis Doomen’s sessions at NDC, I used the power of Google to find his blog. I immediately noticed the following interesting blog series: ALM Practices every developer should know about.

Although he doesn’t bring anything new to the table, he puts all of this information together nicely. Have a look at the different parts:

Saturday, June 11, 2011

Back from NDC 2011


I’m just back from NDC 2011 and again it was a fantastic conference. With great speakers like Scott Guthrie, Robert C. Martin, Douglas Crockford, my very own colleague Gill Cleeren and many more… the quality of the content was excellent. I learned a lot and went home with a bag of new ideas.

For the people who couldn’t be there, all sessions are recorded and will be put online. I’ll post a link once they are available.

A big thumbs up to the organizers. I’ve already reserved NDC 2012 in my agenda.

Friday, June 10, 2011

New Windows Azure Pricing Calculator

As you have a lot of different options, calculating the cost of your Windows Azure solution can seem like a complex task. To help you select the right Windows Azure platform offer and estimate your monthly costs, Microsoft launched a new pricing calculator.

 

The pricing calculator lets you pick compute, database, storage, bandwidth, CDN and Service Bus capacity based on your needs. Along with predicting your expected monthly costs, the pricing calculator then recommends the most cost effective offer for you to purchase Windows Azure platform services.

I especially like the support for multiple currencies and the recommendation it does from the long list of offers.

You can access the pricing calculator here: http://www.microsoft.com/windowsazure/pricing-calculator/.

Thursday, June 9, 2011

Entity Framework Connection Strings

One of the annoying things in Entity Framework is that you have to pass an Entity Framework connection string instead of a normal connection string. By default such a connection string is created for you if you are using the Entity Framework Designer. However, if you don’t use the designer or start changing things, it’s easy to get into trouble.

Most of the time I end up building the connection string from code:

private static string CreateConnectionString()
{
    // Build the inner SQL Server connection string.
    SqlConnectionStringBuilder sqlBuilder = new SqlConnectionStringBuilder();
    sqlBuilder.MultipleActiveResultSets = true;
    sqlBuilder.DataSource = "dbserver";
    sqlBuilder.InitialCatalog = "db";
    sqlBuilder.UserID = "dbuser";
    sqlBuilder.Password = "dbpassword";

    // Wrap it in an Entity Framework connection string.
    EntityConnectionStringBuilder entityBuilder = new EntityConnectionStringBuilder();
    entityBuilder.ProviderConnectionString = sqlBuilder.ToString();
    entityBuilder.Metadata = "res://*/";
    entityBuilder.Provider = "System.Data.SqlClient";

    return entityBuilder.ToString();
}
Metadata

One important thing to notice here is the metadata parameter. This parameter tells the Entity Framework where to find your EDMX at runtime. When your application is compiled, the EDMX is split into three parts: CSDL, MSL, and SSDL. The EDMX can be supplied to the application as embedded resources or files on disk.

I’m specifying the metadata by using a *.  This is the simplest approach to a connection string. It will probably fail if your resources don’t happen to have the same name as your model, or if the assembly doesn’t happen to be loaded.
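If you want to be explicit, you can spell out the three resources (and the assembly that contains them) instead of using the wildcard. A hedged sketch, reusing the entityBuilder from the method above and assuming a model named "MyModel" embedded in an assembly called "MyAssembly":

// Hypothetical fully-specified metadata value: the CSDL, SSDL and MSL parts of "MyModel"
// are embedded resources in the assembly "MyAssembly".
entityBuilder.Metadata = "res://MyAssembly/MyModel.csdl|res://MyAssembly/MyModel.ssdl|res://MyAssembly/MyModel.msl";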

If you want to know more about this metadata attribute and the values it expects, I recommend reading the following blog post: http://blogs.teamb.com/craigstuntz/2010/08/13/38628/

Wednesday, June 8, 2011

Test Attachment Cleaner

After starting to use the Microsoft Test Manager tools in Visual Studio 2010, I noticed that our databases started growing a lot faster. The reason for this is that the execution of a Test Run in Team Foundation Server 2010  generates a bunch of diagnostic data, for example, IntelliTrace logs (.iTrace), Video log files, Test Result (.trx) and Code Coverage (.cov) files. The downside of these rich diagnostic data captures is that the volume of the diagnostic data, over a period of time, can grow at a rapid pace. The Team Foundation Server administrator has little or no control over what data gets attached as part of Test Runs. There are no policy settings to limit the size of the data capture and there is no retention policy to determine how long to hold this data before initiating a cleanup.

A few months ago, the Test Attachment Cleaner for Visual Studio Ultimate 2010 & Test Professional 2010 power tool was released which addresses these shortcomings in a command-line tool.

Using the tool is a two-step process.

First you have to create a settings file:

<!-- View/Delete all attachments on test runs older than 6 months, that are not linked to active bugs -->
<DeletionCriteria>
  <TestRun>
    <AgeInDays OlderThan="180" />
  </TestRun>
  <Attachment />
  <LinkedBugs>
    <Exclude state="Active" />
  </LinkedBugs>
</DeletionCriteria>

Then you can execute the Test Attachment Cleaner with the following command:

MPT.exe attachmentCleanup /collection:http://localhost:8080/tfs/CollectionName /teamproject:TeamProjectName /settingsFile:OlderThan6Months.xml /mode:delete

More information here: http://blogs.msdn.com/b/granth/archive/2011/02/12/tfs2010-test-attachment-cleaner-and-why-you-should-be-using-it.aspx

Tuesday, June 7, 2011

Sharing Windows Azure Drives

One question I get a lot from customers is how to share a Windows Azure drive with read-write access among multiple role instances. At first you would think this is possible, but it is not the case. A Windows Azure drive does not behave like a normal drive. It uses the concept of leasing, meaning that only one role instance can have write access to a drive. Drives can only be shared between multiple role instances as read-only snapshots. This of course limits the usability of drives in our cloud applications.

However, the Windows Azure Storage team created a blog post explaining an alternative solution that still allows you to share a drive with read-write access. In this solution they use SMB (Server Message Block), the same technology used to implement the Shared Folders/Printers/… functionality in Windows.

Monday, June 6, 2011

Could not load file or assembly ‘XamlServices’

Last week I did some tests with the new Silverlight 5 functionality. Afterwards I reverted the solution back to Silverlight 4. But from that moment on, compilation started to fail with the following error messages:

(screenshot of the build errors)

The strange thing was that all the necessary references were available. After trying a lot of things, I was finally able to solve the problem by removing the user options file (*.suo) from the solution directory. Visual Studio recreated the file the next time I opened the solution and everything was working again.

Saturday, June 4, 2011

Exporting Fiddler Web Recording to a Visual Studio Web Test

Fiddler is a required tool in the toolset of every web developer. It allows you to examine and work with HTTP requests. Although I’ve been using Fiddler for a long time, I still discover new features every day. As described here, one of the features I didn’t know about is that you can export a Fiddler session as a Visual Studio Web Test (or Web Performance Test).

So how do you achieve this?

  • Open up the site you want to record and make a few requests.
  • Go to File – Export Sessions – All Sessions.
  • Select the Visual Studio WebTest export format.
  • Save the file.
  • Open a Visual Studio Test Project.
  • Choose Add Existing Item and add the file you just saved.

Friday, June 3, 2011

OData improvements for Windows Phone 7

Last week the Windows Phone team announced the beta release of Mango. It contains a long list of exciting new features, but the ones I was especially interested in are the OData improvements. As we use OData for our TFS Monitor application, I’m happy to see how much easier my life will become. Kind of sad that I have to wait until the fall before I can add this functionality to my application.


Oh, before I forget, here is the list with the upcoming OData improvements:

  • OData Client Library for Windows Phone Is Included in the Windows Phone SDK
    In this release, the OData client library for Windows Phone is included in the Windows Phone Developer SDK. It no longer requires a separate download.
  • Add Service Reference Integration
    You can now generate the client data classes used to access an OData service simply by using the Add Service Reference tool in Visual Studio. For more information, see How to: Consume an OData Service for Windows Phone.
  • Support for LINQ
    You can now compose Language Integrated Query (LINQ) queries to access OData resources instead of having to determine and compose the URI of the resource yourself. (URIs are still supported for queries.)
  • Tombstoning Improvements
    New methods have been added to the DataServiceState class that improve performance and functionality when storing client state. You can now serialize nested binding collections as well as any media resource streams that have not yet been sent to the data service.
  • Client Authentication
    You can now provide credentials, in the form of a login and password, that are used by the client to authenticate to an OData service.

For more information on these new OData library behaviors, see the topic Open Data Protocol (OData) Overview for Windows Phone in the Mango beta release documentation.
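To give an idea of what the LINQ support looks like in practice, here is a hedged sketch against a hypothetical OData service. The context and entity types below are made-up stand-ins for the classes that Add Service Reference would normally generate:

using System;
using System.Data.Services.Client;
using System.Linq;

// Hypothetical types, normally generated by Add Service Reference.
public class Product
{
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class MyODataContext : DataServiceContext
{
    public MyODataContext(Uri serviceRoot) : base(serviceRoot) { }

    public DataServiceQuery<Product> Products
    {
        get { return CreateQuery<Product>("Products"); }
    }
}

public static class ODataLinqSample
{
    public static void LoadCheapProducts(MyODataContext context, DataServiceCollection<Product> results)
    {
        // Compose the query with LINQ instead of building the request URI by hand.
        var query = from p in context.Products
                    where p.Price < 10
                    orderby p.Name
                    select p;

        // On Windows Phone, queries execute asynchronously; LoadCompleted fires when the data arrives.
        results.LoadCompleted += (s, e) =>
        {
            if (e.Error == null)
            {
                // results now contains the loaded products and supports data binding.
            }
        };
        results.LoadAsync(query);
    }
}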

Thursday, June 2, 2011

WebActivator: How to control the order of execution?

One of the libraries we use during web development is WebActivator. WebActivator was introduced with NuGet to solve the following problem. 

With the current NuGet version it is not possible to alter existing code. This means that if you have a NuGet package that requires some extra code to be executed (for example in the global.asax), you need to add this code yourself, breaking the nice experience that NuGet gives you. To solve this issue, David Ebbo introduced WebActivator. This library allows you to add startup code in separate files instead of having to add it to your global.asax, overcoming the current limitation of NuGet.

To use this library, create an App_Start folder, add a class file to this folder and include the following code:

using System;

[assembly: WebActivator.PreApplicationStartMethod(typeof(App_Start.MySuperPackage), "PreStart")]

namespace App_Start
{
    public static class MySuperPackage
    {
        public static void PreStart()
        {
            // Add your start logic here
        }
    }
}

And that's it! Now your code will be called early on when the app starts.

After adding a few of these start classes, we hit a problem. Some of them had dependencies on each other, meaning that the order of execution mattered. It took us some time to discover the following feature:
Ability to run code after Application_Start

When you use a WebActivator PreApplicationStartMethod attribute, the method it points to runs before your global.asax’s Application_Start. There is an alternative attribute that runs after the very first HttpModule gets initialized, called PostApplicationStartMethod!

[assembly: WebActivator.PostApplicationStartMethod(typeof(App_Start.MyStartupCode), "CallMeAfterAppStart")]

After changing some of the classes to use the PostApplicationStartMethod attribute, everything worked.

Wednesday, June 1, 2011

Maybe it’s time to learn functional programming

After 5 years of object-oriented thinking, I’m trying to wrap my head around the functional style of programming. One of the things I had to read over and over before I started to understand it was the concept of monads. Monads are everywhere in .NET now; they are a foundational concept in many of the really useful libraries and APIs coming out of Microsoft (LINQ, the Task Parallel Library (TPL), Reactive Extensions (Rx), etc.). One of the simplest monads is the Maybe monad.

To help you understand this, I took the Maybe<T> implementation from this post by Jordan Terrell. It works with all .NET types, both value and reference types.

Let’s have a look at the following simple sample:

Maybe<string> text = Maybe.Value("Hello, World!");
if (text.HasValue)
{
    Console.WriteLine(text.Value);
}
text = Maybe<string>.NoValue;

At first this seems fairly similar to Nullable<T>, with the only difference being that it also works for reference types… and you’re right. But using the Maybe monad can lead to a far more elegant solution. One of the advantages of the Maybe monad is that you will never get a null reference exception: it is implemented as a value type (struct) and as such cannot be null. Being able to throw away all the null reference checks throughout my code already makes it worthwhile.

Things get interesting the moment you start chaining methods together, like in LINQ:

Maybe<Category> parentCategory = Maybe.Value(product)
    .Select(x => x.Parent);

By using the Maybe monad we can just take the value, if there is one, and select another value from it. No null checking and no nasty null reference exceptions if the product object doesn’t exist.
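If you’re curious what’s underneath, here is a minimal sketch of how such a Maybe<T> and its Select method could look. This is illustrative only and not Jordan Terrell’s actual implementation; the key point is that wrapping null yields NoValue, which is what makes the chaining safe:

using System;

// Minimal, illustrative Maybe<T>: a struct, so the wrapper itself can never be null.
public struct Maybe<T>
{
    public static readonly Maybe<T> NoValue = new Maybe<T>();

    private readonly T value;
    private readonly bool hasValue;

    private Maybe(T value)
    {
        this.value = value;
        this.hasValue = true;
    }

    public bool HasValue { get { return hasValue; } }

    public T Value
    {
        get
        {
            if (!hasValue) throw new InvalidOperationException("No value present.");
            return value;
        }
    }

    public static Maybe<T> FromValue(T value)
    {
        // Wrapping null gives NoValue, so later projections never dereference null.
        return value == null ? NoValue : new Maybe<T>(value);
    }
}

public static class Maybe
{
    public static Maybe<T> Value<T>(T value)
    {
        return Maybe<T>.FromValue(value);
    }

    // Select runs the projection only when a value is present; otherwise NoValue propagates.
    public static Maybe<TResult> Select<T, TResult>(this Maybe<T> source, Func<T, TResult> selector)
    {
        return source.HasValue ? Maybe.Value(selector(source.Value)) : Maybe<TResult>.NoValue;
    }
}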

What’s important to understand? The Maybe monad is a very clean and elegant implementation of the Null object pattern.