Wednesday, December 29, 2010

TF266044: One or more machines are not ready to run workflows.

When configuring a new lab environment for a client, I encountered the following error:

Environment message: Type=Error; Message=TF266044: One or more machines are not ready to run workflows. For more information, see the individual machine errors.;

Machine messages:

Machine name: XXXXXX

Machine message: Type=Error; Message=Error occurred while configuring TFSBuildServiceHost with Lab configuration. ExceptionType:Microsoft.TeamFoundation.Build.Client.BuildServiceHostAlreadyExistsException. ExceptionMessage: A build service host already exists for computer Specify a different computer name and try again.

I found the solution in the following blog post. Although our TFS environment wasn’t upgraded from the Beta 2, the fix still works if you encounter this error message.

Tuesday, December 28, 2010

Export mapping files generated by Fluent NHibernate

If you are using NHibernate, you have probably also heard about Fluent NHibernate, which allows you to create your object-relational mapping from a mix of conventions and code instead of error-prone XML files. Although you’re using a completely different API, under the hood the same old XML files still exist. Sometimes it is handy to view the NHibernate mapping files generated by Fluent NHibernate. It turns out that this is quite easy: just add a call to the ExportTo() method and the mappings will be written to the specified location.

private static ISessionFactory CreateSessionFactory()
{
    // Export the generated hbm.xml mapping files to a "Mappings" subdirectory
    string outputDir = Path.Combine(Environment.CurrentDirectory, "Mappings");

    return Fluently.Configure()
        .Database(SQLiteConfiguration.Standard.InMemory().ShowSql())
        .Mappings(m => m.FluentMappings.AddFromAssemblyOf<SessionFactory>()
            .ExportTo(outputDir))
        .BuildSessionFactory();
}

Remark: Make sure that the directory where you want to output the generated mapping files exists.
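A minimal sketch of how you could guarantee that up front (the "Mappings" folder name simply mirrors the snippet above):

```csharp
using System;
using System.IO;

class ExportDirectorySetup
{
    static void Main()
    {
        // Create the export directory before configuring the session factory,
        // so ExportTo() has a valid target. CreateDirectory is a no-op if
        // the directory already exists.
        string outputDir = Path.Combine(Environment.CurrentDirectory, "Mappings");
        Directory.CreateDirectory(outputDir);

        Console.WriteLine(Directory.Exists(outputDir)); // True
    }
}
```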

Monday, December 27, 2010

Distributing check-in policies across your team

One of the nicest features of Team Foundation Server is check-in policies. They allow you to validate specific rules before each check-in. What makes this even nicer is that you can easily write your own check-in policies. However, there is one problem: the check-in policy (read: DLL) needs to be installed on each developer machine. This is important because if developers do not install the check-in policies, they will not run during a check-in.

Before you start inventing the most complex deployment strategies to get the latest version of these policies on the developer machines, note that there is already a feature available, included in the TFS power tools.

After installing the Power Tools, a new node called Team Members is added in Team Explorer. Right-click the Team Members node and choose Personal Settings.

From there you can see all the options for the collaboration features and one interesting option called Install downloaded custom components.

This extension checks source control for a path called $projectname/TeamProjectConfig/CheckinPolicies and, inside that folder, looks for 1.0, 2.0, or 3.0 subfolders for VS2005, VS2008, and VS2010 add-ins respectively. Every DLL found in those directories is automatically downloaded to the right location and made available on the client computers.

Remark: You can also include DLLs containing custom controls for work item editing; these should be placed in a folder called $projectname/TeamProjectConfig/CustomControls.
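Putting the pieces together, the expected layout in source control looks roughly like this (the folder names come from the paths above; the DLL name is just a placeholder):

```
$projectname/TeamProjectConfig
├── CheckinPolicies
│   ├── 1.0    (Visual Studio 2005 policies)
│   ├── 2.0    (Visual Studio 2008 policies)
│   └── 3.0    (Visual Studio 2010 policies, e.g. MyCheckinPolicy.dll)
└── CustomControls (custom work item controls)
```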

Sunday, December 26, 2010

Impress your colleagues with your knowledge about… PDB files

Most developers know that PDB files help you in some way with debugging, but that's about it. They are a dark art for most developers, completely understood by only a few evil magicians. Let me help you understand what PDB files are and how they can make your debugging experience a lot easier. First, read the following 3 important rules and never forget them!

Rule 1 – PDB files are as important as source code

First and foremost, PDB files are as important as source code! Debugging a problem on a production server without the matching PDB files for the deployed build can cost you tons of money. Without the matching PDB files, you have made your debugging challenge nearly impossible.

Rule 2 – As a development shop, you should have a Symbol Server

At a minimum, every development shop should set up a Symbol Server. A Symbol Server stores the PDBs and binaries for all your public builds. That way, no matter which build someone reports a crash or problem against, you have the exact matching PDB file for that public build, and the debugger can access it. Both Visual Studio and WinDBG know how to access Symbol Servers, and if the binary is from a public build, the debugger will get the matching PDB file automatically.

Rule 3 – A Source Server is a Symbol Server’s best friend

A Symbol Server is not that useful without one extra step. That step is to run the Source Server tools across your public PDB files, which is called source indexing. The indexing embeds the version control commands to pull the exact source file used in that particular public build. Thus, when you are debugging that public build, you never have to worry about finding the source files for that build. If you are using TFS 2010, the build server has the build tasks for source indexing and Symbol Server copying available out of the box as part of your build.

Now that you know these 3 rules, let’s have a look at the PDB file itself. A .NET PDB contains only two pieces of information: the source file names with their line numbers, and the local variable names. All the other information is already in the .NET metadata, so there is no need to duplicate it in the PDB file.

When you load a module into the process address space, the debugger uses two pieces of information to find the matching PDB file. The first is obviously the name of the file: if you load ABC.DLL, the debugger looks for ABC.PDB. The extremely important part is how the debugger knows this is the exact matching PDB file for this binary. That's done through a GUID that's embedded in both the PDB file and the binary. If the GUIDs do not match, no debugging at source code level is possible.

With the knowledge of how the debugger determines the correctly matching PDB file, the last question that remains is where the debugger looks for the PDB files. You can watch this search order yourself in the Symbol File column of the Visual Studio Modules window while debugging. The first place searched is the directory where the binary was loaded. If the PDB file is not there, the second place the debugger looks is the hard-coded build directory embedded in the Debug Directories in the PE file. If the PDB file is not in the first two locations, and a Symbol Server is set up on the machine, the debugger looks in the Symbol Server cache directory. Finally, if the debugger does not find the PDB file in the Symbol Server cache directory, it looks in the Symbol Server itself.
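If you want to point the debugger at a Symbol Server yourself, the conventional way is the standard symbol path syntax, for example via the _NT_SYMBOL_PATH environment variable (the C:\Symbols cache directory below is just an example location):

```
_NT_SYMBOL_PATH=srv*C:\Symbols*http://msdl.microsoft.com/download/symbols
```

The debugger first checks the local C:\Symbols cache and only then downloads missing PDBs from the Microsoft public Symbol Server.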

I hope this information helped you understand PDB files, and that you will start using them to their full potential.

Friday, December 24, 2010

WCF vNext: Linq to WCF

One of the great new features coming to WCF vNext is the ability to expose a service through an IQueryable interface. This introduces the rich query model of OData to your own WCF services. How does this work?

Making the service queryable

On the server side, your service operation should return an IQueryable<T>. Annotate the operation with the new [QueryComposition] attribute. Once you do that, your service becomes queryable using the OData URI format.

[WebGet(UriTemplate = "")]
[QueryComposition]
public IQueryable<Customer> Get()
{
    return customers.AsQueryable();
}

The Get method above returns an IQueryable of customers. With query composition enabled, the host will now accept requests like “http://localhost/customers?$filter=Country%20eq%20'Belgium'”, which says “find me all the customers from Belgium”.
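The other standard OData query options work the same way against the Customer resource above; a few illustrative URIs (shown with unencoded spaces for readability — on the wire they become %20):

```
http://localhost/customers?$filter=Country eq 'Belgium'
http://localhost/customers?$orderby=CustomerName
http://localhost/customers?$top=10&$skip=20
```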

Querying the service, LINQ to WCF

On the client side, Microsoft added a CreateQuery<T> extension method which you can use with the new HttpClient to create a WebQuery<T>. Once you have that query, you can apply a Where or an OrderBy. When you start to iterate through the result, the client will automatically issue a GET request to the server using the correct URI based on the filter. The results come back properly ordered and filtered based on your query.

Below is a snippet that shows querying our previously created Customer resource:

public IEnumerable<Customer> GetBelgianCustomers()
{
    var address = "http://localhost/customers";
    var client = new HttpClient(address);
    var customers = client.CreateQuery<Customer>();

    // Enumerating this query triggers the GET request with the matching OData URI
    return customers
        .Where(c => c.Country == "Belgium")
        .OrderBy(c => c.CustomerName);
}

Thursday, December 23, 2010

Boost the performance of your ASP.NET and WCF applications

By default, both IIS and WCF are somewhat restrictive in their settings. Executing a large number of concurrent calls will not have much effect, as by default only 2 concurrent connections per IP are allowed.

However, there are some simple configuration changes you can make in machine.config and IIS to give your web applications a significant performance boost. These are simple, harmless changes but they make a lot of difference in terms of scalability.

To learn how to do this, read this article “Quick ways to boost performance and scalability of ASP.NET, WCF and Desktop Clients” written by Omar Al Zabir.
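One of the tweaks covered there is raising the outbound connection limit mentioned above via the connectionManagement section in machine.config (the value 100 is just an example; tune it for your load):

```xml
<configuration>
  <system.net>
    <connectionManagement>
      <!-- Allow more than the default 2 concurrent outbound connections per IP -->
      <add address="*" maxconnection="100" />
    </connectionManagement>
  </system.net>
</configuration>
```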

Wednesday, December 22, 2010

ThreadStatic and ThreadLocal<T>

For a long time I was using the ThreadStatic attribute to make the value of a static field local to a thread (i.e. each thread holds an independent copy of the field). Although this did the trick for a long time, the ThreadStatic attribute has some disadvantages:

  • the ThreadStatic attribute doesn’t work with instance fields; it compiles and runs but does nothing
  • fields always start with the default value

With the release of .NET 4, Microsoft introduced a new class specifically for thread-local storage of data – the ThreadLocal<T> class:

ThreadLocal<int> _localField = new ThreadLocal<int>(() => 1);

So why should you choose the ThreadLocal<T> class?

  • thanks to the use of a factory function, the values are lazily evaluated; the factory function only executes on the first call for each thread
  • you have more control over the initialization of the field and you are able to initialize it with a non-default value
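A small sketch showing both advantages: each thread gets its own lazily initialized copy, starting at the non-default value 1 supplied by the factory:

```csharp
using System;
using System.Threading;

class ThreadLocalDemo
{
    // The factory runs once per thread, on that thread's first access of Value
    private static ThreadLocal<int> _counter = new ThreadLocal<int>(() => 1);

    static void Main()
    {
        Thread worker = new Thread(() =>
        {
            // Worker's copy: factory yields 1, then we add 10
            _counter.Value += 10;
            Console.WriteLine("Worker: " + _counter.Value);
        });
        worker.Start();
        worker.Join();

        // The main thread's copy is untouched by the worker thread
        Console.WriteLine("Main: " + _counter.Value);
    }
}
```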

Tuesday, December 21, 2010

How to be a bad programmer?

People always learn the most from their mistakes. So when talking about what defines a good programmer versus what defines a bad programmer, it’s sometimes interesting and easier to discuss what makes someone a bad programmer instead of a good one.

Giorgio Sironi wrote a great article about “How to be a worse programmer?”

A must read!


Monday, December 20, 2010

Silverlight 5: Microsoft’s answer to the “Silverlight is dead” discussion

The last few weeks, there was a lot of buzz around the future of Silverlight. Although there were some official comments, the rumors kept going. For the remaining skeptics, what better answer could there be than the announcement of Silverlight 5?

At the Silverlight FireStarter event, Microsoft announced the timeline for Silverlight 5 in 2011. Silverlight 5 was the main subject of Scott Guthrie’s keynote, where Microsoft demoed many of the coming new features and capabilities. Silverlight 5 will be in beta in the first half of 2011 and ship early in the second half of 2011.

Some of the impressive improvements (note especially XAML debugging):

Silverlight 5 Media improvements:

  • Hardware Decode & Presentation of H.264 performance improvements using GPU support
  • Trickplay with fast-forward and rewind support w/normal audio pitch
  • Improved power awareness
  • Remote-control support
  • Digital Rights Management advancements

Application Development improvements:

  • Smoother UI experiences with smoother animation
  • Text improvements
    • Multi-column text & linked container text
    • Text clarity improved
    • OpenType support enhanced
  • Support for Postscript vector printing
  • Added support for double-click and combobox
  • MVVM and Databinding enhancements
  • Networking and WCF enhancements
    • Reduced network latency using a background thread
    • WS-Trust support
  • Performance Improvements
    • XAML parser improvements
    • Support for 64-bit OSes
  • Graphics Improvements
    • GPU API
    • Direct rendering on GPU
    • Hardware acceleration on Internet Explorer 9
  • New class of trusted applications
    • Host HTML content as a browser control
    • Read/write access to the user’s My Documents folder
    • Launch Microsoft Office and other programs
    • Ability to call into application COM components gaining access to system capabilities and devices
    • Full keyboard support in full screen
    • Call unmanaged code with PInvoke
  • Out-of-browser trusted applications enhancements
    • Call unmanaged code with PInvoke
    • Child Windows support
  • Tool improvements
    • Visual Studio profiling support for CPU, memory, thread contention
    • Visual Studio Team Test support

Sunday, December 19, 2010

Hostname can't support more than 1 level subdomain.

Last week I wanted to test some new Windows Azure Service Bus functionality. So I started by creating a simple WCF service to host in the cloud. After configuring my service settings in the web.config, I started the service and was confronted with the following error message:

Hostname can't support more than 1 level subdomain.

It took me some time to figure out the root cause of the problem. I had created my namespace on the production portal. After checking with Fiddler what was going on, I realized that although I was using the appfabriclabs environment, the authentication was still passing through the production environment, with the error message above as a consequence.

After creating a service namespace through the AppFabric Labs portal instead, the application ran successfully.

Saturday, December 18, 2010

Using XML namespaces in WPF

When referencing controls from another assembly in XAML, you probably use the xmlns:myAlias="clr-namespace:MyNamespace;assembly=MyAssembly" syntax.

Last week I discovered you also have another option: you can use a URI instead of a namespace reference, thanks to the XmlnsDefinition attribute. (Read more about this attribute on MSDN.) It allows you to map a XAML namespace to one or more assembly namespaces.

So how do you use it?

  1. Open the AssemblyInfo.cs file under the Properties folder of your project.
  2. Add the following line for each namespace in your assembly you want to map:
[assembly: AssemblyTitle("WPF Namespace Sample")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("WPF Namespace Sample")]
[assembly: AssemblyCopyright("Copyright © 2010")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
[assembly: XmlnsDefinition("http://myApp/schemas/2010/xaml", "WPF.Samples.Controls")]
[assembly: XmlnsDefinition("http://myApp/schemas/2010/xaml", "WPF.Samples.Commands")]

You can use this reference then inside another project:

<UserControl
    xmlns=""
    xmlns:x=""
    xmlns:wpfSample="http://myApp/schemas/2010/xaml">

Friday, December 17, 2010

Build times for TFS Team Build

After creating an application to monitor builds across multiple team projects, the customer came back with a second request. Because they had the feeling that some builds took a long time to complete, they asked us to update the application to include build timings.

A colleague took my application and extended it with some extra code. Full code below:

class Program
{
    static void Main(string[] args)
    {
        // The url to the tfs server
        Uri tfsUri = new Uri("<TFS URL>");
        TfsTeamProjectCollection tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(tfsUri);
        IBuildServer bs = tfs.GetService<IBuildServer>();
        WriteQueuedBuilds(bs);
        Console.ReadLine();
    }

    private static void WriteQueuedBuilds(IBuildServer bs)
    {
        // Query all queued builds (all projects, all definitions)
        // completed within the last 25 days
        IQueuedBuildSpec qbSpec = bs.CreateBuildQueueSpec("*", "*");
        qbSpec.CompletedWindow = TimeSpan.FromDays(25);
        IQueuedBuildQueryResult qbResults = bs.QueryQueuedBuilds(qbSpec);

        Console.WriteLine("Queued Builds");
        Debug.WriteLine("Queued Builds");

        foreach (IQueuedBuild qb in qbResults.QueuedBuilds.OrderByDescending(a => a.QueueTime))
        {
            string status = qb.Status.ToString();
            string def;
            if (qb.BuildDefinition != null)
                def = qb.TeamProject + @"\" + qb.BuildDefinition.Name;
            else if (qb.Build != null)
                def = qb.TeamProject + @"\" + qb.Build.BuildDefinition.Name;
            else
                def = qb.TeamProject + @"\<unknown>";

            string pri = qb.Priority.ToString();
            string datequeued = qb.QueueTime.ToString();
            string requestedBy = qb.RequestedBy;
            string buildDetails = string.Empty;
            string finishTime = string.Empty;
            string starttime = string.Empty;
            if (qb.Build != null)
            {
                if (qb.Build.BuildFinished)
                {
                    buildDetails = "finished " + qb.Build.FinishTime.Subtract(qb.QueueTime).TotalMinutes + " minutes after queue";
                    finishTime = qb.Build.FinishTime.ToString();
                }
                starttime = qb.Build.StartTime.ToString();
            }
            string controller = qb.BuildController.Name;

            if (qb.RequestedBy != qb.RequestedFor)
            {
                requestedBy = qb.RequestedBy + " (for " + qb.RequestedFor + ")";
            }

            Console.WriteLine("{0} {1} {2} {3} {4} {5} {6} {7} {8}", controller, status, def, pri, datequeued, requestedBy, starttime, finishTime, buildDetails);
            Debug.WriteLine("{0}\t{1}\t{2}\t{3}\t{4}\t{5}\t{6}\t{7}\t{8}", controller, status, def, pri, datequeued, requestedBy, starttime, finishTime, buildDetails);
        }
    }
}

Thursday, December 16, 2010

Windows Azure Platform 30 Day Pass

Microsoft launched a new offer to get started with Windows Azure: the Windows Azure platform 30 day pass. The great news is that no credit card is required. The only thing you need to do is follow the link below and use the promo code to get going:

The Windows Azure platform 30 day pass includes the following resources:

Windows Azure

  • 4 small compute instances
  • 3GB of storage
  • 250,000 storage transactions

SQL Azure

  • Two 1GB Web Edition databases

AppFabric

  • 100,000 Access Control transactions
  • 2 Service Bus connections
  • Data Transfers (per region)
    • 3 GB in
    • 3 GB out

Have a cloudy day!

Wednesday, December 15, 2010

Visual Studio 2010 SP1 Beta released

Last week Microsoft released a beta version of Visual Studio 2010 Service Pack 1. Although a lot of blog posts mentioned that the beta was released, I didn’t find much information about the exact content and features that will ship in Service Pack 1.

So here is an aggregation of some blog posts which together give you a full overview of all the goodness that’s coming:

Tuesday, December 14, 2010

Great Windows Phone 7 productivity story

As a .NET developer by day, choosing a Windows Phone seems more obvious than choosing an iPhone or Android phone. However, if you don’t use Microsoft products every day, the choice is a lot harder.

So if you are a developer planning to create applications on one of the mobile platforms, certainly read this great story comparing development productivity between iOS, Android, WP7 and the mobile web:

Definitely a great proof of the power of the Windows Phone 7 development experience!

Monday, December 13, 2010

New Visual Studio 2010 / TFS 2010 VPCs available

Most of the time when I have to do a demo about Team Foundation Server, I use our own TFS test environment. However, some clients want to see specific functionality, or don’t have Internet access available for me. In those cases I fall back to the standard TFS 2010 demo VPCs that Microsoft provides.

Last week Microsoft released a new version of these VPCs. This new version contains the latest feature packs, power tools, and Windows updates. This refreshed VM will stop working on June 1, 2011.

What’s new in this version?

  • Visual Studio 2010 Feature Pack 2
  • Team Foundation Server 2010 Power Tools (September 2010 Release)
  • Visual Studio 2010 Productivity Power Tools
  • Test Scribe for Microsoft Test Manager
  • Visual Studio Scrum 1.0 Process Template
  • All Windows Updates through December 8, 2010
  • Lab Management GDR (KB983578)
  • Visual Studio 2010 Feature Pack 2 pre-requisite hotfix (KB2403277)
  • Microsoft Test Manager hotfix (KB2387011)
  • Minor fit-and-finish fixes based on customer feedback

Please note that this VM does not include Visual Studio Lab Management 2010 capabilities. The Lab Management team has released a separate VHD which has this capability.

Download links:

Microsoft® Visual Studio® 2010 and Team Foundation Server® 2010 RTM virtual machine for Windows Virtual PC

Microsoft® Visual Studio® 2010 and Team Foundation Server® 2010 RTM virtual machine for Windows Server 2008 Hyper-V

Microsoft® Visual Studio® 2010 and Team Foundation Server® 2010 RTM virtual machine for Microsoft® Virtual PC 2007 SP1

Sunday, December 12, 2010

Prevent hanging build from blocking your TFS build server

Last week I had a question from a customer who complained that all builds on their build server were blocked because of one build that sometimes hangs. Why this build hangs is something for another blog post. In this post I’ll focus on how to prevent a hanging build from blocking your build server.


If you open up your build definition, go to the Process tab and expand the Advanced node, you’ll find the Agent Settings node. If you expand this node further, you can specify the following parameters:

Maximum Execution Time

Type a time span value in hh:mm:ss format. For example, if you specify a value of 04:30:15 and the build agent has not completed its work after 4 hours, 30 minutes, and 15 seconds, the build will fail with a time-out error. Specify a value of 00:00:00 if you want to give the build agent unlimited time to process the build. (This is the default, and it is the reason why a hanging build can keep blocking your build agent.)

Maximum Wait Time

Type a time span value in hh:mm:ss format. For example, if you specify a value of 01:30:45 and the build has not been assigned to a build agent after 1 hour, 30 minutes, and 45 seconds, the build will fail with a time-out error. Specify a value of 00:00:00 if you want to give the build controller unlimited time to find a build agent to process this build definition. (This is the default, so a queued build can wait indefinitely.)

Together, these 2 settings bound the total amount of time one specific build may take.

Saturday, December 11, 2010

Saving NHibernate objects with assigned id’s

One of the great features of NHibernate is that it manages persistence for us. You just attach an object to the session, and NHibernate figures out whether the object is new or changed. But how does NHibernate know the difference between a new and an existing object?

By default, it uses the value we assign to the unsaved-value attribute of the id mapping. This means that if the id of our object equals the unsaved-value, NHibernate detects the object as new and issues an INSERT statement. If the id value differs from the unsaved-value, NHibernate generates an UPDATE statement instead.


<hibernate-mapping default-cascade="none" xmlns="urn:nhibernate-mapping-2.2">
  <class name="Test.Data.Domain.Category, Test.Data" table="Categories" lazy="true">
    <id name="CategoryID" type="System.Int32" column="CategoryID" unsaved-value="0">
      <generator class="native" />
    </id>
  </class>
</hibernate-mapping>

Sounds easy, but what if you are using a composite key? In that case, using unsaved-value makes no sense. If we have a look at the documentation, NHibernate gives us a second option:

A version or timestamp property should never be null for a detached instance, so Hibernate will detect any instance with a null version or timestamp as transient, no matter what other unsaved-value strategies are specified. Declaring a nullable version or timestamp property is an easy way to avoid any problems with transitive reattachment in Hibernate, especially useful for people using assigned identifiers or composite keys!

So if you leave your version column empty, NHibernate will always detect the object as new.
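A sketch of what such a mapping could look like for an entity with an assigned id and a nullable version property (the class, table, and column names here are made up for illustration; in code the Version property would be declared as a nullable int):

```xml
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2">
  <class name="Test.Data.Domain.OrderLine, Test.Data" table="OrderLines">
    <!-- Assigned id: unsaved-value can't distinguish new from existing objects -->
    <id name="OrderLineID" column="OrderLineID" type="System.Int32">
      <generator class="assigned" />
    </id>
    <!-- Nullable version property: a null version marks the instance as new -->
    <version name="Version" column="Version" />
  </class>
</hibernate-mapping>
```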

Friday, December 10, 2010

Impress your colleagues with your knowledge about…the volatile keyword

Sometimes when working with C# you discover hidden gems. Some of them are very useful; for others it is a little harder to find a good way to benefit from their functionality. One of the hidden gems I discovered some time ago is the volatile keyword.

The volatile keyword indicates that a field might be modified by multiple threads that are executing at the same time. Fields that are declared volatile are not subject to compiler optimizations that assume access by a single thread. This ensures that the most up-to-date value is present in the field at all times.

The volatile modifier is usually used for a field that is accessed by multiple threads without using the lock statement to serialize access.

The following example demonstrates how an auxiliary or worker thread can be created and used to perform processing in parallel with that of the primary thread.


using System;
using System.Threading;

public class Worker
{
    // This method is called when the thread is started.
    public void DoWork()
    {
        while (!_shouldStop)
        {
            Console.WriteLine("Worker thread: working...");
        }
        Console.WriteLine("Worker thread: terminating gracefully.");
    }

    public void RequestStop()
    {
        _shouldStop = true;
    }

    // Keyword volatile is used as a hint to the compiler that this data
    // member is accessed by multiple threads.
    private volatile bool _shouldStop;
}

public class WorkerThreadExample
{
    static void Main()
    {
        // Create the worker thread object. This does not start the thread.
        Worker workerObject = new Worker();
        Thread workerThread = new Thread(workerObject.DoWork);

        // Start the worker thread.
        workerThread.Start();
        Console.WriteLine("Main thread: starting worker thread...");

        // Loop until the worker thread activates.
        while (!workerThread.IsAlive) ;

        // Put the main thread to sleep for 1 millisecond to
        // allow the worker thread to do some work.
        Thread.Sleep(1);

        // Request that the worker thread stop itself.
        workerObject.RequestStop();

        // Use the Thread.Join method to block the current thread
        // until the object's thread terminates.
        workerThread.Join();
        Console.WriteLine("Main thread: worker thread has terminated.");
    }

    // Sample output:
    // Main thread: starting worker thread...
    // Worker thread: working...
    // Worker thread: working...
    // Worker thread: working...
    // Worker thread: terminating gracefully.
    // Main thread: worker thread has terminated.
}

For more information:

Thursday, December 9, 2010

Software can not be manufactured

As developers we all agree with the title of this post. Still, a lot of desperate managers and business owners keep pretending that software development is a manufacturing process at heart.


Requirements specifications are created by analysts; architects turn these specifications into a high-level technical vision. Designers fill out the architecture with detailed design documentation, which is handed to robot-like coders, who sleepily type in the design’s implementation. Finally, the quality inspector receives the completed code, which doesn’t receive her stamp of approval unless it meets the original specifications. This sounds an awful lot like the typical waterfall methodology, if you ask me!

It is no wonder that managers want software development to be like manufacturing. Managers understand how to make manufacturing work, we all do. We have decades of experience in how to build physical objects efficiently and accurately. So, applying what we’ve learned from manufacturing, we should be able to optimize the software development process into the well-tuned engine that our manufacturing plants have become.

Unfortunately, the manufacturing analogy doesn’t work. Things change in business, and businesspeople know that software is soft and can be changed to meet those changing requirements. This means architecture, designs, code, and tests must all be created and revised in a fashion more agile than the leanest manufacturing processes can provide. And there we have the magic word: “Agile”. In today’s rapidly changing environment, flexibility is key, and it is only achievable through agile processes.

What do you think?

Wednesday, December 8, 2010

Deleting Team Foundation Server Team Projects

If you want to remove a team project from Team Foundation Server because it is no longer required, you can use the TFSDeleteProject command-line tool. This tool can also be used when components remain undeleted after an unsuccessful team project creation.

TFSDeleteProject [/q] [/force] [/excludewss] /collection:URL TeamProjectName

You can find the TFSDeleteProject command-line tool in Drive:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE on any client computer that runs Team Explorer.

Remark: TFSDeleteProject permanently destroys the team project, after which it cannot be recovered. Back up all important project data before using TFSDeleteProject.
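A concrete invocation might look like this (the server URL, collection, and project name are placeholders; substitute your own):

```
TFSDeleteProject /collection:http://myserver:8080/tfs/DefaultCollection MyOldProject
```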

Tuesday, December 7, 2010

31 days of Windows Phone

Interested in the Windows Phone 7? But you don’t know where to start?

First, download the free Windows Phone 7 developer tools. The download installs the following:

  • Visual Studio 2010 Express for Windows Phone – Free edition of VS 2010 for Phone development.
  • Express Blend 4 for Windows Phone – Free version of Blend for Windows Phone 7 Development.
  • Silverlight for Windows Phone 7 – Rich framework for building great applications for Windows Phone 7.
  • XNA Game Studio for Windows Phone 7 – Rich framework that enables you to build great 2D and 3D games for Windows Phone 7.
  • Windows Phone Emulator – A hardware accelerated emulator that allows you to run and debug your applications and games without requiring a phone.
  • Phone Registration Tool – When you get a device, this allows you to “unlock” the device so you can run/debug your application on it, using your Marketplace account.

Afterwards, read the 31 Days of Windows Phone 7 blog series by Jeff Blankenburg. And then it’s all up to your creativity!


Monday, December 6, 2010

NHibernate 3.0 GA released

The GA (General Availability, i.e. final) version of NHibernate 3.0 was released yesterday. Go get it!

The most important improvements are the ability to use lambda expressions and a full-blown LINQ provider. Plans for version 3.1 include additional bug fixes and patches, as well as enhancements to the new LINQ provider.
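As a small illustration, the new LINQ provider is exposed through the Query<T>() extension method on ISession. A minimal sketch (the Product entity, its properties, and the open session are assumptions for the example, not part of the announcement):

```csharp
using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Linq; // brings the Query<T>() extension method into scope

public static class ProductQueries
{
    // Assumes a mapped Product entity with Name and Price properties
    public static IList<Product> FindExpensiveProducts(ISession session, decimal minPrice)
    {
        // The LINQ provider translates this expression tree into SQL at runtime
        return session.Query<Product>()
                      .Where(p => p.Price >= minPrice)
                      .OrderBy(p => p.Name)
                      .ToList();
    }
}
```

The same query can also be written with the new lambda-based QueryOver API if you prefer a thinner abstraction over Criteria.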

Free e-books for .NET programmers

“There is no such thing as a free lunch.”

Most of the time this is true, but sometimes you find a lot of information for free!

I noticed this blog post by Anoop Madhusudanan where he mentions 7 freely available e-books for .NET programmers and architects.

If you need a (cheap) gift under the Christmas tree, you’ve found it :-)


Sunday, December 5, 2010

Visual Studio 2010 Extensions: Colored Console Application Template

With the release of Visual Studio 2010, creating and finding extensions became a lot easier thanks to the built-in Extension Manager.

One of the great extensions I discovered is the Colored Console Application Template.


Creating a console application in Visual Studio has always been easy. But the standard Console Application project is rather … empty. The moment you need a mature console application, you probably start adding features like:

  • Functional style command line parser
  • Console Coloring
  • Help

With the Colored Console Application Template you no longer have to add these features yourself: you get them out of the box, along with a lot of other features.


You can download the template directly from Visual Studio through the Extension Manager or go here and install it yourself.

Saturday, December 4, 2010

Error adding test case, there is no test with specified id.

Last week I was testing the Coded UI Test feature of Visual Studio 2010. But when I ran the test in a lab management virtual environment, the test returned an error. In the error log I found the following message:

Error adding test case [xx] to test run: There is no test with specified Id {xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}.

I found out that the error was caused by the fact that I was running the test against a build that didn’t contain the Coded UI Test associated with the test case. The problem was that I had scheduled a new test run in Microsoft Test Manager without changing the build.

To solve the error, I queued a build, verified that it succeeded, and then changed the build associated with the test plan. So make sure that your test plan always points to the correct build.

Friday, December 3, 2010

Visual Studio 2010 Feature Pack 2

Visual Studio 2010 Feature Pack 2 is now available to MSDN subscribers. It introduces some new testing features in Visual Studio 2010:

  • Use Microsoft Test Manager to capture and playback action recordings for Silverlight 4 applications.
  • Create coded UI tests for Silverlight 4 applications with Visual Studio 2010 Premium or Visual Studio 2010 Ultimate.
  • Edit coded UI tests using a graphical editor with Visual Studio 2010 Premium or Visual Studio 2010 Ultimate.
  • Use action recordings to fast forward through manual tests that need to support Mozilla Firefox 3.5 and 3.6.
  • Run coded UI tests for web applications using Mozilla Firefox 3.5 and 3.6 with Microsoft Visual Studio 2010 Premium or Visual Studio 2010 Ultimate.

For more details, see

Microsoft also created some videos on the new Testing features in Visual Studio 2010 Feature Pack 2:

Thursday, December 2, 2010

Check-in Policy override feature

In Team Foundation Server, you have the concept of check-in policies. This feature allows you to define a set of checks that have to succeed before a developer can check in their code. But a developer can always override these check-in policies and check in anyway.

A lot of customers ask if they can block users from overriding the policy. To be clear: you cannot disable this feature. However, you can get alerted when someone overrides the policies:

  • Click Team --> Alerts Explorer.
  • Add a CheckinEvent.
  • Set Alert Definition to PolicyOverrideComment<>''

Wednesday, December 1, 2010

Debugging SQL with SQL Management Studio

I think we all agree that testing stored procedures and functions on the database tier can be time-consuming: they are hard to debug, and it is sometimes difficult to get clarity on what is “happening”. To help you understand what’s going on, you can use the built-in debugging features of Microsoft SQL Management Studio. These allow you to see exactly what is going on and step through your logic in a similar fashion as in Visual Studio.

To get you going, check this post (Debugging SQL Queries, Functions &amp; Stored Procedures with SQL Management Studio’s Integrated Debugger) by Doug Rathbone.