Wednesday, June 30, 2010

Testing NULL values with PowerShell

When a developer has to do an IT pro's job, he does what he knows best: write code. So I started writing some PowerShell scripts to automate some administrative tasks.

One thing you shouldn't forget is that PowerShell is a lot friendlier towards NULL values than C#, but you should still check your objects for NULL. In PowerShell this is very clean and easy to do.

To see if a variable is null, simply check:

   If (!$Variable) {some action}
Conversely, to verify if the variable has any value:
 
   If ($Variable) {some action}

Be aware that these checks test for "truthiness" rather than strictly for null: $false, 0 and empty strings will also fail the If ($Variable) test. If you need to check strictly for null, compare against $null explicitly, e.g. If ($null -eq $Variable).

IIS Express: Combining the best of ASP.NET development Server and IIS

As a .NET developer I normally run and debug my ASP.NET (MVC) sites and WCF services using one of two web servers:

  • The ASP.NET Development Server that comes built-into Visual Studio
  • The IIS Web Server that comes built-into Windows

Both of the above options have their pros and cons: the ASP.NET Development Server is very easy to use, while IIS has a lot more power and features. Today Scott Guthrie announced a new, free option, IIS Express, that combines the best characteristics of both.

IIS Express will work with VS 2010 and Visual Web Developer 2010 Express, will run on Windows XP and higher systems, does not require an administrator account, and does not require any code changes to use.  You will be able to take advantage of it with all types of ASP.NET applications, and it enables you to develop using a full IIS 7.x feature-set.

Read more about it in Scott’s post: http://weblogs.asp.net/scottgu/archive/2010/06/28/introducing-iis-express.aspx

SQL Server: Showing a list of open database connections

Today I wanted to drop a specific database, but there was one account still connected. I had no idea who was still using the database, so I needed a way to get the list of open database connections.

There are 2 commands that helped me out here:

  • sp_who
  • sp_who2

The sp_who system procedure allows users to view current activity on the server. This command provides a view into several system tables (e.g., syslocks, sysprocesses, etc.). The sp_who command returns the following information:

  • spid—The system process ID.
  • status—The status of the process (e.g., RUNNABLE, SLEEPING).
  • loginame—Login name of the user.
  • hostname—Machine name of the user.
  • blk—If the process is being blocked, this value is the SPID of the blocking process.
  • dbname—Name of the database the process is using.
  • cmd—The command currently being executed (e.g., SELECT, INSERT).
The sp_who2 system procedure returns the same information, plus the following additional columns:
  • CPUTime—Total CPU time the process has taken.
  • DiskIO—Total amount of disk reads for the process.
  • LastBatch—Last time a client called a procedure or executed a query.
  • ProgramName—Application that initiated the connection (e.g., Visual Basic, MS SQL Query Analyzer).
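If you prefer to inspect the connections from code instead of from a query window, a quick sketch like the one below does the trick. It executes sp_who2 through plain ADO.NET and filters the rows on the database you want to drop (the connection string and the database name "MyDatabase" are hypothetical placeholders):

   using System;
   using System.Data;
   using System.Data.SqlClient;

   class OpenConnections
   {
       static void Main()
       {
           using (SqlConnection conn = new SqlConnection("Data Source=.;Integrated Security=SSPI"))
           using (SqlCommand cmd = new SqlCommand("sp_who2", conn) { CommandType = CommandType.StoredProcedure })
           {
               conn.Open();
               using (SqlDataReader reader = cmd.ExecuteReader())
               {
                   while (reader.Read())
                   {
                       // Only show the sessions connected to the database we want to drop.
                       string dbName = reader["DBName"] as string;
                       if (string.Equals(dbName, "MyDatabase", StringComparison.OrdinalIgnoreCase))
                       {
                           Console.WriteLine("{0}\t{1}\t{2}\t{3}",
                               reader["SPID"], reader["Login"],
                               reader["HostName"], reader["ProgramName"]);
                       }
                   }
               }
           }
       }
   }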

Nice quote for Team leads and Scrum Masters

“Never tell people how to do things. Tell them what to do and they will surprise you with their ingenuity.”

George S. Patton

Community Day session: Building an enterprise application with Silverlight and NHibernate

At Community Day 2010, together with Gill Cleeren, I gave a session called Building an enterprise application with Silverlight and NHibernate. We not only showed Silverlight and NHibernate but also looked into topics like CQRS (Command-Query Responsibility Segregation), OData, MVVM, and so on.

Our demo application had a lot more features than we could show, so certainly have a look at the code.

Friday, June 25, 2010

OData: Using the reflection based provider

If you start implementing WCF Data Services, sooner or later you will certainly see the following error when calling the OData feed:

“On data context type 'XXXXX’, there is a top IQueryable property 'XXXX' whose element type is not an entity type. Make sure that the IQueryable property is of entity type or specify the IgnoreProperties attribute on the data context type to ignore this property.”

This can mean two things. Either you have a property that cannot be processed by the Entity Data Model used by OData or OData could not find a primary key for your class.

OData requires a primary key for every resource. If you use the reflection based provider, it uses one of the following mechanisms to find the primary key:

  • via the [DataServiceKey] attribute
  • by looking for a "FooID" property, where the type is called "Foo"
  • by looking for an "ID" property

It's important to know that this convention is case-sensitive: "FooId" won't work without a [DataServiceKey] attribute.
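As a quick illustration, this is roughly what an entity with a non-conventional key property looks like (the Product class is a made-up example):

   using System.Data.Services.Common;

   // "Id" does not match the case-sensitive conventions ("ProductID" or "ID"),
   // so we have to point the reflection provider at the key explicitly.
   [DataServiceKey("Id")]
   public class Product
   {
       public int Id { get; set; }
       public string Name { get; set; }
   }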

Integrating NHibernate and WCF by a little bit of IoC

Looking for some information on how to optimize the NHibernate integration in your WCF services?

Have a look at the following blog series:

Model based frameworks are the future

As a frequent user of a lot of open-source frameworks, I see an interesting architectural trend emerging. More and more frameworks are built on top of an internal model. Model-based frameworks separate the configuration-time activities from the run-time activities. This adds some complexity for the framework designer, but opens up many opportunities for some really cool features.

Separate configuration and runtime

The framework is designed in such a way that the configuration model and the API are completely separated. The configuration API becomes just another consumer and manipulator of the configuration model. With a model-based approach, the framework allows us to manipulate and extend it on top of the model. This allows me, as a consumer of the framework, to get the maximum out of it. Hopefully this trend will also be adopted by our friends at Microsoft…
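To make the idea concrete, here is a minimal, made-up sketch (all names are hypothetical): the fluent configuration API does nothing but mutate a plain model object, and the runtime consumes that same model, so any other tool or extension can inspect and manipulate it as well.

   // The model: plain data, no behavior, fully inspectable.
   public class RouteModel
   {
       public string UrlPattern { get; set; }
       public string HandlerTypeName { get; set; }
   }

   // The configuration API: just another consumer/manipulator of the model.
   public class RouteConfigurator
   {
       private readonly RouteModel model = new RouteModel();

       public RouteConfigurator MapUrl(string pattern)
       {
           model.UrlPattern = pattern;   // configuration time: only edits the model
           return this;
       }

       public RouteConfigurator ToHandler<THandler>()
       {
           model.HandlerTypeName = typeof(THandler).AssemblyQualifiedName;
           return this;
       }

       // The runtime works from the finished model, not from the fluent API.
       public RouteModel Build()
       {
           return model;
       }
   }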

If you want to learn more about this, read Chad Myers' blog post.

Downloading files using ASP.NET MVC

For a customer project in ASP.NET MVC 2, we added some functionality to download files. This can easily be done in ASP.NET MVC with the FileStreamResult class. Our original code looked something like this:

   private ActionResult CreateFileStreamResult(string filepath, string fileResultName, string contentType)
   {
       FileStreamResult result = new FileStreamResult(new FileStream(filepath, FileMode.Open), contentType);
       result.FileDownloadName = fileResultName;
       return result;
   }

Not long after deployment in our test environment, we discovered that it didn't work. Instead we got back the following exception:

System.UnauthorizedAccessException: Access to the path is denied 

We double-checked the application pool, but the user account we used had the necessary read privileges on the target folder. After opening the FileStream class in Reflector, I noticed the cause of our problem. If you use the constructor that takes only a file path and a FileMode, the System.IO.FileStream implementation of this constructor is:

   public FileStream(string path, FileMode mode)
       : this(path, mode, (mode == FileMode.Append) ? FileAccess.Write : FileAccess.ReadWrite, FileShare.Read, 0x1000, FileOptions.None, Path.GetFileName(path), false)
   {
   }

This constructor asks for ReadWrite access! That's why a System.UnauthorizedAccessException was thrown in our test environment: the user account under which our application runs has no write access.

As our application needs only Read access to the folder, we changed the code to use the following FileStream constructor:

   using (FileStream fs = new FileStream(filepath, FileMode.Open, FileAccess.Read))
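For completeness, the corrected helper then looks something like this. Note that inside the MVC helper you should not wrap the stream in a using block: FileStreamResult writes and disposes the stream only after the action has returned.

   private ActionResult CreateFileStreamResult(string filepath, string fileResultName, string contentType)
   {
       // Open the file read-only, so the worker process account doesn't need write access.
       FileStream stream = new FileStream(filepath, FileMode.Open, FileAccess.Read);
       FileStreamResult result = new FileStreamResult(stream, contentType);
       result.FileDownloadName = fileResultName;
       return result;
   }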

It would probably have been better if Read access were the default in System.IO.FileStream rather than ReadWrite.

My WPF Toolbox

Although the Visual Studio WPF Designer improved a lot in Visual Studio 2010, there are still some external tools that I use in my day to day WPF development.

Snoop

Snoop is a standalone WPF inspector that can attach to the WPF application of your choice and let you explore its visual content. It provides a couple of very handy features for identifying elements in your application. But it also allows you to do a lot more, including:

  • Identifying recently changed properties.
  • Editing writable properties.
  • Identifying unset properties.
  • Delving into property bindings.
  • Previewing and zooming a specific element.
  • Zooming in 3D to show the visual composition of an element.

http://www.blois.us/Snoop/

Kaxaml

Kaxaml is a lightweight XAML editor that gives you a "split view" so you can see both your XAML and your rendered content (kind of like XamlPad but without the gigabyte of SDK).

http://www.kaxaml.com/

XAML Power Toys 

XAML Power Toys is a Visual Studio 2008 SP1 and Visual Studio 2010 add-in that empowers WPF & Silverlight developers while working in the XAML editor. Its Line of Business form generation tools, Grid tools, and DataForm, DataGrid and ListView generation really shorten the XAML form layout time.

http://karlshifflett.wordpress.com/xaml-power-toys/

Do you know some other useful tools?

Bad Code, Craftsmanship, Engineering, and Certification

It's always a pleasure to see Robert C. Martin in action, so don't miss his keynote at QCon London 2010, where he tries to figure out why so much bad code is written. He offers advice on writing good code, talking about a bad code example, the Boy Scout rule, functions, arguments, craftsmanship, TDD, continuous integration, pairing, small cycles, patterns, engineering, certification, and other elements contributing to quality code.

A must see!

TFS 2010 Status Reports

As a TFS administrator, knowing what's going on inside your server is crucial. The TFS Administration Console is your first stop for monitoring the server status. However, it doesn't show you anything about the queued and executing jobs.

Luckily for us, the TFS team released two reports that give better insight into what is happening on the server. To get all the details, go to Grant Holliday's blog and follow his instructions.

After installation you get 2 reports:

Warehouse Status Report

The first part of the report shows you the overall status, similar to the 'Reporting' tab in the Team Foundation Administration Console. This is a quick and easy way to find out whether an Incremental or Full processing run is in progress. It will also show you any errors (like warehouse schema conflicts) in the 'Last Run' column.

The second part of this report is useful after an upgrade or when the warehouse needs to be rebuilt manually.  It shows you each of the data adapter sync jobs for each collection and their current status. During normal operation, these will run very quickly as data changes in the operational stores, so you’ll probably always see them as “Idle”. It will also show you any errors from previous job executions in the ‘Last Run’ column.

Job Status Report

The first part of this report shows you the job definitions for the instance and the interval they're set to run on. This is useful for checking whether a job has somehow been disabled or changed.

The second part of this report shows you the job history.

Friday, June 11, 2010

Uploading file streams over HTTP using WCF

Although WCF is really great, for some scenarios it can be really hard to get everything up and running correctly. One of these scenarios is uploading files.

Some important considerations:

Choose wisely between the streamed and buffered option

You have two main options for uploading files in WCF:

  • streamed
  • buffered/chunked

If you want reliable data transfer, you'll have to use the buffered/chunked option. Reliable messaging cannot be used with streaming, as the WS-RM mechanism requires processing the data as a unit to apply checksums, etc. If you do not need the robustness of reliable messaging, streaming lets you transfer large amounts of data using small message buffers without the overhead of implementing chunking.

Secure your streams

Streaming over HTTP requires you to use basicHttpBinding, so you will need SSL to encrypt the transferred data. Buffered transfer, on the other hand, can use wsHttpBinding, which by default provides integrity and confidentiality for your messages, so there is no need for SSL there.
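If you prefer to configure the binding in code rather than in config, a streamed basicHttpBinding secured with SSL looks something like this (a minimal sketch; the 64 MB limit is an arbitrary example):

   using System.ServiceModel;

   public static class BindingFactory
   {
       public static BasicHttpBinding CreateStreamedBinding()
       {
           // Transport security = SSL on the wire.
           BasicHttpBinding binding = new BasicHttpBinding(BasicHttpSecurityMode.Transport);
           binding.TransferMode = TransferMode.Streamed;
           binding.MaxReceivedMessageSize = 64L * 1024 * 1024; // allow requests up to 64 MB
           return binding;
       }
   }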

Use message headers for meta-data

   [OperationContract(Action = "UploadFile", IsOneWay = true)]
   void UploadFile(UploadMessage request);

   [MessageContract]
   public class UploadMessage
   {
       [MessageHeader]
       public string MetaData { get; set; }

       [MessageBodyMember]
       public System.IO.Stream File { get; set; }
   }


WCF requires that the stream object is the only item in the message body for a streamed operation. Therefore headers are the recommended way for sending meta-data when streaming.
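On the service side, the implementation can then read the stream in small chunks, so the file is never buffered in memory as a whole. A sketch, under the assumption that the MetaData header carries the original file name and that the service writes to a hypothetical C:\Uploads folder (IUploadService is the assumed contract interface):

   using System.IO;

   public class UploadService : IUploadService // hypothetical service contract
   {
       public void UploadFile(UploadMessage request)
       {
           string target = Path.Combine(@"C:\Uploads", Path.GetFileName(request.MetaData));

           using (FileStream output = new FileStream(target, FileMode.Create, FileAccess.Write))
           {
               // Copy the incoming stream chunk by chunk.
               byte[] buffer = new byte[4096];
               int bytesRead;
               while ((bytesRead = request.File.Read(buffer, 0, buffer.Length)) > 0)
               {
                   output.Write(buffer, 0, bytesRead);
               }
           }
       }
   }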


Configure the transferMode on the client


The setting for transferMode does not propagate to clients when using an HTTP binding. You must manually edit the client config file to set transferMode = "Streamed" after using 'Add service reference' or running SVCUTIL.EXE.


ASP.NET Development Server limitations


If you try to run your streaming web service on the ASP.NET Development Server, expect a big failure. Cassini supports neither streamed transfers/MTOM encoding nor SSL.

So configure your service to be hosted in IIS.


Uploading large data streams over HTTP


WCF streaming is not only limited by the maxReceivedMessageSize setting, but also by IIS/ASP.NET, which has a limit on the size of an HTTP request to prevent denial-of-service attacks. So for WCF streaming in IIS to work properly, you need to update your web.config and add the following line:


   <!-- maxRequestLength (in KB), max size: 2048 MB -->
   <httpRuntime maxRequestLength="2000000" />

After that you are finally streaming large amounts of data over HTTPS.

Wednesday, June 9, 2010

Podcast NHibernate stump the expert

If you want to know everything about NHibernate, here is your chance to get 4 hours (!) of content with one of the creators of NHibernate, Ayende.

One nice quote: “There are no problems with NHibernate, there are only problems with people using NHibernate”.  Enough to set the tone…

Watch the podcast here: http://skillsmatter.com/podcast/open-source-dot-net/nhibernate-tutorial-by-ayende-rahien

Logging the StackTrace

As I was implementing some logging in an application, I thought it would be nice to include the stack trace in the log information. I know that you can easily get this information in case of an error, but I did not know if you could do the same thing when you just want to log something without any error.

As I found out, this is possible thanks to the StackTrace class. You can get the frames using the StackTrace.GetFrames method.

   using System;
   using System.Diagnostics;

   public class Program
   {
       [STAThread]
       public static void Main()
       {
           StackTrace stackTrace = new StackTrace();           // get call stack
           StackFrame[] stackFrames = stackTrace.GetFrames();  // get method calls (frames)

           // write call stack method names
           foreach (StackFrame stackFrame in stackFrames)
           {
               Console.WriteLine(stackFrame.GetMethod().Name); // write method name
           }
       }
   }
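If you also want file names and line numbers in your log, you can ask the StackTrace to capture file information as well; this only yields results when the .pdb symbol files are deployed next to your assemblies. A small sketch:

   // Pass 'true' to also capture file info (requires .pdb files).
   StackTrace stackTrace = new StackTrace(true);
   foreach (StackFrame frame in stackTrace.GetFrames())
   {
       Console.WriteLine("{0} ({1}:{2})",
           frame.GetMethod().Name,
           frame.GetFileName(),         // null when no symbols are found
           frame.GetFileLineNumber());  // 0 when no symbols are found
   }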

The directive or the configuration settings profile must specify the ‘varyByParam’ attribute.

When I was trying to implement caching in an ASP.NET MVC 2 web application, I got the following error after adding the [OutputCache] attribute.

The directive or the configuration settings profile must specify the ‘varyByParam’ attribute.

As I applied the OutputCache attribute on an ActionResult method with no input parameters, I thought that VaryByParam would be of no use in this case. But based on the error, it seems that I have to specify this parameter.

So the next question is: what should I enter as a value if I have no parameters to cache against? As I found out here, you have to use the magic string "none" when you don't have any parameters.
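Putting it together, the attribute on a parameterless action looks something like this (the controller and the cache duration of 60 seconds are just examples):

   using System.Web.Mvc;

   public class HomeController : Controller
   {
       // Even without action parameters, VaryByParam must be set explicitly;
       // "none" tells ASP.NET to cache a single version of the output.
       [OutputCache(Duration = 60, VaryByParam = "none")]
       public ActionResult Index()
       {
           return View();
       }
   }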

Adding reports and dashboards to upgraded Team Projects

So you are done upgrading your Team Foundation Server from 2005 or 2008 to 2010. Now what?

Maybe it’s a good idea to start provisioning your upgraded team projects by using the new dashboards and reports that are provided with version 5.0 of the Microsoft Solutions Framework (MSF) process templates. You can also add dashboards or reports to an existing team project when resources for SharePoint Products or SQL Server Reporting Services become available.

To provision your upgraded team project, you can create a batch file and run it from the Visual Studio command window. The batch file command provisions features in the same way that the New Team Project Wizard does. When you run the batch file to add dashboards and Excel reports, the command uploads all documents that are provided with the process template that you specify and provisions the dashboards and Excel reports. The command also activates the SharePoint site and dashboard features that are specified in the process template.

Of course you don't have to create this batch file yourself; a very good description of how to configure and run this batch script is available on the MSDN site:

Introducing the Linfu Framework

As I was looking for a dynamic proxy implementation (and I did not want to write one myself), I stumbled upon LinFu.

LinFu is a lightweight framework that extends the .NET framework with some really useful functionality. It supports the following features:

  • Dynamic Proxies
  • Duck Typing and Late Binding
  • Ruby-style Mixins
  • Delegates with Lambda Arguments (aka "Currying")
  • Universal Event Handling
  • A Very, Very Simple IoC Container
  • Design by Contract

If you are interested in learning about the different features and how you can use them, certainly have a look at the following blog series:

A Microsoft Scrum template for TFS

Microsoft has always delivered two process templates with TFS: MSF Agile and MSF for CMMI.  Next to this there have always been a lot of third party process templates available.

With TFS 2010 a new MSF Agile template was released. As this template felt very 'Scrum-ish' but wasn't really Scrum, teams trying to do Scrum with the MSF Agile template had a hard time matching the concepts and principles to the template. For example, it uses different terminology, like Iteration rather than Sprint, User Story rather than Product Backlog Item, etc.

This week Microsoft released a beta version of a TFS process template specifically optimized for Scrum projects: Team Foundation Server Scrum v1.0 Beta.

So if you want to do Scrum (or are already doing it) and you are using TFS, certainly check out this template. I haven't tried it myself yet; that will be something for the upcoming weeks. I'll post more about it later.

And don’t forget that there is still the excellent Scrum for Team System out there.

Tuesday, June 8, 2010

Visual Studio 2010 Extensions

I guess the Microsoft product teams had some time left, as new Visual Studio extensions have been released:

One feature I like a lot is the ability to create code based on your UML diagrams.


NHibernate 2 Beginner's Guide

With the increasing interest in ORM in the .NET world, it would be a shame if you only looked at LINQ to SQL and Entity Framework. With the release of a new book on NHibernate, NHibernate 2 Beginner's Guide by Aaron Cure, you have no excuse not to try the richness and power of NHibernate.

If you are interested, there's a free chapter you can download; you can get it from here.