# Wednesday, August 27, 2008

Microsoft recently released the ASP.Net Model View Controller framework (MVC).  It is currently available as Preview 3 and can be downloaded at http://www.microsoft.com/downloads/details.aspx?FamilyId=92F2A8F0-9243-4697-8F9A-FCF6BC9F66AB&displaylang=en

A new MVC project contains a couple sample views and controllers so you can get an idea of the proper syntax to use. 

This article builds on the application created in my last ASP.Net MVC tutorial.  If you have not already done so, please follow the brief steps in the previous MVC tutorial before beginning this one.

In the last tutorial, we added a model, view and controller to display a list of customers that one can navigate to using a URL formatted as controller/action.  In this article, we will add a new view and controller to an existing MVC project and display details of a single customer using a URL formatted as controller/action/id.  The id is passed automatically to the action method and allows us to filter to a single customer.

1.       Open Visual Studio 2008 and open the TestMVC solution created in MVC Tutorial 2.

2.       Open the Solution Explorer (View | Solution Explorer) and expand the Controllers folder.  Double-click CustomerController.cs to open it in the code editor.

a.       In the CustomerController class, we will create a new action to get the details of a single customer.   

i.      Add the following private GetCustomer method to the CustomerController class.  In the last tutorial, we wrote methods to retrieve customer 1 and customer 2.  For simplicity, we will get Customer 1 if ID 1 is passed in to our method and Customer 2 if any other ID is passed.

        private Customer GetCustomer(int custID)
        {
            if (custID == 1)
            {
                return GetCustomer1();
            }
            else
            {
                return GetCustomer2();
            }
        }

ii.      Add an Action method to the CustomerController class to get the details of a customer.  Paste the following code into CustomerController.cs.

        public ActionResult Details(int id)
        {
            Customer cust = GetCustomer(id);
            return View("Details", cust);
        }

iii.      In the Details method, we get the details of a single customer and return a view.  Unlike the generated code, we explicitly specify which view to return (“Details”) and we pass in some extra data (cust) that the view will consume.

3.       Add a view to the project.

a.       In the Solution Explorer, right-click the Views\Customer folder and select Add | New Item.  The Add New Item dialog displays.

    Figure 1

i.      Under Categories, select Visual C#\Web\MVC.

ii.      Under Templates, select MVC View Content Page.

iii.      In the Name textbox, enter “Details”.

iv.      Click the Add button.  The Select a Master Page dialog displays.

   Figure 2

1.       Navigate to the Views\Shared folder and select Site.Master.

2.       Click the OK button to add this view content page to the project.

b.      Add visual elements to the View

i.      If it is not already open, open the Details view by double-clicking Details.aspx in the Solution Explorer.  Click the Source tab at the bottom of the editor.

ii.      Replace the code in Details.aspx with the following

<%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" AutoEventWireup="true" CodeBehind="Details.aspx.cs" Inherits="TestMVC.Views.Customers.Details" %>
<%@ Import Namespace="TestMVC.Views.Customers"%>
<%@ Import Namespace="TestMVC.Models" %>

<asp:Content ID="Content1" ContentPlaceHolderID="MainContent" runat="server">
    <div>
        CustID:
        <%=((Customer)ViewData.Model).CustID %><br />
        Name:
        <%=((Customer)ViewData.Model).FirstName %>
        <%=((Customer)ViewData.Model).LastName %>
        <br />
        Address:
        <%=((Customer)ViewData.Model).StreetAddress%><br />
        City:
        <%=((Customer)ViewData.Model).City %><br />
        State:
        <%=((Customer)ViewData.Model).State %><br />
        ZIP:
        <%=((Customer)ViewData.Model).PostalCode%><br />
    </div>
</asp:Content>


The above code displays details of a single Customer model object. 

4.       Modify the List view

a.       Open Views\Customers\List.aspx by double-clicking it in Solution Explorer.

b.      Find the <td>&nbsp;</td> cell following the cust.PostalCode cell.  Replace it with the following code.  This generates a hyperlink displaying the text “Details”.  The URL of the hyperlink will contain the current controller (“Customer”), the “Details” action, and the ID of the current customer.

<td>
<%=Html.ActionLink("Details", "Details", new { ID=cust.CustID})%>
</td>

5.       Test the application

a.       Save and compile the solution (Build | Build Solution).   Correct any errors you find.

b.      Run the solution (Debug | Start Debugging).  Depending on the port number used by Cassini, it should display in your browser with a URL such as
http://localhost:4152/Home

c.       Navigate to the List page by changing the URL to
http://localhost:4152/Customers/List
(Replace the port number if necessary)

d.      You should see a list of 2 customers in your browser.  Each customer should have a hyperlink labeled “Details”. 

    Figure 3

e.      Click the hyperlink next to customer 1.  The URL should change to
http://localhost:4152/Customers/Details/1
and the details of the first customer should display on screen.

    Figure 4

f.        Set a breakpoint in the Details method of CustomerController.cs and refresh the page to step through the code as it executes.

In this article, we added a view and controller to our application and used these to retrieve and display customer details by the Customer ID. 

.Net | ASP.NET | MVC
Wednesday, August 27, 2008 1:26:29 PM (GMT Daylight Time, UTC+01:00)
# Tuesday, August 26, 2008

There's nothing quite like riding on a bus for 27 hours with a few dozen software developers.

That's exactly what I did last weekend on my way to and from DevLink in Murfreesboro, TN.

The DevLink bus was the result of much hard work by Amanda, who did most - if not all - of the organizing.

The bus began its odyssey Wednesday night in Grand Rapids, MI before proceeding to Lansing.  When the bus picked me up in Plymouth, MI at 9:15 PM, it had already been traveling for 3 hours.  From there, it rolled on to Toledo, Columbus and Cincinnati (its final southbound stop at 4AM) before proceeding to its final destination in Murfreesboro, Tennessee. 

The first half of the ride was great fun.  I met new people; I reconnected with old friends; and I met people face-to-face with whom I had only communicated electronically.

Even the two times the bus driver got lost couldn't dampen our spirits.  We were having too much fun.

After about 4AM, the trip began to drag.  Everyone was exhausted and the seats were too uncomfortable to allow more than a few minutes sleep at a time.  This was clearly a bus designed for cross-town trips – not cross-country trips.  DVDs provided some entertainment ("The Big Lebowski" and “Office Space” cracked me up) but you can only focus on movies for so long and the acoustics were less than ideal.

We arrived at our hotel worn and weary at 9:30AM Central time, over 13 hours after leaving Plymouth and 16 hours after the bus began its trip. 

24+ hours of sleep deprivation left me physically ill and I spent nearly all of Thursday in bed.

It was good that I did.  The conference began Friday morning and I awoke refreshed and ready to absorb and exchange ideas at what turned out to be an excellent conference.  (You can read more about it here.)

The ride home was an adventure.  Although the conference ended at 6PM, someone decided the bus shouldn't leave town until 9PM so we had to kill a few hours at a restaurant before heading out. 

Although we picked up at least one new traveler on the ride home, we lost a few more.  Some folks booked a flight to Chicago for a Monday business meeting and a couple people elected to rent a car and drive home rather than subject themselves to the length and discomfort of the bus ride.  I was tempted to join them but I stuck with my plan and boarded the bus at 9.

The mood was more subdued on the return trip. We still had some good conversations early in the ride but the environment lacked the energy sparked by seeing people for the first time in months.

About 2AM, I had just begun to drift to sleep when a sound like a jackhammer awakened me suddenly.  I was sitting just above the tire that blew flat traveling 65 miles an hour on I-71 just north of Louisville, KY.  We pulled off the highway in front of a Waffle House in Carrolton, KY and waited 3 hours for the tire to be fixed.  The repair would have taken far less time if the repair guy had not allowed a drunken Waffle House customer to play with the lug nut gun.  He sheared off the bolts on the wheel, forcing the repair guy to return to his shop for more parts and weld on new bolts.

Waffle House provided some entertainment.  Corey attempted to start a relationship with one of the waitresses; and the manager tossed out a drunk who wouldn't stop ordering pork chops long after he was told the restaurant was out of pork chops.  Apparently the Carrolton Waffle House is a magnet for drunks at 2AM Saturday night.  Go figure.

We got back on the highway about 5AM so exhausted that even the uncomfortable seats wouldn't prevent a few hours dozing.

I thought things would go quicker after this because the drivers knew the route better, having just driven it 3 days earlier.  Alas, we became lost in northeast Ohio and drove nearly to Cleveland before turning west toward Toledo.

At 12:30 Eastern time - 14.5 hours after leaving the hotel and 17.5 hours after the end of the conference - we pulled into the parking lot in Plymouth.  I was never so happy to see my car and the 40 minute drive home seemed trivial. 

I was scheduled to read aloud at the 12:15 mass in my church and the last minute substitute is probably still angry with me for missing this, but we all arrived safely.

Was the bus ride a success?  Would I do it again?  Would I take a bus to a distant location with dozens of others like me?  I've decided that, if I can be promised a more comfortable seat, I will do it the next time it's offered.  Everything else was easily tolerable and I did enjoy the fellowship that came with such a long ride with like-minded people.  I even got a few job leads from the conversations I had on the two rides (I am between jobs for those who don't know).

If you go on such a trip, my only advice is

  1. Be prepared for anything
  2. If the Waffle House waitress in Carrolton, Ky tells you they are out of pork chops, shut up and order the hash browns. 

Note: Click here to see more photos from DevLink and the famous bus ride.

Tuesday, August 26, 2008 4:38:58 PM (GMT Daylight Time, UTC+01:00)

People attend conferences for many different reasons.  Some come for the content of the lectures; some come to meet and hear well-known speakers; some come to meet and network with others in the industry; some come to see old friends.

Me, I come for all those reasons.  At DevLink last week in Murfreesboro, TN, I experienced all those things and more. 

But I also experienced something new.  I had heard of Open Spaces in the past but had not experienced them.  At DevLink, Open Spaces were promoted heavily as a different way of exchanging ideas.  I was curious and gave it a try.

An Open Space event consists of developers sitting together, roughly in a circle, and exchanging ideas with one another.  A topic is picked in advance by the group but the conversation is not limited to that topic.  If the conversation drifts from the assigned topic and the group remains engaged, this is perfectly all right.  The important thing is that ideas are exchanged and the group remains passionate about the conversation.

And I heard a great deal of passion at the DevLink Open Spaces that I attended.

During the event, I attended 3 Open Spaces sessions plus the planning session (where topics were picked) and the wrap-up session (where the group reviewed the open spaces of the previous 2 days).  In each session I attended, I heard bright people sharing great ideas.  Sometimes we argued and sometimes we were in violent agreement but I enjoyed it all. 

In a session on Service Oriented Architecture, I argued earnestly that, due to the costs of SOA, support from the top was necessary for SOA to succeed within any organization.  Most of the other loud persons in the group insisted that newer tools such as WCF had lowered the cost of SOA sufficiently that a strong grass roots effort could drive SOA in an organization.  By the end of the session, I think we had all learned something and moved a little toward understanding the others' side.

I did attend a few traditional sessions in which a speaker stands in front of a classroom and delivers a lecture to an audience that is mostly passive.  Richard Campbell and Carl Franklin were two of the speakers at this conference and I have long been a fan of their .Net Rocks podcast, so I made a point to attend a lecture by each of them.  Both were good sessions but they were easily topped by Joe Wirtley who gave an excellent talk on WPF.  It was excellent because it focused on building a business application, rather than the eye candy that clutters so many WPF presentations.

Overall the conference was a great success.  It drained me of energy but it fired me up at the same time.

And I haven't even told you about the 28 hours I spent riding a bus with a few dozen techno-geeks.  Or the flat tire that left us stranded in Carrolton, KY for 3 hours at 2AM.  But that’s another story.

Note: Click here to see more photos from DevLink

Tuesday, August 26, 2008 4:49:27 AM (GMT Daylight Time, UTC+01:00)
# Friday, August 22, 2008

Microsoft recently released the ASP.Net Model View Controller framework (MVC).  It is currently available as Preview 3 and can be downloaded at http://www.microsoft.com/downloads/details.aspx?FamilyId=92F2A8F0-9243-4697-8F9A-FCF6BC9F66AB&displaylang=en

A new MVC project contains a couple sample views and controllers so you can get an idea of the proper syntax to use. 

In this article, we will add a new model, view and controller to an existing MVC project. 

1.       Open Visual Studio 2008 and create a new MVC project.  For information on how to create a new MVC project, see http://www.davidgiard.com/2008/08/18/TheASPNetMVCSampleAppDemystified.aspx

2.       Open the Solution Explorer (View | Solution Explorer) and select the Models folder.

3.       Add a Model to the project

a.       Right-click the Models folder and select Add | Class.  The Add New Item dialog displays.
    Figure 1

i.      At the Name textbox, enter “Customer”.

ii.      Click the OK button to create a Customer class.

b.      The Customer class opens in the class editor.  This class will contain a few public properties that help describe a customer object.  Add the following code to the Customer class.

    public class Customer
    {
        public int CustID { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public string StreetAddress { get; set; }
        public string City { get; set; }
        public string State { get; set; }
        public string PostalCode { get; set; }
    }

4.       Add a Controller to the project

a.       In the Solution Explorer, right-click the Controllers folder and select Add | New Item.  The Add New Item dialog displays.
    Figure 2

i.      Under Categories, select Visual C#\Web\MVC.

ii.      Under Templates, select MVC Controller Class.

iii.      In the Name textbox, enter “CustomerController”.

iv.      Click the Add button to create the CustomerController class and open it in the class editor.

b.      In the CustomerController class, we will create some actions.  Each action will instantiate one or more Model objects and display them in a view object. 

i.      Add the following using statement at the top of CustomerController.cs. 

using TestMVC.Models;

ii.      Add the following private methods to the CustomerController class.  For now, we will create customers out of thin air (as if it were that easy).  In a real application, we would probably call a web service or query a database to get customers.

        #region private methods
        private Customer GetCustomer1()
        {
            var cust1 = new Customer
            {
                CustID = 1,
                FirstName = "David",
                LastName = "Giard",
                StreetAddress = "123 Main",
                City = "Boringville",
                State = "MI",
                PostalCode = "48108"
            };

            return cust1;
        }

        private Customer GetCustomer2()
        {
            var cust2 = new Customer
            {
                CustID = 2,
                FirstName = "John",
                LastName = "Smith",
                StreetAddress = "321 Elm",
                City = "Nowhere",
                State = "OH",
                PostalCode = "41001"
            };

            return cust2;
        }

        private List<Customer> GetAllCustomers()
        {
            Customer cust1 = GetCustomer1();
            Customer cust2 = GetCustomer2();
            List<Customer> allCusts = new List<Customer> {cust1, cust2};
            return allCusts;
        }
        #endregion

iii.      Add an Action method to the CustomerController class.  We’ll start with a List action.  Paste the following code into CustomerController.cs.

        public ActionResult List()
        {
            var allCustomers = GetAllCustomers();
            return View("List", allCustomers);
        }

iv.      In the List action, we get a list of customers (all 2 of them) and return a view.  Unlike the generated code, we explicitly specify which view to return (“List”) and we pass in some extra data (allCustomers) that the view will consume.

5.       Add a view to the project.

a.       In the Solution Explorer, right-click the Views folder and select Add | New Folder.  A new folder appears in the Solution Explorer.  Rename this folder to “Customer”.

b.      In the Solution Explorer, right-click the Customer folder and select Add | New Item.  The Add New Item dialog displays.

i.      Under Categories, select Visual C#\Web\MVC.

ii.      Under Templates, select MVC View Content Page.

iii.      In the Name textbox, enter “List”.

iv.      Click the Add button.  The Select a Master Page dialog displays. 
    Figure 3

1.       Navigate to the Views\Shared folder and select Site.Master.

2.       Click the OK button to add this view content page to the project.

c.       Add visual elements to the View

i.      If it is not already open, open the List view by double-clicking List.aspx in the Solution Explorer.  Click the Source tab at the bottom of the editor.

ii.      Replace the code in List.aspx with the following

<%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" AutoEventWireup="true" CodeBehind="List.aspx.cs" Inherits="TestMVC.Views.Customers.List" %>
<%@ Import Namespace="TestMVC.Views.Customers"%>
<%@ Import Namespace="TestMVC.Models"%>

<asp:Content ID="Content1" ContentPlaceHolderID="MainContent" runat="server">
    <h2><%= Html.Encode(ViewData["Message"]) %></h2>
    <table>
        <tr>
            <td>ID</td>
            <td>First</td>
            <td>Last</td>
            <td>Addr</td>
            <td>City</td>
            <td>State</td>
            <td>ZIP</td>
            <td>&nbsp;</td>
        </tr>
        <% foreach (Customer cust in (List<Customer>)ViewData.Model)
           { %>
            <tr>
                <td><%=cust.CustID %></td>
                <td><%=cust.FirstName%></td>
                <td><%=cust.LastName%></td>
                <td><%=cust.StreetAddress%></td>
                <td><%=cust.City%></td>
                <td><%=cust.State%></td>
                <td><%=cust.PostalCode%></td>
                <td>&nbsp;</td>
            </tr>
        <% } %>
    </table>
</asp:Content>


The above code displays a list of Customer model objects.  In a real-world example, we may choose to have the Customer model derive from a base class and only refer to the base class in the view.  This would increase the separation between our view and our model.
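
As a hedged illustration of that idea (the Person base class below is hypothetical and not part of this tutorial’s code), the model might be split roughly like this, with the view referring only to the base class:

// Hypothetical sketch: a base class the view could bind to, keeping the
// concrete Customer model out of the view entirely.
public abstract class Person
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class Customer : Person
{
    public int CustID { get; set; }
    public string StreetAddress { get; set; }
    public string City { get; set; }
    public string State { get; set; }
    public string PostalCode { get; set; }
}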

6.       Test the application

a.       Save and compile the solution (Build | Build Solution).   Correct any errors you find.

b.      Run the solution (Debug | Start Debugging).  Depending on the port number used by Cassini, it should display in your browser with a URL such as
http://localhost:4152/Home

c.       Navigate to the List page by changing the URL to
http://localhost:4152/Customers/List
(Replace the port number if necessary)

d.      You should see a list of 2 customers in your browser
    Figure 4

e.      Set a breakpoint in the List method of CustomerController.cs and refresh the page to step through the code as it executes.

In this article, we created a model, view and controller from scratch and displayed them.  You can download the code for this project here.

In the next article, we will use the ID of the URL to specify a single customer.

.Net | ASP.NET | MVC
Friday, August 22, 2008 2:02:34 PM (GMT Daylight Time, UTC+01:00)
# Thursday, August 21, 2008

I have been plagued recently by a recurring problem with Visual Studio 2005.

When I attempt to exit Visual Studio, I receive the error "Visual Studio cannot shut down because a modal dialog is open" and I am not able to exit.  The rub is that there is no modal dialog: I can close every visible window in VS, close all other apps, and resize, move, and minimize VS, and I still can find no dialogs, modal or otherwise.  Some bit inside VS is set incorrectly, convincing it that a dialog has not yet been closed.

The only solution was to open Task Manager and kill Visual Studio.

I have automatic updates turned on so, if this is a bug, I expected it would have been fixed by now. 

I found a number of posts and threads about this issue, so it is not uncommon, but nearly none of them listed a solution.

After some digging, I discovered a hotfix for this problem that is not included in the normal Windows updates. 

You can read the details of the problem from Microsoft here http://support.microsoft.com/kb/936971 and you can download the hotfix here
https://connect.microsoft.com/VisualStudio/Downloads/DownloadDetails.aspx?DownloadID=7259

One important point.  Visual Studio must be closed before installing the hotfix so you may end up killing the process via Task Manager one last time.

Thursday, August 21, 2008 12:51:23 AM (GMT Daylight Time, UTC+01:00)
# Tuesday, August 19, 2008

Every year, I become a bit more accepting of my own ignorance. 

I decided a long time ago that I wanted a career in which I could continue to learn and to expand my knowledge.   Software development affords me that opportunity because it is such a large field and because it changes so rapidly. 

It is conceivable (though extremely unlikely) I could learn everything there is to know about software development and find myself completely caught up with learning for one day.  If that miracle were to occur however, I would go to bed that night and awaken to discover that things had been invented while I slept.  And I would once again be ignorant about some things.

But that day will never come.  There is an infinite amount of knowledge to be acquired and a finite amount of time in which to learn it.  The ratio of what I know to what I don't know is likely to remain small.

Yesterday, I had lunch with a guy who knows a great deal about Windows Presentation Foundation (WPF).  Since I am extremely ignorant about WPF, it was a good opportunity to learn some new things.  But WPF contains so much that we could probably have lunch every day for months and I would still only scratch the surface of this framework. 

I want to know everything but I realize and accept that I cannot.  So what's the answer? 

The answer is in learning how to find the answers.  "I don't know" is an acceptable answer to any question, but "I can't do it" is not.  As new challenges and problems arise, we need to be able to figure them out - to find the answer.  Usually our experiences only get us so far.  We need help finding answers. 

Certainly the World Wide Web helps.  Many times, when faced with a problem, I've discovered an article or blog post written by a developer who faced and conquered a similar problem.  Search engines such as Google allow us to find these solutions more quickly. 

Books and magazines help as well.  They provide knowledge based on the experiences of the authors.  So do classes and conferences.

I've found one of the best ways to improve my knowledge base is to build a network of smart people on whom I can call for tough questions.  I try to reciprocate as much as I can, but luckily software developers tend to be very generous with their ideas.

So the conclusion I've come to after all this introspection is that what we know is not nearly as important as what we are capable of finding out. 

And that's encouraging for a guy like me who will never know it all.

Tuesday, August 19, 2008 4:59:22 PM (GMT Daylight Time, UTC+01:00)
# Monday, August 18, 2008

Microsoft recently released the ASP.Net Model View Controller framework (ASP.Net MVC).  It is currently available as Community Technology Preview 3 and can be downloaded at http://www.microsoft.com/downloads/details.aspx?FamilyId=92F2A8F0-9243-4697-8F9A-FCF6BC9F66AB&displaylang=en

This article describes how to create an ASP.Net MVC application and the code that is auto-generated for you.

Creating a new ASP.Net MVC project

1.       Open Visual Studio 2008.  Create a new project: Select File | New Project.  The New Project dialog displays.

    Figure 1

a.       Under Project Type, select Visual Basic\Web or Visual C#\Web, depending on your language preference.

b.      Under Templates, select ASP.Net MVC Web Application.  This template was added when you installed the ASP.Net MVC preview.

c.       Provide an appropriate location and name for the project and solution.

d.      Click the OK button to create the project.

2.       One of the advantages of an ASP.Net MVC project is that the separation of most of the code from the user interface makes it easier to write unit tests.  Visual Studio encourages you to create unit tests for your new project by prompting you with the Create Unit Test Project dialog every time you create an MVC project.

    Figure 2

a.       If you wish, you can decline to create a Unit Test project or you can change the default project name.  Typically I do not change any defaults on this dialog.

b.      Click the OK button to create the Unit Test project.

The Folder structure of an MVC project

When you create a new MVC project, Visual Studio generates a couple of views and controllers.  If you understand how these work, you can use them to guide how you will create more views and controllers.

The solution contains two projects: an MVC project and a unit test project.

View the projects in Solution Explorer.  Select View | Solution Explorer.  The MVC project contains several folders.


    Figure 3

1.       The Content folder contains a stylesheet Site.css for this site.

2.       The Controllers folder is where you will store all your controller classes.  By default, this folder contains a single controller class - HomeController.cs.

3.       The Models folder is where you will store any model classes for your application.

4.       The Views folder contains a subfolder for each view in your application.  By default, there are two subfolders: Home and Shared. 

a.       The Shared subfolder contains a master page for the site because it is shared by multiple web pages.  Any other UI elements shared by the site belong in this folder.

b.      The Home folder contains two pages: About.aspx and Index.aspx. 

5.       As with most web applications, the root folder of this project contains a Global.asax file and a Web.Config file, which contain setup and configuration information for the application as a whole.

The Files and Folders of an MVC project

Open Global.asax and view the code.  Notice that the Application_Start method (which fires once, at the startup of the web application) contains a call to the RegisterRoutes method. The RegisterRoutes method tells the MVC framework how to interpret a URL. 

public static void RegisterRoutes(RouteCollection routes)
{
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

    routes.MapRoute(
        "Default",                                              // Route name
        "{controller}/{action}/{id}",                           // URL with parameters
        new { controller = "Home", action = "Index", id = "" }  // Parameter defaults
    );
}

The routes.MapRoute method accomplishes this.  In this case, a “Default” route collection is created that interprets a URL with syntax like “{controller}/{action}/{id}”. 

·         The first part of the URL specifies the controller to use.  MVC looks in the Controllers folder for a class that inherits from System.Web.Mvc.Controller with a name that matches the controller specified in the URL.

·         The second part of the URL specifies the action to take.  The action is the public method within this controller that will be called. 

·         The third part of the URL specifies an id to pass to the action method.  This can be used to further customize the action.  For example, we could use the id as a filter to dynamically look up a single row in a database.

The routes.MapRoute method also allows us to specify defaults if no controller, action, or id is specified in the URL.  If any of these are omitted from the URL, MVC will use the defaults specified in the third parameter of routes.MapRoute.  In this case the object new { controller = "Home", action = "Index", id = "" } tells MVC the following:

·         If no Controller is specified in the URL, assume the Home controller (i.e., look for a class named “HomeController.cs” in the Controllers folder).

·         If no Action is specified in the URL, assume the Index action (i.e., look for a public method “Index” in the HomeController class).

·         If no ID is specified in the URL, assume a blank ID (i.e., any code looking for an ID will retrieve an empty string).
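
To make the defaults concrete, here is a hedged sketch (the Product and Archive names below are hypothetical, not part of the generated project) of how a few URLs resolve under the “Default” route, along with an example of registering an additional route inside RegisterRoutes.

// Hedged sketch only: how sample URLs resolve under the "Default" route above.
//   /                  -> HomeController.Index()     (all three defaults apply)
//   /Home/About        -> HomeController.About()
//   /Product/Edit/5    -> ProductController.Edit(5)  (hypothetical controller)
//
// A more specific route could be added in RegisterRoutes.  It would normally be
// registered before the general "Default" route so the general pattern does not
// capture the URL first.
routes.MapRoute(
    "Archive",                                             // Route name
    "Archive/{year}/{month}",                              // URL with parameters
    new { controller = "Archive", action = "Index", year = "2008", month = "1" }
);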

Open the view files, Index.aspx and About.aspx, and notice that there is no code behind in either.  This is because ASP.Net MVC applications do not execute the page life cycle.  All the code for this application is in the controllers.  These view pages contain only visual elements.

Open the controller class HomeController.cs.  As we mentioned before, this class derives from the System.Web.Mvc.Controller class and it contains two methods: Index and About.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;

namespace TestMVC.Controllers
{
    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            ViewData["Title"] = "Home Page";
            ViewData["Message"] = "Welcome to ASP.NET MVC!";

            return View();
        }

        public ActionResult About()
        {
            ViewData["Title"] = "About Page";

            return View();
        }
    }
}

“HomeController” is the name of the class that implements the Home controller.  This is typical of how MVC works – developers follow naming conventions in order to tell the framework where to find the code to run.  In the case of controllers, we implement a controller by sub-classing the System.Web.Mvc.Controller class, naming this subclass “xxxController” (where xxx is the controller name) and placing that subclass in the Controllers folder of our MVC project.  If we wanted to create a controller named “David”, we would create a System.Web.Mvc.Controller subclass named “DavidController” and place it in the Controllers folder.  This approach is known as “convention over configuration”, meaning that the framework knows where to find code based on the names we use.
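
As a hedged sketch of that convention (the “David” controller below is purely illustrative and not part of the generated project), such a class would look roughly like this:

using System.Web.Mvc;

namespace TestMVC.Controllers
{
    // Reached by a URL such as /David or /David/Index under the default route.
    public class DavidController : Controller
    {
        public ActionResult Index()
        {
            ViewData["Title"] = "David";
            // By convention, this renders the view at Views\David\Index.aspx.
            return View();
        }
    }
}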

Let’s look closely at the Index method.  Recall that the method in the controller is the Action that is specified in the URL.  So the Index method will be called if the Index action is specified.

ViewData is a dictionary collection that is a property of every Controller object.  We can add or update items in this collection by syntax such as
ViewData["Title"] = "Home Page";

By placing items in this collection, we make them available to the view when it is called.

The view (remember this is the UI that the user will see) is returned from this method.  The following line returns the default view.
return View();

We know it is the default view because the statement did not specify the name of the view.  The default view has the same name as the Action called.  In this case, we are returning the Index view.  Once again, MVC uses conventions to determine where to find the view.  All views associated with a given controller are stored in a subfolder named for that controller beneath the Views folder.  In this case, we are using the Home controller, so we look for views in the Views\Home folder of the project.  The view itself is a file with the same name as the view and with an extension of “.aspx” or “.ascx”.  In this case, we are looking for the default view (Index) of the Home controller.  MVC renders the page Views\Home\Index.aspx for this view.  Again, the developer uses naming conventions to tell the framework where to find items.

Open Index.aspx.  Notice it displays the message stored in the ViewData dictionary by the controller.
<%= Html.Encode(ViewData["Message"]) %>

However, it contains no other code, because all logic is handled by the controller.

Conclusion

Creating a new ASP.Net MVC project is as easy as creating any other Visual Studio project.  Learning the paradigm that the MVC framework uses can be a challenge; but the samples created automatically with a new project can ease that learning curve.

In the next article, we will add a new controller and view to a project.

.Net | ASP.NET | MVC
Monday, August 18, 2008 4:37:58 PM (GMT Daylight Time, UTC+01:00)
# Sunday, August 17, 2008

The Model-View-Controller (MVC) design pattern has existed for years.  (http://heim.ifi.uio.no/~trygver/themes/mvc/mvc-index.html)  ASP.Net developers have been implementing it for years either through their own custom code or via third-party frameworks such as Monorail (http://www.castleproject.org/monorail/index.html).

Recently, Microsoft released the ASP.Net MVC framework to give web developers the option of using this design pattern without the need for a lot of “plumbing” code or the use of a third-party framework.

The Model-View-Controller design pattern splits an application into three distinct parts called (you guessed it) “Models”, “Views” and “Controllers”. 

A Model represents the stateful data in an application.  These are often represented as objects such as “Customer” and “Employee” that represent an abstract business object.  For persistent data, the Model may save and retrieve data to and from a database.  Public properties of these objects (for example, “LastName” or “HireDate”) represent their state at any given time.  The model objects have no visual representation and no knowledge of how they will be displayed on screen. 

A View is the application’s user interface (UI).  In a web application, this is the web page that the user sees and clicks and interacts with directly.  The View can display data but it has no knowledge of where the data it displays comes from.

A Controller is the brains of your application.  It links the Model to the View.  It handles communication between the other two parts of the application.  It is smart enough to detect when data needs to be retrieved (from the model) and refreshed (in the view).  It sends updated data from the view back to the model so that the model can persist it. 

Below is Trygve M. H. Reenskaug’s diagram of the relationship between these three parts.

    Figure 1

This separation of the various concerns of the application encourages developers to create loosely-coupled components.

Much of the communication to the controller occurs by raising events in the view, which keeps the controller loosely coupled from the other parts.  However the real advantage of the MVC pattern is that, because they only communicate through the controller, the model and view are very loosely coupled.  This provides the following advantages to an MVC application.

  • Loosely-coupled applications have fewer dependencies, making it easier to switch the user interface or backend at a later time.
  • Applications can be tested more easily, because so little code is in the view.  We can test our code by writing unit tests against the controller.
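
As a hedged sketch of what such a controller-level test might look like (it uses the CustomerController and List action built in the tutorials above; the exact result types changed between MVC previews, so treat this as an outline rather than guaranteed Preview 3 code):

using System.Collections.Generic;
using System.Web.Mvc;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using TestMVC.Controllers;
using TestMVC.Models;

[TestClass]
public class CustomerControllerTests
{
    [TestMethod]
    public void List_ReturnsBothCustomers()
    {
        // No web server and no browser: the controller is exercised directly.
        var controller = new CustomerController();

        var result = controller.List() as ViewResult;

        Assert.IsNotNull(result);
        var customers = (List<Customer>)result.ViewData.Model;
        Assert.AreEqual(2, customers.Count);
    }
}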

In the next article, we will look at how to create a Microsoft ASP.Net MVC application.

.Net | ASP.NET | MVC
Sunday, August 17, 2008 2:03:21 PM (GMT Daylight Time, UTC+01:00)
# Friday, August 15, 2008

Microsoft Visual Studio Team System 2008 Database Edition (aka “Data Dude”) provides tools for managing and deploying SQL Server databases. 

In this article, we will discuss how to migrate data from one database to another.   Data Dude provides the Data Compare tool for this purpose.

In order to use the Data Compare tool, the following conditions must be true

1.       Data exists in a source table.  You want to migrate that data to a table of the same name in a different database.

2.       Both tables must have the same structure.

3.       Both tables must have a primary key to uniquely identify rows.

Follow the steps below to migrate data with the Data Compare tool.

1.       Launch Visual Studio 2008

2.       Select Data | Data Compare | New Data Compare.  The New Data Comparison dialog displays

  Figure 1

a.       When migrating data from a table in one database to another, the database you intend to update is known as the “Target Database”.  The other database is known as the “Source Database”.  In the New Data Comparison dialog, select the Source Database and Target Database connections.  If you have not yet created connections to these databases in Visual Studio, you can click the New Connection button to create them.

b.      The New Data Comparison dialog contains checkboxes that allow you to specify which rows you want to see and compare.  I don’t usually change these (they are all checked by default) but it may speed up the process to clear the Identical Records checkbox.

c.       Click the Next button to advance to the second screen of the wizard.

    Figure 2

On this screen, you can choose which tables to compare.  Usually I am only interested in one or two tables, so I clear the rest of the checkboxes.  If I have a million rows in my customer table and I’m not interested in migrating any of those rows, I can save a lot of processing time by un-checking the customer table.

d.      Click the Finish button to display the Data Compare window.

3.       The Data Compare window consists of two panes: the Object List on top and the Record Details at the bottom.

    Figure 3

a.       The object list displays each table or view as a single row with columns summarizing the number of new, removed, changed and identical rows.  Rows are matched on their primary key.

b.      Click a row in the object list to display details in the record details pane.  Here you can click a tab to view the rows that are new, missing or changed.  Checking the checkbox next to a record flags it to the Data Compare tool, meaning you want to update the target database to match the same row in the source database.  This may result in an INSERT, UPDATE, or DELETE statement depending on the tab on which the record is listed. 

c.       For a record to be flagged for update, both the table and the record must be checked.

4.       After checking all the rows you wish to update, click the Write Updates toolbar button to commit your changes to the target database. 

5.       Alternatively, you can click the Export To Editor toolbar button to generate a SQL script that you can run in the SQL Server query editor.  This method requires an extra step but has the following advantages

a.       You can modify the script before running it.

b.      You can send the script to someone else to run.

c.       You can view the script to learn what Data Dude is doing.  It’s interesting to note that constraints on each table are dropped before copying data, then created after the data is copied.  This speeds up the process.  Also, note the use of transactions to prevent incomplete data copies.  Below is a sample script updating data in one table.
/*
This script was created by Visual Studio on 8/15/2008 at 8:57 AM.
Run this script on dgiard.Test_QA.dbo to make it the same as dgiard.Test_Dev.dbo.
This script performs its actions in the following order:
1. Disable foreign-key constraints.
2. Perform DELETE commands.
3. Perform UPDATE commands.
4. Perform INSERT commands.
5. Re-enable foreign-key constraints.
Please back up your target database before running this script.
*/

SET NUMERIC_ROUNDABORT OFF
GO
SET XACT_ABORT, ANSI_PADDING, ANSI_WARNINGS, CONCAT_NULL_YIELDS_NULL, ARITHABORT, QUOTED_IDENTIFIER, ANSI_NULLS ON
GO
/*Pointer used for text / image updates. This might not be needed, but is declared here just in case*/
DECLARE @pv binary(16)
BEGIN TRANSACTION
ALTER TABLE [dbo].[OrderDetails] DROP CONSTRAINT [FK_OrderDetails_Orders]
ALTER TABLE [dbo].[OrderDetails] DROP CONSTRAINT [FK_OrderDetails_Products]
ALTER TABLE [dbo].[Orders] DROP CONSTRAINT [FK_Orders_Customers]
DELETE FROM [dbo].[ProductTypes] WHERE [ProductTypeID]=N'5646953f-7b89-4862-bcf3-bf53450d28bb'
INSERT INTO [dbo].[ProductTypes] ([ProductTypeID], [ProductTypeName]) VALUES (N'7beb0d99-d034-41b9-bbf7-f9cdcdbedc30', N'Furniture')
INSERT INTO [dbo].[ProductTypes] ([ProductTypeID], [ProductTypeName]) VALUES (N'abc19a14-5968-4c5f-9f0f-4debc034cb90', N'Hardware')
INSERT INTO [dbo].[ProductTypes] ([ProductTypeID], [ProductTypeName]) VALUES (N'b9e446ed-eeb1-4334-b191-c70a55ef1a05', N'Books')
ALTER TABLE [dbo].[OrderDetails] ADD CONSTRAINT [FK_OrderDetails_Orders] FOREIGN KEY ([OrderID]) REFERENCES [dbo].[Orders] ([OrderID])
ALTER TABLE [dbo].[OrderDetails] ADD CONSTRAINT [FK_OrderDetails_Products] FOREIGN KEY ([ProductID]) REFERENCES [dbo].[Products] ([ProductID])
ALTER TABLE [dbo].[Orders] ADD CONSTRAINT [FK_Orders_Customers] FOREIGN KEY ([CustomerID]) REFERENCES [dbo].[Customers] ([CustID])
COMMIT TRANSACTION

The Data Compare tool is a simple tool for accomplishing a useful task.  Since I discovered it, it has saved me a lot of time setting up new data environments.

 

.Net | SQL Server | VSTS
Friday, August 15, 2008 2:02:53 PM (GMT Daylight Time, UTC+01:00)
# Thursday, August 14, 2008

Writing Unit Tests is an essential step in developing robust, maintainable code.  Unit Tests increase quality and mitigate the risk of future code changes.  However, relatively few developers take the time to write unit tests for their stored procedures.  The primary reason for this is that few tools exist to test stored procedures.

Microsoft Visual Studio Team System 2008 Database Edition (aka “Data Dude”) provides tools to help developers write unit tests against SQL Server stored procedures.  The tool integrates with MSTest, which is a testing framework many developers are already using for their other unit tests.

In order to write unit tests for your stored procedures, those stored procedures must be in a database project.  For information on how to create a database project from a SQL Server database see: http://www.davidgiard.com/2008/08/11/DataDudeTutorial1CreatingADatabaseProject.aspx

This document describes how to create a database unit test.

1.       Launch Visual Studio and open your Database Project.

2.       Open the Schema View.  Select View | Schema View.

3.       Right-click a stored procedure and select Create Unit Test from the context menu.  The Create Unit Tests dialog displays.

    Figure 1

4.       Check the checkboxes next to all the stored procedures for which you wish to create unit tests.  Select the .Net language (Visual Basic .Net or C#) in which you want the automatic code to be generated.  You won’t be modifying this code so it isn’t that important, but I tend to keep all my code in the same language, so you may as well choose your favorite language here.  Enter a meaningful name for the Unit Test Project and class.  I like to name my Unit Test projects the same as my database project, followed by “Tests” or “UnitTests”.  If this is a new Database Unit Test Project, the Database Unit Test Configuration dialog displays.

    Figure 2

5.       The Database Unit Test Configuration dialog allows you to specify what you want to occur when you run these unit tests.  The dialog is organized into the following sections.

a.       Database connections

i.      Execute unit tests using the following data connection
This is the database against which tests will run.  Typically I set this to my Development or QA database.

ii.      Use a secondary data connection to validate unit tests
You may specify a different database to validate the syntax of your unit tests and to verify that all the objects you refer to exist.  I imagine this might be useful if you are writing tests while disconnected from your testing database, but I never set this option.

b.      Deployment

i.      Automatically deploy the database project before unit tests are run
To save manual steps, you may wish to check this box and deploy the database project to the database each time you run your unit tests.  This slows down the testing step so I do not select this option.  I prefer to deploy my changes once; then run my unit tests – sometimes several times.

c.       Database state

i.      Generate test data before unit tests are run
It is often useful to populate your database with some test data prior to your test run.  Use this button to do this.

6.       After creating your unit tests, you need to modify each one and specify what you are testing.  Open the Solution Explorer (View | Solution Explorer).

7.       Double-click the unit test class to open it in the unit test designer. 

    Figure 2

8.       The Unit Test Designer contains some controls and two panes as described below. 

a.       A class can contain multiple tests.  The first control is a dropdown that allows you to select which test you are designing.

b.      To the right of the Test Name dropdown is another dropdown that allows you to specify what part of the test you are writing.  You can choose between the test itself, the “Pre-test” (which runs before the test is executed) and the “Post-test” (which runs after the test has completed – successfully or unsuccessfully).

   Figure 3

c.       Further to the right are three buttons that allow you to add a new test or to delete or rename the currently active test.

d.      Below the controls is the test editor.  This is where you will write your test.  You will write your test in T-SQL and Data Dude provides some stub code to get you started.  Write SQL statements that call your stored procedure and return one or more results.

e.      Below the test editor is the Test Conditions pane.  It is here that you enter your assertions. 

    Figure 4

i.      You can test for a given result set having 0 rows, 1 or more rows, or an exact number of rows. 

ii.      You can also test whether a specific column and row in a given result set evaluates to a given value. 

iii.      Click the “+” button to add new assertions.  Highlight an existing assertion row and edit it, or click the “x” button to remove the assertion. 

iv.      Use the Properties window to modify properties of the assertion.  Many assertions are based on a given resultset.  When I first started writing unit tests, I found it difficult to determine which resultset was which.  Basically, any line in your SQL script that begins with the word “SELECT” creates a resultset.  Each resultset is numbered, beginning with 1, in the order it is created in your script.  I sometimes find it useful to copy the SQL code, paste it into a SQL Server Management Studio query window, and run it.  Each resultset then appears in a separate grid in the Results pane.  Looking at these grids allows me to more easily see a sample result set and the order in which the resultsets are created.

f.        You run your database unit tests the same way you run any MSTest unit test.  One way to run the tests is to open the Test List Editor (Test | Windows | Test List Window), check the tests you want to run, and click the Run Checked Tests toolbar button.  Tests in which all assertions prove true pass; all others fail.

Using Data Dude, you can extend your unit tests to cover your database objects and, therefore, improve the overall quality and maintainability of your code.

.Net | SQL Server | VSTS
Thursday, August 14, 2008 2:56:47 PM (GMT Daylight Time, UTC+01:00)
# Wednesday, August 13, 2008

Microsoft Visual Studio Team System 2008 Database Edition (aka “Data Dude”) provides tools for managing and deploying SQL Server databases. 

In our last tutorial, we described how a database developer would use the Schema Compare tool to update a database project with changes to a SQL Server database.

This article describes how to use the Schema Compare tool to push those changes out to a different SQL Server database.  There are two scenarios where you would do this. 

In Scenario 1, a developer has a local copy of the development database and wishes to get the latest updates to the database. 

In Scenario 2, a database administrator (DBA) or build master is charged with migrating database changes from one environment to the next.  Just as .Net and web code gets regularly migrated from a development environment to a QA or Production environment, database object code must also be migrated, and that migration generally must be kept in sync with all code that depends on those database objects.

We start this process by launching Visual Studio and opening the database project.  If a source code repository such as TFS is used, we need to get the latest code from the repository.

The database to which we wish to write the changes is known to Data Dude as the “target database”.  We need to make sure that a connection exists in Visual Studio to the target database.  This is a one-time step and you can use the Server Explorer (View | Server Explorer) to create a connection.

The following steps describe how to propagate changes to the database.

1.       Launch Visual Studio and open the database project.  Get the latest source code from your source code repository.

2.       From the Visual Studio menu, select Data | Schema Compare | New Schema Compare.  The New Schema Compare dialog displays.

    Figure 1

3.       Under Source Schema, select the Project radio button and select your database project from the dropdown list.

4.       Under Target Schema, select the Database radio button and select the connection to your database from dropdown list.

5.       Click the OK button to display the Schema Compare window.

    Figure 2

6.       The Schema Compare window lists every object that exists in either the database or the database project.  The objects are grouped in folders by object type (Tables, views, stored procedures, etc.)  You can expand or collapse a folder to view or hide objects of that type.  The important column is “Update Action” which describes what will happen if you write the updates to the target.

a.       Objects that exist in the source (the project) but not in the target (the database) were likely recently added after the last synchronization.  By default, the Update Action will be “Create” meaning the object will be created in the target database.

b.      Objects that exist in both the source and the target will have an Update Action of “Update” if they have been modified in the database since the last synchronization or “Skip” if they have not.

c.       Objects that exist in the destination (the database) but not in the source (the project) were likely dropped after the last synchronization. By default, the Update Action will be “Drop” meaning the object will be removed from the database.

7.       On a database with many objects, it is useful to view only the objects that have changed since the last synchronization.  To do this, click the Filter toolbar button and select Non Skip Objects. 

    Figure 3

8.       If you wish, you can modify the Update Action on objects by selecting the dropdown in the “Update Action” column.  Some actions are grayed out because Data Dude will not allow you to perform any action that would violate referential integrity rules. 

9.       After you have set the “Update Action” of every object appropriately, you have a couple options.

a.       You can migrate your changes immediately to the target database by clicking the “Write Updates” toolbar button.  Click Yes at the confirmation to write the updates to the database project.

b.      Alternatively, you can export your changes to a SQL script by clicking the Export To Editor toolbar button.  This will create a single text file containing SQL script that you can run from a query window of SQL Server Management Studio.  This is useful if you need to make changes to the script prior to executing.  I have used this technique when my database contains views or stored procedures that refer to remote servers and I want to modify the name of the server before migrating the object.

Alternatively, you can deploy changes from a database project to a database by “Deploying” the project (select Build | Deploy Solution).  This deploys your changes using the settings found on the Build tab of the project properties page.  This method requires fewer steps, but it is less flexible than the method described above.  In particular, it does not allow you to select which objects are deployed or export and modify the script of database changes.

In the next article, we will discuss how to use Data Dude to write Unit Tests against SQL Server stored procedures.

.Net | SQL Server | VSTS
Wednesday, August 13, 2008 12:46:54 PM (GMT Daylight Time, UTC+01:00)
# Tuesday, August 12, 2008

Microsoft Visual Studio Team System 2008 Database Edition (aka “Data Dude”) provides tools for managing and deploying SQL Server databases. 

In our last tutorial, we described how to create a new database project based on an existing SQL Server database.  As the source database changes, you will want to update the database project to reflect those changes.  This article describes how to use the Schema Compare tool to import database schema changes into your database project.  The steps in this article are typically performed by a database administrator (DBA) or database developer who is charged with creating tables, views, functions and stored procedures.

The Schema Compare tool can be used to display and manage differences between two databases, between two database projects, or between a database and a database project.  Most of the time, I use it to compare a database with a database project. 

After making a change to a database schema (for example, adding a new table or adding a new column to a table), use the Schema Compare tool as described below to update an existing database project with these changes.

1.       Launch Visual Studio and open your database project.  (For information on how to create a database project from a SQL Server database, see http://www.davidgiard.com/2008/08/11/DataDudeTutorial1CreatingADatabaseProject.aspx )

2.       From the Visual Studio menu, select Data | Schema Compare | New Schema Compare.  The New Schema Compare dialog displays.

    Figure 1

3.       Under Source Schema, select the Database radio button and select the connection to your database from the dropdown list.

4.       Under Target Schema, select the Project radio button and select your database project from the dropdown list.

5.       Click the OK button to display the Schema Compare window.

    Figure 2

6.       The Schema Compare window lists every object that exists in either the database or the database project.  The objects are grouped in folders by object type (tables, views, stored procedures, etc.).  You can expand or collapse a folder to view or hide objects of that type.  The most important column is “Update Action”, which describes what will happen if you write the updates to the target.

a.       Objects that exist in the source (the database) but not in the target (the project) were likely added to the database after the last synchronization.  By default, the Update Action will be “Create”, meaning the object will be created in the database project.

b.      Objects that exist in both the source and the target will have an Update Action of “Update” if they have been modified in the database since the last synchronization, or “Skip” if they have not.

c.       Objects that exist in the target (the project) but not in the source (the database) were likely dropped from the database after the last synchronization.  By default, the Update Action will be “Drop”, meaning the object will be removed from the database project.

7.       If you are updating your database project frequently, most of the objects will be unchanged and marked “Skip”.  On a database with many objects, it is useful to view only the objects that have changed since the last synchronization.  To do this, click the Filter toolbar button and select Non Skip Objects.

    Figure 3

8.       At this point, you can view differences and you may wish to modify the Update Action of some objects.

a.       If you click an object row in the Schema Compare window, the SQL definitions of both the source and target versions appear in the Object Definition window.  Any differences between the two versions are highlighted (changed lines in darker blue; new lines in darker green).

b.      If you like, you can modify the Update Action of any object by selecting from the dropdown in the “Update Action” column.  Some actions are grayed out because Data Dude will not allow you to perform any action that would violate referential integrity rules.  If several developers are sharing the same development database, you may wish to skip those objects on which you are not working.  You may also decide that some objects are ready to share with the rest of the team while others are not fully tested and should be skipped.  It is possible to change the Update Action of every object of a given type by right-clicking the type folder and selecting the desired action to apply to all objects of that type.

9.       After you have set the “Update Action” of every object appropriately, you can migrate your changes to the database project by clicking the Write Updates toolbar button.  Click Yes at the confirmation to write the updates to the database project.

    Figure 4

10.   If you are using a source control repository, such as TFS, you will want to check in your changes.

In the next article, we will discuss how to use the Schema Compare tool to write changes to a new database environment.

.Net | SQL Server | VSTS
Tuesday, August 12, 2008 1:40:04 PM (GMT Daylight Time, UTC+01:00)
# Monday, August 11, 2008

Microsoft Visual Studio Team System 2008 Database Edition (aka “Data Dude”) provides tools for managing and deploying SQL Server databases.  In order to use Data Dude to manage an existing SQL Server database, the first step is to create a database project. 

There are a couple of key points you will need to know before using Data Dude.

1.       The current version of Data Dude only works on SQL Server 2000 and SQL Server 2005.  Visual Studio 2008 Service Pack 1 should provide support for SQL Server 2008.  I will describe an example using SQL Server 2005.

2.       The validation engine in Data Dude requires that you install either SQL Server or SQL Express on the same machine on which Data Dude is installed. 

3.       You must grant “Create database” rights in this database engine to the currently logged-in user.

Now, let’s discuss how to create a database project to manage an existing SQL Server 2005 database.

1.       Open Visual Studio. 

2.       Select File | New Project…  The New Project dialog displays.

3.       Under Project Type, select Database Projects\Microsoft SQL Server.

4.       Under Templates, select SQL Server 2005 Wizard.

5.       Enter a meaningful name and location for this project. 
Typically, my databases have names like “AdventureWorks_Dev” and “AdventureWorks_QA” which describe both the data and the environment to which the data belongs.  Because a single database project is used for all environments, I name my database project to describe the data and follow it with “DB” to make it obvious it is a database project.  In the above example, I would name my database project “AdventureWorksDb”.  In this exercise, I’ll create a project named “TestDB”.
    Figure 1: New Project dialog

6.       The New Database Project Wizard displays with the Welcome screen active.

7.       At the Welcome screen, click the Next button to advance to the Project Properties screen.
    Figure 2: Project Properties screen

8.       I almost never change the options on the Project Properties screen.   If my database contains any stored procedures or functions written in C# or VB.Net, I will check the Enable SQLCLR checkbox. 

9.       Click the Next button to advance to the Set Database Options screen.
    Figure 3: Set Database Options screen

10.   The options on the Set Database Options screen correspond to the settings you will find in SQL Server Management Studio when you right-click a database and select Properties.  The defaults in the database project wizard are also the defaults in SQL Server.  Since I seldom override these defaults in SQL Server, there is usually no reason to change them on this screen.

11.   Click the Next button to advance to the Import Database Schema screen.
    Figure 4: Import Database Schema screen

12.   On the Import Database Schema screen, check the Import Existing Schema checkbox.  This enables the Source database connection dropdown.  If you have already created a connection to your database, select it from the dropdown.  If you have not yet created a connection, click the New Connection button to create one now.  The process for creating a database connection in Visual Studio hasn’t changed for several versions of the product, so I won’t repeat it here.  However, it is worth noting that, although Data Dude requires that you have a local installation of SQL Server, the database you connect to here can be located on any server to which you have access.  I usually connect to the Development database because this is the first database I create for an application.

13.   Click the Next button to advance to the Configure Build and Deploy screen.
    Figure 5: Configure Build and Deploy screen

14.   The Configure Build and Deploy screen contains settings that will take effect when you “deploy” your database project.  Deploying a database project writes changes to the schema of a target database (specified in the Target database name field of this screen) and is accomplished by selecting the menu options Build | Deploy with the database project open and selected.  Deploying is most useful when each developer has his own copy of the development database and needs a quick way to synchronize his schema with a master copy.  

15.   Click the Finish button to create the database project initialized with schema objects found in the source database and with the settings you chose in the wizard screens.

16.   After Visual Studio finishes creating the database project, view the objects in the Solution Explorer (select View | Solution Explorer).  You should see a "Schema Objects" folder in the project containing a subfolder for each type of database object.  Open the "Tables" subfolder and you will see files containing scripts for each table in your database.  Double-click one of these script files to see the SQL code generated for you.
    Figure 6: Database project in Solution Explorer

17. If you use a source control repository, such as TFS, you will want to check this project into the repository to make it easier to share with others.

As you can see, when you use the wizard to create your project, most of the work is done for you.  You are able to change the default settings, but in most cases this is not necessary.  Often, the only change I make on the wizard screens is when I select a database connection.

In the next article, we will discuss how to use the Schema Compare tool to bring schema changes into or out of your database project.

.Net | SQL Server | VSTS
Monday, August 11, 2008 3:12:22 PM (GMT Daylight Time, UTC+01:00)
# Sunday, August 10, 2008

Visual Studio Team System 2008 Database Edition is a mouthful to say, so a lot of people affectionately call it “Data Dude”.

Data Dude provides a set of tools integrated into Visual Studio that assist developers in managing and deploying SQL Server database objects.

There are four tools in this product that I have found particularly useful: the Database Project; the Schema Compare tool; the Data Compare Tool; and Database Unit Tests.

A Database Project is a Visual Studio project just as a class library or ASP.Net web project is.  However, instead of holding .Net source code, a Database Project holds the source code for database objects, such as tables, views and stored procedures.  This code is typically written in SQL Data Definition Language (DDL).  Storing this code in a Database Project makes it easier to check it into a source code repository such as Team Foundation Server (TFS); and simplifies the process of migrating database objects to other environments.

The Schema Compare tool is most useful when comparing a database with a Visual Studio Database Project.  Developers can use this tool after adding, modifying or deleting objects from a database in order to propagate those changes to a Database Project.  Later, a Database Administrator (DBA) can compare the Database Project to a different database to see what objects have been added, dropped or modified since the last compare.  The DBA can then deploy those changes to the other database.  This is useful for migrating data objects from one environment to another, for example when moving code changes from a Development database to a QA or Production database.

The Data Compare tool is another aid for migrating from one database environment to the next.  It facilitates the migration of records in a given table from one database to another.  The table must have the same structure in both the source and destination databases.  I use it when I want to seed values into lookup tables, such as a list of states or a list of valid customer types.

Unit tests have increased in popularity over the last few years as developers have come to realize their importance in maintaining robust, error-free code.  But unit testing stored procedures is still relatively rare, even though code in stored procedures is no less important than code in .Net assemblies.  Data Dude provides the ability to write unit tests for stored procedures using the same testing framework (MS Test) you use for unit tests of .Net code.  The tests work the same as your other unit tests – you write code and assert what you expect to be true.  Each test passes only if all its assertions are true at runtime.  The only difference is that your test code is written in T-SQL, instead of C# or Visual Basic.Net.
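To make the idea concrete, here is a rough sketch of the kind of check such a test performs.  This is not the test class that Data Dude generates for you (it wires up the T-SQL and its test conditions through a designer); it is a plain, hand-written MS Test method using ADO.NET, and the connection string and the dbo.GetCustomerCount procedure are invented for the example.

    // Illustrative only: a hand-rolled equivalent of a stored procedure unit test.
    // Data Dude generates its own test classes; this sketch just shows the intent.
    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class CustomerProcedureTests
    {
        // Hypothetical connection string and procedure name.
        private const string ConnectionString =
            "Data Source=.;Initial Catalog=TestDB;Integrated Security=True";

        [TestMethod]
        public void GetCustomerCount_ReturnsAtLeastOneCustomer()
        {
            using (SqlConnection connection = new SqlConnection(ConnectionString))
            using (SqlCommand command = new SqlCommand("dbo.GetCustomerCount", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                connection.Open();

                int count = (int)command.ExecuteScalar();

                // The assertion plays the same role as a Data Dude test condition.
                Assert.IsTrue(count >= 1, "Expected the procedure to return a count of at least 1.");
            }
        }
    }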

There are some limitations.  In order to use Data Dude, you must have either SQL Server 2005 or SQL Express installed locally on your development machine, and you (the logged-in user) must have "Create Database" rights on that local installation.  To my knowledge, Data Dude only works with SQL Server 2000 and 2005 databases.  Plans to integrate with SQL Server 2008 have been announced, but I don't know Microsoft's plans for other database engines.  I also occasionally find myself wishing Data Dude could accomplish its tasks more easily or in a more automated fashion.  I wish, for example, that I could specify that I always want to ignore database users and always want to migrate everything else when using the Schema Compare tool.  But overall, the tools in this product have increased my productivity significantly.  Nearly every application I write has a database element to it, and anything that helps me with database development, management and deployment improves the quality of my applications.

.Net | SQL Server | VSTS
Sunday, August 10, 2008 1:34:41 PM (GMT Daylight Time, UTC+01:00)
# Saturday, August 9, 2008

When applications service a large number of simultaneous users, the developer needs to take this into account and find ways to ease the application’s bottlenecks. 

One way to help speed up a stressed application is to load into memory resources that will be requested by multiple users.  Reading from memory is much faster than reading from a hard drive or a database, so this can significantly speed up an application. 

However, each computer contains a finite amount of memory, so there is a limit to how much data you can store there.

Microsoft Distributed Cache (code named "Velocity") attempts to address this problem.  It allows your code to store data in an in-memory cache and it allows that cache to be stored on multiple servers, thus increasing the amount of memory available for storage. 

Velocity even ships with a provider that allows you to store a web site's session state, making it possible to increase the amount of memory available to your session data.
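Because Velocity is still a CTP, its exact class names may change before release, so I won't reproduce its API here.  The sketch below instead shows the cache-aside pattern you would build on top of whatever Put and Get methods the cache client exposes: check the distributed cache first and fall back to the database only on a miss.  The ICache interface is a stand-in for the real cache client, and the Product type and LoadProductFromDatabase method are invented for the example.

    using System;

    // Stand-in for the distributed cache client; Velocity's real class names may differ.
    public interface ICache
    {
        object Get(string key);
        void Put(string key, object value);
    }

    public class ProductRepository
    {
        private readonly ICache _cache;

        public ProductRepository(ICache cache)
        {
            _cache = cache;
        }

        public Product GetProduct(string productId)
        {
            // Reading from memory is far cheaper than a database round trip.
            Product product = _cache.Get(productId) as Product;
            if (product == null)
            {
                product = LoadProductFromDatabase(productId);
                _cache.Put(productId, product);   // now available to every server in the cache cluster
            }
            return product;
        }

        private Product LoadProductFromDatabase(string productId)
        {
            // Hypothetical data access, omitted for brevity.
            return new Product { Id = productId };
        }
    }

    [Serializable]
    public class Product
    {
        public string Id { get; set; }
    }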

Microsoft has not yet published a release date for Velocity, but it is available as a Community Technology Preview (CTP).  You can download these bits and read more about it at http://code.msdn.microsoft.com/velocity.

The current CTP is not production ready - I had trouble keeping the service running on my Vista machine - but the technology shows enough promise that it is worth checking out.  When the glitches are fixed, this will make .Net an even more appealing choice for developing enterprise applications.

Saturday, August 9, 2008 2:38:56 PM (GMT Daylight Time, UTC+01:00)
# Friday, August 8, 2008

I am a recent convert to Agile methodologies. 

Until last year, I worked for a large consulting company that had established a solid reputation using a waterfall approach to deliver solutions.

My current employer is committed to the agile methodology SCRUM.  They have developed their own variation of SCRUM and several consultants here have even made a name for themselves delivering presentations on this methodology to customers and at conferences.

So it's only natural that I have been engrossed in SCRUM since joining.  Nine months of agile software development have sold me on its benefits. 

The biggest advantage I see to SCRUM is the short delivery schedule pushed by the sprints.  For those who don’t know, a sprint is a set of features scheduled for delivery in a short period of time (typically 1-4 weeks).  A sprint forces (or at least encourages) frequent delivery of working software and provides a great feedback loop to the development team. 

When users can actually see, touch and use functioning software, they don't just get value more quickly - they are able to evaluate it more quickly and provide valuable feedback.  That feedback might be a rethinking of original assumptions; it might be new ideas sparked by using the software; it might be a reshuffling of priorities, or it might be a clarification of some miscommunication between the users and the developers.  It will probably be several of these things. 
That miscommunication issue is one that occurs far too often on software projects.  Catching these misunderstandings early in the life of an application can save a huge amount of time and money.  We all know that the cost of making a change to software goes up exponentially the later that change is made.

By delivering something useable to the customer several times a month, we are providing value to the customer in a timely manner.  At best, this value comes in the form of software that enhances their ability to perform their job.  At worst, we provide something they didn't ask for.  But this worst-case scenario also adds value because we can use the delivery to clarify the misunderstandings and poor assumptions that leaked through the design.

I think back to the last waterfall project in which I was involved.  Our team was charged with designing and building an integration layer between an e-commerce web application (that was being designed at the same time) and dozens of backend systems (that were in a state of constant flux).  We spent months designing this integration layer.  During these months, the systems with which we planned to integrate changed dozens of times.  These changes included adding or removing fields; placing a web service in front of an existing interface; and completely redesigning and rewriting backend systems.

Each of these changes forced us to re-examine all the design work we had done and to modify all our documents to match the changed requirements.  In some cases, we had to start our design over from scratch.

An agile approach would have helped immensely.  Instead of designing everything completely before we started building anything, we could have minimized changes by designing, building, and deploying the integration service to one back-end system at a time.  By selecting a single integration point, we might have been able to deliver a single piece of functionality quickly while the other backend systems stabilized.


I'm not going to suggest that agile is the appropriate methodology for every software project or that no other methodologies have value.  My former employer delivered countless successful projects using waterfall techniques. 

But it pays to recognize when agile will help your project and it is definitely a useful tool for any developer, architect or project manager to have in his or her toolbox.

Friday, August 8, 2008 3:58:27 PM (GMT Daylight Time, UTC+01:00)
# Thursday, August 7, 2008

Microsoft recently released the Managed Extensibility Framework (MEF) which allows developers to add hooks into their applications so that the application can be extended at runtime.

Using MEF is a two-step process.  The first step is performed by the application developer, who adds attributes or code at defined points in the application.  At these points, the application searches for extensible objects and adds or calls them at runtime.
The second step is performed by third-party developers, who use the MEF application programming interface (API) to mark classes in an “extension” assembly as extensible so that they will be discoverable by the above-mentioned applications.

The two steps are loosely-coupled, meaning neither the application nor the extension assembly needs to know anything about the other.  We don't even need to set a reference from one project to another in order to call across these boundaries.
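As a simplified sketch of these two steps, the code below shows an extension class exported from one assembly and a host that discovers it at runtime.  Note that I am using the attribute, catalog and container names from the System.ComponentModel.Composition API as it eventually shipped (Export, ImportMany, DirectoryCatalog, CompositionContainer, ComposeParts); the names in the current CTP differ somewhat, and the IMessagePlugin contract and the "Extensions" folder are invented for the example.

    using System;
    using System.Collections.Generic;
    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;
    using System.IO;

    // Shared contract known to both the host and the extension assemblies.
    public interface IMessagePlugin
    {
        string GetMessage();
    }

    // Step two: a third-party developer marks a class as discoverable.
    // (Shown in the same listing for brevity; normally it lives in a separate
    // "extension" assembly that the host holds no reference to.)
    [Export(typeof(IMessagePlugin))]
    public class GreetingPlugin : IMessagePlugin
    {
        public string GetMessage() { return "Hello from an extension."; }
    }

    // Step one: the application developer declares a point where extensions are pulled in.
    public class PluginHost
    {
        [ImportMany(typeof(IMessagePlugin))]
        public IEnumerable<IMessagePlugin> Plugins { get; set; }

        public void Run()
        {
            Directory.CreateDirectory("Extensions");   // well-known folder scanned for extension assemblies

            // Look for exports in this assembly and in any assembly dropped into the folder.
            var catalog = new AggregateCatalog(
                new AssemblyCatalog(typeof(PluginHost).Assembly),
                new DirectoryCatalog("Extensions"));

            using (var container = new CompositionContainer(catalog))
            {
                container.ComposeParts(this);   // fills the Plugins property

                foreach (IMessagePlugin plugin in Plugins)
                {
                    Console.WriteLine(plugin.GetMessage());
                }
            }
        }
    }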

I can think of two scenarios where this technology would be useful.

In scenario 1, an independent software vendor develops and sells a package with many pluggable modules.  Customers may choose to buy and install one module or all the modules in the package.  For example, an Accounting package may offer General Ledger, Accounts Payable, Accounts Receivable, Payroll and Reporting modules, but not all users will want to pay for every module.  By using MEF, the software could search a well-known directory for any module assemblies (flagged as extensions by MEF) and add to the menu only those that are installed.  With MEF in place, more modules could be added at a later time with no recompiling.

In scenario 2, developers create and deploy an application with a given set of functionality and create points at which other developers are allowed to extend the application using MEF.  By publishing these extendable points, they can allow developers to add functionality to the application without modifying or overriding the original source code.  This is a much safer way of extending functionality.  Extensions could be anything from new business rules or workflow to additional UI elements on forms.

All the extensions happen at runtime and MEF gives developers the ability to add metadata to better describe their extension classes.  By querying this metadata, we can conditionally load only those extensions that meet expected criteria.  The best part of this feature is that we can query a class's metadata without actually loading that class into memory.  This can be a huge resource saving over similar methods, such as Reflection.
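Here is a rough sketch of that kind of metadata query, reusing the IMessagePlugin contract from the sketch above.  Again, the ExportMetadata attribute and the Lazy<T, TMetadata> import are the forms this feature took in the shipped API rather than in the current CTP, and the "Region" key is invented for the example.

    using System;
    using System.Collections.Generic;
    using System.ComponentModel.Composition;

    // An extension that attaches metadata to its export.
    [Export(typeof(IMessagePlugin))]
    [ExportMetadata("Region", "Payroll")]
    public class PayrollMessagePlugin : IMessagePlugin
    {
        public string GetMessage() { return "Payroll extension loaded."; }
    }

    public class SelectiveHost
    {
        // The metadata dictionary is readable without instantiating the plugin class.
        [ImportMany]
        public IEnumerable<Lazy<IMessagePlugin, IDictionary<string, object>>> Plugins { get; set; }

        public IEnumerable<IMessagePlugin> PayrollPluginsOnly()
        {
            foreach (var lazyPlugin in Plugins)
            {
                object region;
                if (lazyPlugin.Metadata.TryGetValue("Region", out region) &&
                    "Payroll".Equals(region))
                {
                    yield return lazyPlugin.Value;   // the extension class is only created here
                }
            }
        }
    }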

MEF is currently released as a Community Technology Preview (CTP), so the API is likely to change before its final release.  You can download the CTP and read more about it at http://code.msdn.microsoft.com/mef.  By learning it now, you can be prepared to add extensibility to your applications when MEF is fully released.

Thursday, August 7, 2008 8:37:42 PM (GMT Daylight Time, UTC+01:00)