Friday, May 30, 2008

Web Site vs. Web Application Project

In Visual Studio 2005 and later, there are some significant differences between a "web site" and a "web application project" (WAP).

Project structure
A web site has no project file. The "site" is simply the collection of files in the site's directory. Project/binary references and other configuration settings are stored in the web.config file (poor form in my opinion).

A web application project does have a project file and is treated essentially as a class library project. However, the Visual Studio template for a WAP provides some additional things, such as which item types are visible in the "Add New Item" dialog (web form, master page, user control, web.config, etc.) and debugging configuration such as the settings for the development web server or IIS.

Codebehind/Codefile attribute
In a WAP, the markup directive (@Page, @Control, etc.) contains the "CodeBehind" attribute. This attribute is actually meaningless to the ASP.NET runtime; it's a linking attribute used by Visual Studio to indicate which code-behind file belongs to the markup file.
In a site, the "CodeFile" attribute is used instead. This is similar to the "Src" attribute. (I've experimented with the two and can't find a significant difference between them.) It tells the ASP.NET runtime which source code file should be compiled together with the markup. This is what links a markup file to a code-behind file in the dynamic compilation model of web sites.
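For example, the two directives look like this (the file and class names here are hypothetical):

```aspx
<%-- WAP: CodeBehind is only a Visual Studio linking attribute --%>
<%@ Page Language="C#" CodeBehind="MyPage.aspx.cs" Inherits="MyProject.MyPage" %>

<%-- Web site: CodeFile tells the ASP.NET runtime what to compile with the markup --%>
<%@ Page Language="C#" CodeFile="MyPage.aspx.cs" Inherits="MyPage" %>
```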

In both a site and a WAP, the markup files (.aspx, .ascx, etc.) are dynamically compiled. (There is an exception, but that's an advanced topic.) All code files (including page code-behind) in a WAP are always precompiled. In a site, nothing is precompiled: the ASP.NET runtime compiles everything in the App_Code directory into one DLL, and each page gets compiled into its own DLL. This affects class scope.

Class scope
Only code in App_Code is available to all classes in a site (that's where you HAVE to put shared code). In a WAP - because it's precompiled - all page classes live together in the same assembly and can therefore see each other.
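As a sketch of what that means in practice (the class name here is hypothetical):

```csharp
// Hypothetical shared helper. In a web site, this file must be placed in
// App_Code so that every page class can see it; in a WAP it can live in
// any folder, since everything is compiled into one assembly.
public static class FormatHelper
{
    public static string Money(decimal value)
    {
        return value.ToString("C");
    }
}
```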

Perhaps the largest difference between the two is in how the namespaces are constructed.
In a WAP, all classes are created by default as members of the root namespace defined in the project (typically the project name). For example, in a project named "MyProject" a new page "MyPage" will have a fully qualified class name of "MyProject.MyPage". When you create subdirectories in the project, Visual Studio by default adds another namespace level for pages created in those directories. So if I create a folder "Admin" and another page "MyPage", I get a class name of "MyProject.Admin.MyPage".

In a site, all pages are part of the default root namespace for dynamically compiled pages: "ASP". When pages live in subdirectories, class names are created with underscores separating the path segments. In a web site, instead of "MyProject.Admin.MyPage" the page class would actually be "Admin_MyPage", and when it's dynamically compiled it becomes "ASP.Admin_MyPage".

Which to choose
It is important to choose the right project type. With the changes introduced in Visual Studio 2005, it is now much easier to work with either type of project (no more IIS integration, woohoo!). Being able to open a web site via FTP is very helpful for certain needs. For some, the web site model will be ample. It's great for tests or simple sites that aren't code intensive.

However, I have found that in professional development the WAP is the better choice. Because there is a project file "controlling" the project, it's easier to manage what is actually included in it, which helps control things such as which items go into the source control repository. In my case, having the project file is also necessary for the build system, as the project file provides the parameters for what to build for a given project.

Yes, using a WAP forces us to always precompile the application. On the down side, this makes updates more difficult because any other changes get rolled into a deployment; we can't just update one single page. However, this is good in several ways.

Simply put, production code should not be updated willy-nilly. We need to exercise a fair amount of control over what gets pushed to production, and the app should be regression tested by QA. Also, with a good build system and source control practices, you can deploy patches as necessary without including changes being made on a given application's main trunk. If you do need to make a change, there are ways to "patch" a single page by reverting it to the web site code file model.

Another benefit of using the WAP is that the project configuration is kept in the project file instead of in the web.config, where it really doesn't belong. This keeps the concerns (configuration of the actual app versus configuration of the project within Visual Studio) well separated.

Yet another good aspect of the WAP is that you can "see" all the classes in the project - they are all within the scope of the entire assembly. In some large projects with many developers and many pages that require query string arguments to function, I've used a technique for creating "strongly typed page URLs". Follow the link for more details, but in short: I create static page methods that return a properly formed URL. Using a managed method provides the opportunity to enforce required page parameters by using regular method arguments.
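A minimal sketch of the idea (the page, path, and parameter names here are made up):

```csharp
using System.Web;

// Hypothetical WAP page class. Callers never build the query string by
// hand; the required query string parameters become required method
// arguments, so the compiler enforces them.
public partial class UserDetailPage : System.Web.UI.Page
{
    public static string GetUrl(int userId, string returnUrl)
    {
        return string.Format(
            "~/Admin/UserDetail.aspx?userId={0}&returnUrl={1}",
            userId,
            HttpUtility.UrlEncode(returnUrl));
    }
}
```

Redirecting then becomes Response.Redirect(UserDetailPage.GetUrl(42, "~/Default.aspx")) - if a required parameter is missing, the code simply won't compile.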

This is all obviously very biased towards using the WAP. This is partially due to where ASP.NET development started: in 1.1, with the web project. In the interest of full disclosure, I haven't worked with the web site model enough to really speak fairly for it. However, between the little I've worked with it and what I've heard from other developers, for anything that isn't a trivial web site, the WAP is the way to go. The web site type is good in some cases, but as with any tool, it should be used where appropriate. Fortunately, Visual Studio has pretty good support for converting a web site to a web application project, so upgrading from a site is not terribly difficult.

Thursday, May 29, 2008

Mobilized Organization

LifeHack has a good article on staying organized using a mobile phone.

I have started using Google Calendar in this way a bit. I'm fortunate in that I have no commute to work and I'm generally either at work or at home, so I'm rarely away from a computer. Plus, I generally don't have that much going on that I need to schedule.

Good article though.

Wednesday, May 21, 2008

Continuous Monitoring & cool gizmos

I just listened to an interesting interview on Hanselminutes with Owen Rogers, one of the original developers of CCNet. They discussed continuous integration and continuous monitoring. Well worth a listen.

Get the show here

During the interview Scott and Owen mentioned a few technologies/products I hadn't heard of yet. One is Gumstix, which are super-micro Linux computers, literally the size of a stick of chewing gum. Another is Chumby, a wifi-connected, open Linux platform alarm clock on steroids. Some very cool stuff that I definitely need to learn more about (and of course will eventually succumb to purchasing).

Thursday, May 15, 2008

Progress - in bytes per second

Just over 10 years ago I had a good day connected to the internet. I was on a dial-up ISP getting a 57,600 bps connection speed. I was on for nearly 15 hours and received over 151 Megabytes! WOW! Before I disconnected I took a screen shot for posterity.

I recently downloaded some ISOs of Ubuntu Linux on my broadband cable connection. The download of about 524 Megabytes took maybe 10 minutes and maxed out at 1,027 KB/sec. Not too shabby for a sustained speed.

I just did a speed test and got an astounding 9,151 kb/sec! If my math is correct, that's an increase of 9,313,024 bps, or 16,168%!! I guess I shouldn't expect less from a 10-year gap. It's just too bad that I can't drive 161 times faster than I did in '98. (But I think gas prices are trying to keep up!)

I can't wait until fiber is competitively priced.

Ubuntu Adventures: WP/mysql/smbfs

Captain's Log: 14 May 2008

Tonight I continued my experiments with Linux.

I managed to install mysql-server, although I haven't gotten any databases set up yet. I also installed WordPress but don't have that running yet either.

The bigger achievement was getting the Linux box to see the file shares on my HP MediaVault NAS box. I found the HP instructions for doing this and had a go. I tried mounting it using NFS, but it didn't seem to want to do anything. So I ended up installing the Samba filesystem tools ("apt-get install smbfs"). Then I was able to mount using SMB. After finding instructions on how to set up the credentials bit of it, I was able to configure the /etc/fstab file to provide automatic mounting of the NAS shares. Very cool.
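For reference, the fstab line ended up looking something like this (the share name, mount point, and credentials path here are made up; the credentials file holds the username and password so they stay out of fstab):

```
//mediavault/share  /mnt/nas  smbfs  credentials=/etc/samba/nas-credentials,auto  0  0
```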

In retrospect, I think NFS might have worked. I realized after SMB failed the first time that the NetBIOS name of the box was resolving - just not to the NAS. I did a quick ping test and saw a reply, but I didn't look at the reply details long enough to realize what had happened. My ISP has recently started replying to all unresolved DNS names with some crummy parking page on their servers. It's screwed me up more than once; some of my machine names don't resolve like they used to.

One of my secondary goals (if possible) is to put the Subversion repositories on the NAS box instead of on the Linux server itself, so I have some level of hardware redundancy (the NAS is set up with a mirrored volume set at the moment). I think that by default Subversion uses Berkeley DB as the repository data store, but you can change it to use just the file system. If BDB can't be used over smbfs (which I suspect it can't), then I'll try a file-system-based repository, which hopefully will work. If neither works, then I guess I'll just have to create a cron job (another learning curve) to regularly back up the repo data stores to the NAS.

One step at a time.

Ubuntu Adventures: The beginning

Captain's Log: 5 May 2008

Mission: Install linux (yet again) and find an actual use for it. I've installed it several times before, but after finishing, just stared at the login prompt and thought to myself, "Well, now what do I do with it?" This time, I have some goals in mind:
  • Convert my source control system to subversion (I've tried subversion on WAMP on Windows Server 2003, but it hasn't worked yet. Good excuse to upgrade to Linux.)
  • Move my blog from blogger to my own server using WordPress.
  • Be able to actually claim to have some minuscule clue about a non-Microsoft OS.

The hardware:
  • Old Compaq desktop
  • Intel Celeron 500MHz
  • 128MB RAM
  • 250GB HD

I have another box (Pentium 4 - 550MHz; 384MB RAM) that is currently running Windows Server, hosting FTP and my current source repository (SourceGear Vault). However, if I find success with Subversion on the Linux installation, then I may decommission Windows and switch that box over to a fresh Linux setup.

I installed Ubuntu Linux server. I attempted to install 8.04 first, but it failed for unknown reasons. However, 6.10 (Edgy Eft) succeeded. During the installation I'm pretty sure I selected the LAMP installation option, but to be honest I might have done it wrong - I did it during a fly-by while chasing down my 2-year-old. Perhaps I didn't select the right option. Anyway, after installation I found that none of Apache, MySQL, or PHP was installed (at least I got the L part working).

After doing some searching I discovered the apt-get command. I ran it with some upgrade steps and it updated several packages and modules. I then used it to install Apache and PHP. Later I tried "apt-get install subversion" and it worked. I'm starting to like this!

Once subversion was installed I created a repository, then started playing with TortoiseSVN from my Windows desktop to put in my whole source tree. I'll likely blow away the whole repository once I figure out what I'm doing but I'm making progress.

It's taking some time to get used to the different style of system administration. I'm so used to all the Windows GUI tools for changing settings. However, I'm really liking the transparency and plain-text methodologies of Linux.

Friday, May 09, 2008

Wait, Wait... Busted

My wife and I attended a live performance of the NPR program Wait Wait... Don't Tell Me! It was very entertaining. The panelists were Charlie Pierce, Amy Dickinson, and Mo Rocca. The scheduled local guest was Governor Eliot Spitzer; however, as host Peter Sagal put it, "Something came up. (Then the governor paid $4,000 and it went down again.)" I'll leave it there. In the former governor's place was a large arrangement of flowers.

To my delight, the replacement guest was television celebrigeek Adam Savage from Discovery Channel's MythBusters! He connected in from a studio in California and was interviewed for a good 20 minutes. It would have been far cooler to have him there, but it was fun regardless.

Having worked in radio for many years, I am always interested to watch radio show production, and this performance was no exception. They have call-in contestants, a few sound effects, and quick-thinking participants that the producers and engineers have to work around. After the show completed, they spent 10 minutes going back through the show doing re-do takes where they needed to clean up introductions or whatever. Having a large (~2,600 people) live audience complicates things a bit as well.

Overall, it was a good time and a chance to get out of the house.

Saturday, May 03, 2008

Finding myself

I've struggled for many years with finding and/or creating a digital identity for myself. I've never had a catchy screen name or hacker name or handle or whatever you want to call it. I'm not creative in that way. The creativity I do possess is in solving tangible problems in both tactile and abstract domains. I enjoy handyman-type work, do woodworking as a hobby, and of course I work as a software developer, so I'm constantly coming up with solutions to technology challenges. But I'm simply no good at coming up with things out of thin air. That's why I don't dance, draw, or partake in other activities that I'd generally classify as visual art. It usually requires some form of inspiration from a greater force, which I lack. My inspiration comes from the problems that need to be solved. (I suppose this is probably true for most technologists.)

So anyway, I've found it rather difficult to come up with a name to use for my internet presence or for this blog. Once, my sister said, "You're the biggest geek dork I know." So, still lacking a name different from the one given to me while still tethered to my mother's womb, I went and registered a domain to that effect. It certainly fits my general self-classification as a geek and a dork, but it just feels a tad too sophomoric. I've tried a few names similar to those I've seen on others' blogs, but I hate the feeling of being a sheep just following the shepherds. But like I have stressed, I just don't have what it takes to make up something good.

Despite all this, for some reason, yesterday the phrase "compiled thoughts" popped into my head. It sounded like a good blog title and certainly reflects today's trend of aggregating one's mental randomness and uploading it to the likes of blogs, Twitter, or what-have-you. I did some Googling and found very little use of those words together outside of discussions on writing. I figured the domain name must be taken, but to my surprise it was not, so I grabbed it.

I've pointed the domain at this blog for now and renamed it accordingly. However, I still don't have a "name" for myself, per se. At this point I guess I'll just keep using my real name; it's boring but easy to remember. At least now I have a title for the blog that I actually like. Plus, it sounds mildly intellectual.

Downloading from the series of tubes

Yesterday I was working on an automation process to deal with some vendor data. Unfortunately, the vendor doesn't have the data on an FTP location, and the file names include dates, so they change whenever updated. The files have to be downloaded manually from the vendor's web site after logging in - not a process that's terribly easy to perform automatically.

One of my coworkers had already written the bulk of the screen scraping logic that logs in and looks at the download page for the links to the available downloads. This works great. He had put in the code to actually download the file using the HttpWebRequest, HttpWebResponse, and byte stream classes. I commenced some testing and found that only a portion of the data was getting downloaded, leaving the file (a Zip in this case) corrupt. I googled a bit and found some articles with various suggestions on how to process the response stream from the web response class. It seems most people had problems with this seemingly simple task. Then I ran across a suggestion to use the WebClient.DownloadData() method. It was only about four lines of code.

As I pasted it into the program, I decided to check out this class I had yet to use, WebClient. Lo and behold, there was also a method called DownloadFile(). What started as a dozen lines of code for manipulating a byte stream - code that ultimately never even worked - was now reduced to a single call:

new WebClient().DownloadFile(downloadFileUrl, downloadPath);

It's always a great feeling when you discover a class you didn't even know existed in the .NET Framework that provides exactly what you are looking for. I'm happy to know that I don't need to become an expert at handling byte streams; instead I can focus on the business problem I was trying to solve.
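One small refinement worth noting: WebClient implements IDisposable, so a slightly tidier version wraps it in a using block. A self-contained sketch (the URL and path here are made-up placeholders):

```csharp
using System.Net;

class Downloader
{
    static void Main()
    {
        // Hypothetical values standing in for the vendor's real URL
        // and our local destination path.
        string downloadFileUrl = "http://example.com/data/20080502.zip";
        string downloadPath = @"C:\temp\20080502.zip";

        // WebClient is IDisposable, so the using block ensures the
        // underlying connection resources are released.
        using (WebClient client = new WebClient())
        {
            client.DownloadFile(downloadFileUrl, downloadPath);

            // DownloadData() is the in-memory alternative:
            // byte[] data = client.DownloadData(downloadFileUrl);
        }
    }
}
```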

However, it leaves me wondering endlessly about how many classes or methods I still don't know about that might allow me to reduce the code I write and solve problems in a much cleaner and more robust way.

Thursday, May 01, 2008

Beware of non-specific references

I recently completed a change to a web application that utilizes the ASP.NET AJAX web extensions (System.Web.Extensions.dll). This assembly is loaded into the Global Assembly Cache (GAC) and referenced from there by the web project. I ran the web app locally without any issues. After updating the source code repository with my changes, I asked the build server to create a new release candidate of the app. This worked fine.

I then deployed it to the staging/test server and hit the URL. Failure! The error I received was

"Parser Error Message: The base class includes the field 'UpdatePanel1', but its type (System.Web.UI.UpdatePanel) is not compatible with the type of control (System.Web.UI.UpdatePanel)."

Clearly, a System.Web.UI.UpdatePanel is a System.Web.UI.UpdatePanel. So I investigated further. My web.config file contained this:

<compilation defaultLanguage="c#" debug="true">
  <assemblies>
    <add assembly="System.Web.Extensions, Version=1.0.61025.0,
         Culture=neutral, PublicKeyToken=31BF3856AD364E35"/>
  </assemblies>
</compilation>

This determines which assembly version the web app loads when creating the dynamically compiled pages. Thus, the update panel created from the markup comes from the version 1.0.61025.0 assembly.

I started looking at the web application assembly. Using Beyond Compare with a conversion rule to process DLLs with ILDASM, I was able to compare the compiled assembly references. My locally built version of the assembly had the reference as

.assembly extern System.Web.Extensions
.publickeytoken = (31 BF 38 56 AD 36 4E 35 ) // 1.8V.6N5
.ver 1:0:61025:0

while the build server's version had

.assembly extern System.Web.Extensions
.publickeytoken = (31 BF 38 56 AD 36 4E 35 ) // 1.8V.6N5
.ver 3:5:0:0

So the code-behind instance of the update panel in the prebuilt web app assembly came from the version 3.5.0.0 assembly referenced at build time, while the runtime instance created from the markup came from version 1.0.61025.0. This was the culprit.

This happened because Visual Studio created the reference to the assembly in the GAC with the "Specific Version" flag set to "false". When the project was built on the build server, it picked up the newer assembly to satisfy the reference. I changed the flag to "true" in the project.
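In the project file, the fix looks something like this (the reference entry as it would appear inside the project's ItemGroup, with the version as found in the GAC):

```xml
<Reference Include="System.Web.Extensions, Version=1.0.61025.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35">
  <SpecificVersion>True</SpecificVersion>
</Reference>
```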

After committing the change, I asked for another build. Now the assembly built by the build server references the correct version, and the app runs.