Tuesday, November 25, 2008

Caching of dynamic resources

It is a popular technique to store resources, such as CSS, XSLT, JavaScript and HTML files, as embedded resources in an assembly. This approach makes solutions more self-contained and, with satellite assemblies, makes localization easier. The processing of these resources is usually handled by a dedicated HTTP handler. Often these resources are fairly static and there is no need to reload them on each request. A simple HTTP handler looks like this:


public class MyWebResourceHandler : System.Web.IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "contentType";
        context.Response.Write("resourceContent");
        context.Response.End();
    }
}


Recently I had to optimize the performance of such an HTTP handler. Its purpose is to process a resource HTTP request, perform resource substitutions and other transformations, and send the processed resource file back to the browser. Since this process was quite intense (it required regex and string operations) and the output rarely changed, it made sense to adjust the HTTP handler to analyze the cache information sent by the browser and respond either with a complete response, containing the content of the resource, or with HTTP 304 - Not Modified.

To make this happen I added a check to determine whether the resource has been modified. This could be any condition you find appropriate. Based on this, the handler either sends back the original content or responds with HTTP 304. To complete the 304 response you have to set the Content-Length HTTP header to 0. To make sure that all caching information for this resource is correctly sent to the client, we also have to specify Cacheability, LastModified and ETag - the headers which control how the browser will request the cached resource on its next request.


public void ProcessRequest(HttpContext context)
{
    if (IsFileModified(someParameters))
    {
        context.Response.ContentType = "ContentType";
        context.Response.Write("resourceContent");
    }
    else
    {
        // File hasn't changed, so return HTTP 304 without retrieving the data
        context.Response.StatusCode = 304;
        context.Response.StatusDescription = "Not Modified";

        // Explicitly set the Content-Length header so the client doesn't wait for
        // content but keeps the connection open for other requests
        context.Response.AddHeader("Content-Length", "0");
    }

    // Set cache info
    context.Response.Cache.SetCacheability(HttpCacheability.Private);
    context.Response.Cache.VaryByHeaders["If-Modified-Since"] = true;
    context.Response.Cache.VaryByHeaders["If-None-Match"] = true;
    context.Response.Cache.SetLastModified(creationDate);
    context.Response.Cache.SetETag(ETag);
    context.Response.End();
}
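The decision the handler makes can be sketched in plain JavaScript for illustration. This is a hypothetical stand-in for the server-side IsFileModified check, not the actual .NET code; the header names are the real HTTP conditional-request headers:

```javascript
// Hypothetical helper mirroring the IsFileModified decision in the handler above.
// Returns true when a full response is needed, false when HTTP 304 suffices.
function isFileModified(requestHeaders, currentETag, lastModifiedMs) {
    var inm = requestHeaders["If-None-Match"];
    var ims = requestHeaders["If-Modified-Since"];
    if (inm !== undefined) {
        return inm !== currentETag;              // ETag mismatch => resend content
    }
    if (ims !== undefined) {
        return Date.parse(ims) < lastModifiedMs; // stale copy => resend content
    }
    return true; // no conditional headers: always send the full response
}
```

If the browser sends no conditional headers (a first visit), the handler always sends the full content; the 304 path is only taken when the browser proves it has a fresh copy.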


Since the resources are stored in an assembly, whenever the assembly changes we need to make sure that the resources get reloaded in the browser. Our file modification criterion will be the last-modified date of the assembly that contains the resources. To complete the handler we add two more methods, IsFileModified and GetETag. IsFileModified checks whether the ETag or the last-modified date has changed. GetETag generates a hash from the assembly's name and its last-modified date.
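The GetETag idea can be sketched as follows. This is illustrative JavaScript, not the handler's actual .NET implementation; the djb2-style hash and the function name are my own stand-ins:

```javascript
// Illustrative sketch: derive an ETag from the assembly name and its
// last-modified timestamp, as described above. Any change to either input
// produces a different tag, forcing the browser to re-download the resource.
function getETag(assemblyName, lastModifiedMs) {
    var input = assemblyName + "|" + lastModifiedMs;
    var hash = 5381;
    for (var i = 0; i < input.length; i++) {
        // djb2 xor variant, kept as a 32-bit unsigned value
        hash = ((hash * 33) ^ input.charCodeAt(i)) >>> 0;
    }
    return '"' + hash.toString(16) + '"'; // ETag values are quoted strings
}
```

The important property is that the tag is deterministic for a given assembly state and changes whenever the assembly is rebuilt.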

The complete code of the handler can be found here: http://code.msdn.microsoft.com/ResourceCache

After applying these changes in our solution, we significantly reduced the traffic for downloading resources.

Now that we know how to deal with the cache, I only need to find where the real cash is.



PS: I would like to give credit to Dan Larson, who uses the same approach, but in a different context in his book Developing Service-Oriented AJAX Applications. Great book on AJAX in ASP.NET 3.5!

Thursday, October 09, 2008

Encoding Optimizations for IE

Recently I had to tackle some performance issues in a set of AJAX web parts. We experienced unexpected slowdowns, which increased exponentially with the amount of data being handled. After some investigation I found that the code was using the very popular base64 algorithm in JavaScript. This encoding has the advantage that it has a counterpart on the ASP.NET server side and can be used to encode data sent to and from the client when needed. It is a very popular algorithm and the JavaScript implementation can be found on many sites. Here is one of them: http://www.webtoolkit.info/javascript-base64.html.

It turns out that the string concatenations in this algorithm were killing IE, shooting CPU utilization to 100% and locking up the UI. The situation progressively worsens in older browsers. In IE6 the performance is miserable. In IE7 it is better but still unacceptable. In IE8 BETA things looked much better, but still not in the range of other browsers.

After poking around and getting some advice, Torben H. suggested that the string concatenation could be replaced by a simple buffer/array operation in JavaScript. This eliminates the re-creation of the string on each iteration of the encoding loop. In addition, the performance of the modified algorithm grows linearly with the number of encoded characters. I also wrapped the algorithm in a separate JavaScript class, so its use is a bit more intuitive.
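The difference between the two approaches boils down to this sketch (both functions are illustrative and produce identical output; the second avoids re-creating the intermediate string on every loop iteration, which is exactly what hurt older IE):

```javascript
// Naive approach: each concatenation creates a brand-new string.
function repeatConcat(chunk, times) {
    var output = "";
    for (var i = 0; i < times; i++) {
        output = output + chunk;
    }
    return output;
}

// Buffer approach: collect the pieces in an array and join once at the end.
function repeatBuffered(chunk, times) {
    var buffer = [];
    for (var i = 0; i < times; i++) {
        buffer.push(chunk);
    }
    return buffer.join("");
}
```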

You can see that on line 2, I create a buffer to hold the encoded string. On line 22 instead of using

output = output + this._keyStr.charAt(enc1) + this._keyStr.charAt(enc2) + this._keyStr.charAt(enc3) + this._keyStr.charAt(enc4);

I use buffer.push(...). Finally on line 25 the encoded string is returned as a join of the array. This is the complete code:


   1:  StringBase64Encoder.prototype.encode = function(to_encode) { 
   2:      var buffer = []; 
   3:      var chr1, chr2, chr3; 
   4:      var enc1, enc2, enc3, enc4; 
   5:      var i = 0; 
   7:      do { 
   8:          chr1 = to_encode.charCodeAt(i++); 
   9:          chr2 = to_encode.charCodeAt(i++); 
  10:          chr3 = to_encode.charCodeAt(i++); 
  12:          enc1 = chr1 >> 2; 
  13:          enc2 = ((chr1 & 3) << 4) | (chr2 >> 4); 
  14:          enc3 = ((chr2 & 15) << 2) | (chr3 >> 6); 
  15:          enc4 = chr3 & 63; 
  17:          if (isNaN(chr2)) { 
  18:              enc3 = enc4 = 64; 
  19:          } else if (isNaN(chr3)) { 
  20:              enc4 = 64; 
  21:          } 
  22:          buffer.push((this.key.charAt(enc1) + this.key.charAt(enc2)) + (this.key.charAt(enc3) + this.key.charAt(enc4))); 
  24:      } while (i < to_encode.length); 
  25:      return buffer.join(""); 
  26:  } 
  28:  StringBase64Encoder.prototype.decode = function(to_decode) { 
  29:      var chr1, chr2, chr3; 
  30:      var enc1, enc2, enc3, enc4; 
  31:      var i = 0; 
   32:      var buffer = []; 
  34:      // remove all characters that are not A-Z, a-z, 0-9, +, /, or = 
  35:      to_decode = to_decode.replace(/[^A-Za-z0-9\+\/\=]/g, ""); 
  37:      do { 
  38:          enc1 = this.key.indexOf(to_decode.charAt(i++)); 
  39:          enc2 = this.key.indexOf(to_decode.charAt(i++)); 
  40:          enc3 = this.key.indexOf(to_decode.charAt(i++)); 
  41:          enc4 = this.key.indexOf(to_decode.charAt(i++)); 
  43:          chr1 = (enc1 << 2) | (enc2 >> 4); 
  44:          chr2 = ((enc2 & 15) << 4) | (enc3 >> 2); 
  45:          chr3 = ((enc3 & 3) << 6) | enc4; 
  47:          buffer.push(String.fromCharCode(chr1)); 
  49:          if (enc3 != 64) { 
  50:              buffer.push(String.fromCharCode(chr2)); 
  51:          } 
  52:          if (enc4 != 64) { 
  53:              buffer.push(String.fromCharCode(chr3)); 
  54:          } 
  55:      } while (i < to_decode.length); 
  57:      return buffer.join(""); 
  58:  } 
The results show a significant improvement in performance in both IE6 and IE7.

Performance in IE6: [chart comparing original base64 (seconds) vs. modified base64 (seconds)]

Performance in IE7: [chart comparing original base64 (seconds) vs. modified base64 (seconds)]

Just for fun I added an alternative encoding to the mix to see how it performs. This is the well-known XOR algorithm, which is not as robust as base64, because you can get unwanted characters in the encoded string, but it can certainly be used in AJAX applications. As you can see, the performance of the XOR algorithm is better in IE7, but not so much better in IE6. If your application targets older browsers, there is no significant performance advantage of the XOR algorithm over base64. Here is the code I used for the XOR algorithm:


function StringXOREncoder(key) {
    this.key = key;
}

StringXOREncoder.prototype.encode = function(to_encode) {
    var buffer = [];
    var i = 0;
    do {
        buffer.push(String.fromCharCode(this.key ^ to_encode.charCodeAt(i++)));
    } while (i < to_encode.length);
    return escape(buffer.join(""));
}

StringXOREncoder.prototype.decode = function(to_decode) {
    var rawinput = unescape(to_decode);
    var buffer = [];
    var i = 0;
    do {
        buffer.push(String.fromCharCode(this.key ^ rawinput.charCodeAt(i++)));
    } while (i < rawinput.length);
    return buffer.join("");
}


Similar tests in other browsers such as Firefox and Chrome did not show any issues with either the original or the modified algorithms; they handle JavaScript string operations much better. However, most corporate environments are IE based, and such optimizations may have a dramatic impact on the performance of AJAX applications.

Happy encoding :)



Monday, October 06, 2008

Saving WebPart Personalization Properties

If you have tried to modify some properties of your web parts and save them from server code, you have probably noticed that there are some differences in the way this is handled in the SharePoint web parts namespace and in the ASP.NET web parts namespace, particularly if you are trying to save web part properties from code that is outside of the actual web part.

In SharePoint land the SPWebPartManager class has the SaveChanges(webPart) method, which allows us to save the properties of any web part on the page from any code location. In the ASP.NET namespace, however, this method is not available. Certainly one option is to switch to the SharePoint web parts namespace, but this is not recommended and prevents the web parts from being used in a pure ASP.NET environment.

The only viable alternative I was able to find is to expose the protected WebPart.SetPersonalizationDirty() in all my custom web parts:

public class CustomWebPart : WebPart
{
    ...
    public void SaveProperties()
    {
        base.SetPersonalizationDirty();
    }
    ...
}

Then I can make changes to my web parts' properties and save them from other page events, such as Page_Load:

WebPartManager wpm = WebPartManager.GetCurrentWebPartManager(Page);
CustomWebPart wp = wpm.WebParts[0] as CustomWebPart;
wp.Title = "New name, new property";
wp.SaveProperties();

Seemingly easy, but for some reason it took my colleague and me a ton of time to figure out.


Monday, September 08, 2008

Free Windows Vista SP1 Installation Support

A couple of weeks ago I finally got around to upgrading my laptop to Vista SP1. Unfortunately the installation failed with one of those generic errors which indicate there is a corrupt file in the system. I tried to look up the error, but the proposed solutions were so wide-ranging that I soon realized I would be losing a lot of time fixing this - more than I wanted to spend. Some suggested it was time for a clean install, but somehow this was not a good option for me. I would much rather stay without SP1 than reinstall tons of software.

Since in the past I made the mistake of installing some early bits of Visual Studio and other products, I knew that I probably had leftovers from alphas and betas I did not need, but it would have been very time consuming to clean those up. This is when I got word that Microsoft provides free support for installation issues with SP1. So I initiated a support ticket using the online chat, and I was pleasantly surprised. I had to deal with several associates, and the conversation kept going smoothly over the online chat tool rather than the phone, which I liked a lot. At some point the support rep asked to take control of my desktop and ran some cleaning and analytical tools. Every time I had to discontinue the session, I was called back at the time I requested to continue the work on the PC. At the end of the day SP1 installed properly and my personal settings were preserved.

If you have issues installing Vista SP1 and don't have time to fix things the DIY way, try MS Support for free until March 18, 2009:


We all wish operating systems ran without any flaws or issues; however, every once in a while this is not meant to be. In these cases free support is a really handy option.



Thursday, July 17, 2008

Provisioning monthly folder structure in a SharePoint list

Recently I wrote an article about folder content types in SharePoint and how they can be leveraged to improve document management in a SharePoint document library. The comments on the article show that there is definitely interest and potential in using folder content types, but they also point out some serious omissions. SharePoint MVP Ivan Wilson has taken things to the next level in a project he independently started on CodePlex. It is a console application that allows you to generate a folder structure within a document library. It creates a folder in the root of the document library named after the current year and month (e.g. "2008-05").

Check out Ivan's post and the project's home page; this really makes the provisioning of the folders much easier.


Pre-order on Amazon: Developing Service-Oriented AJAX Applications on the Microsoft® Platform

For all the AJAX and ASP.NET fans out there I wanted to point out the upcoming book by Daniel Larson (SharePoint MVP). Daniel has impressive knowledge of what's going on in .NET 3.5 and the coming SP1. The book will guide intermediate as well as advanced developers into ASP.NET AJAX. I already had a glimpse at some of the chapters and they are impressive and easy to comprehend.

The first to send proof of a pre-order purchase will receive a free book with Daniel's signature!

Developing Service-Oriented AJAX Applications on the Microsoft® Platform (PRO-Developer)
by Daniel Larson

The book is available for pre-order on Amazon.

Cannot wait to see this one in print!


Friday, May 23, 2008

]inbetween[ Huge Microsoft Community Summit 2008

This year Tech-Ed has two editions, one for IT pros and one for developers. Over the weekend between the two events, the Orlando Convention Center continues to be reserved by Microsoft. So what did the local DEs come up with? ]inbetween[ - a huge, community-driven, FREE event. Tons of speakers, user group leaders, MVPs, RDs and others from Florida and beyond will deliver an enormous number of sessions both Saturday and Sunday. Check it out and join the crowd!


Do you feel ]inbetween[?


Going to TechEd Dev

The first week of June I'll attend TechEd Dev. Monday I'll be at a Silverlight pre-conference session and the rest of the week at the actual conference. I noticed that this time around TechEd has tons of 300- and 400-level sessions, which is really great.

It looks like every day after hours there will be a party going on. Tuesday, SharePoint MVP Champions Andrew Connell and Bob Fox host SharePoint by Day, SharePint by Night @ BB King's Orlando (9101 International Drive #2230, Orlando, FL). Wednesday I'll be at the House of Blues for the MVP party. Thursday is the attendee party at Universal, where I plan to be with Desi. If you happen to be at any of these events, stop by and say hi.

Since I'll be commuting with my colleagues every day, if somebody is interested in carpooling let me know. We did that at the last code camp in Orlando with some developers from the Space Coast and it worked pretty well.


It looks like a busy week.



Monday, May 12, 2008

SharePoint Folders Need More Love

Folder Content Types for IT Professionals

Published in the May 12, 2008 edition of To The SharePoint

Most of you SharePoint enthusiasts probably know quite a bit about content types in SharePoint. They provide the means to organize metadata in an extremely flexible manner and provide the context for workflows, custom menus, and document templates. However, due to the document-centric nature of Microsoft Office and SharePoint, the most commonly used and discussed content types are the document content types. Well, there is another, lesser-known character in the content type story of SharePoint--the folder content type.

One reason why the folder content type is less popular is that the default SharePoint installation comes with only one of them--the Folder content type. Compared to dozens of out-of-the-box document content types, the Folder is clearly outnumbered. So let's have a closer look at this lonely hero and create a couple of folder content types so that we can find out how to use them to further enhance the user experience and data management of a document library.

To put things into perspective, let's look at how the fictional environmental foundation Rain Forest can use folder content types to improve its existing document library. The foundation staff stores all documents in a document library, and they already use several document content types to support their activities. The document types are separated into two functional groups:

  • Project Documents (Additional Fields: Due Date, Assigned To)
    • Application for Grant (Word document)
    • Financial Memorandum (Excel document)
    • Formal Acceptance Document (Word document)
  • Internal Documents (Additional Fields: Contact, Status)
    • Purchase Order (Word document)
    • Invoice (Excel document)

The document library looks very familiar, and all document types are listed in the New menu:


The IT team of Rain Forest also defined some views based on document content type to make the filtering of each document type group easier and to be able to display content type specific fields such as Due Date and Assigned To:


If you are not familiar with document content types, this article shows the basics. For more information on how to create views, check out the following article.

All documents are stored in the root, and occasionally employees create folders at their discretion. However, with time the clutter of folders makes locating documents really hard. All users also notice a considerable slowdown in view performance. After a few months, the root folder contains more than 4,000 documents and is expected to grow. What can be done? This is when the little-known character from our SharePoint story--the folder content type--comes in to help.

One reason for the performance hit is that folders in SharePoint have some limitations by design. For details on how the number of items affects performance, check out this article. Nevertheless if we partition the documents by financial quarters or other perpetual attributes, we can keep the total number of documents in a given folder within the high performance zone. That’s why we decide to create a folder for internal documents and project documents using the respective folder content types for every quarter of the year.

Furthermore we can provide some structure and boundaries for the employees, so that they cannot create folders anywhere in the document library. To help users locate documents, we’ll use a great feature of SharePoint, which allows us to bind views to a specific folder content type. This will provide context for each folder, so that when a user enters a folder with internal documents, the view will automatically change to display relevant metadata.

First let’s create a folder content type for each of our document groups. The steps are no different than creating any other content type. The only difference is that our content type will inherit from the Folder content type.


You can add specific metadata to each of the newly created content types, but for this walk-through, we’ll use the existing columns.

Next, let’s add the folder content types to our document library:


In addition, we would like to remove the default Folder command in the New menu, so that only our custom folder options are available. To do that, open the advanced settings of the document library and disable the New Folder option.


After these changes we will add our two new folder entries to the New menu.


Now, when a user wants to create a new folder for the next quarter, he or she will select the appropriate folder types from the New menu, and give the folder a descriptive name such as Internal Q1 2008. The process of provisioning a new folder can be automated and extended by using calculated fields or other programming techniques.

To provide the appropriate views, we create one view for the root folder and separate, unique views for each folder content type. The root view will display only folders from the newly defined content types.



Each custom folder content type will have a view that displays metadata specific to the type of document contained in the folder. These folder views are marked as default but are assigned to the specific folder content type.



Let’s see the result by creating a folder of each type.


You’ll notice that if you click the folder Internal Q1 2008, the view automatically changes to the Internal Documents view; similarly, opening the folder Project Q1 2008 changes the view to Project Documents.

To add additional context-sensitive behavior, you can also limit the New menu items displayed for each individual folder, so that only Internal documents show in the New menu of the corresponding folder. From the drop-down menu of each folder select Change New Button Order and hide the appropriate document types.



When you enter the folder you’ll notice that only the contextually correct New menu items exist.


Similarly you can hide the documents from the new menu of the root folder. Open the document library settings and in the content type section click Change new button order and default content type. Hide all but the folder content types.


From now on, the dedicated volunteers of Rain Forest can rest assured that they can locate documents easily and that the performance of their document library is going to be stable. Furthermore, the IT Pro of the foundation has some great ideas about how to add custom menus for each folder content type, so that actions applicable to all documents in a folder can be executed faster and in the proper context. There are also many opportunities to use item event handlers and the SharePoint DOM and workflow with folder content types to further extend the application. This SharePoint story certainly does not end here.

Mikhail Dikov is a senior software engineer at Global 360 (www.global360.com) and an MVP for Microsoft Office SharePoint Server, with a background in CMS and BPM software. Mikhail brings more than 8 years of experience in Microsoft technologies such as .NET and ASP.NET, and more than 12 years of IT experience. His current interests include BPM, BI, SharePoint, .NET and AJAX. Mikhail is a frequent speaker at code camps in Florida and an active member of the Space Coast Dot Net User Group (www.scdnug.org). Email: mdikov at gmail dot com Blog: www.mikhaildikov.com

freemd.com = healthcare revolution?

Every once in a while there is a technology or a service that will change our perception about the way business is done in certain areas of life. One such fundamentally different approach to healthcare is presented by Dr. Stephen Schueler and his team at freemd.com.

Before I joined Global360 about three years ago, I spent almost four years with DSHI Systems (the makers of freemd.com), and that was one of the most interesting and dynamic jobs I have ever had. This is where I learned a lot about innovation, persistence and attention to detail. What is now offered as a free service has been used and tested for many years in big call centers nationwide. It is an enormous gain for the general public to have this service freely available.

Numerous times I find myself going to freemd.com to search for information about a family member's condition or simply for self-education. But the most valuable feature, the "virtual doctor", helped enormously on several critical occasions where I had to decide whether to visit a doctor, rush to the ER or simply stay at home - and this is huge!

Check it out www.freemd.com


Thursday, May 08, 2008

Software I use or not

Every once in a while I'd like to take some time and evaluate existing software that I use on a daily basis and also new programs that caught my attention recently. In this cycle I did some changes that made my life easier and I want to share them with you.

I picked several software packages ranging from VM software to HDD utilities and there were some real keepers. But before I go there let me say what I stopped using:

1. Google Docs - Very disappointing. I always strive to have some balance in my MS-oriented professional life. I found that the integration between Google Docs and blogger.com would be really beneficial for me. The fact that I can work on blogs and organize my ideas from any location is also appealing. However, two issues bothered me so much that I had to put an end to this. The first one was the broken integration between Google Docs and Blogger. Why on earth offer a cool feature and not make it work! For months I tried to live with the fact that the blog post title was not properly transferred to the feed, making my blog entries look broken. Google's forums contained numerous complaints about it, but it never got fixed in more than a year. I also found that making formatting work properly was always a game, since there was some strange transformation going on. Posting code snippets is always tricky, but man, that was a terrible experience. No more. I switched to Live Writer and I am super happy. True, I don't get to work from any PC, but 95% of the time home and work are fine, and the formatting works great. Not to mention the plugins, which add tons of good features; most notably the code snippet import.

2. Norton Anti-Virus - After dealing with this software for many years, with its lack of support and constant upgrade issues, the resource hog it has become recently is unbearable. My wife's PC literally stopped working after I installed the latest version. Needless to say, I uninstalled it, and while I am looking for an alternative I use AVG. I am looking for something light, effective and fast.

But I am really excited about the new software I got to use recently.

1. VMware Workstation - This is such a relieving change. Like most SharePoint devs and presenters, I work in a VM most of the time. Snapshots, memory management, USB support, networking options - all these features work so much better for me. The only annoyance I encountered is that after upgrades there is some nasty bug that happens every time. There is a fix posted on VMware's forum, but it would be best if they worked it out. It looks like an issue from a couple of versions back.

2. OneNote - This is a great tool to collect notes and snapshots and track ideas. It helps me immensely to juggle multiple projects. The export feature makes it super easy to send out a package to somebody on the team.

3. Diskeeper Pro - A must for every PC. I've had older versions, and this one seems to be doing a great job. I don't have much to say, because the thing just works its magic in the background and I rarely have to do anything with the UI.

4. Windows Live Writer - Just what I needed for my blogging. Lightweight, simple control, extensible with tons of good plugins, and the formatting works.

Now back to business until the next cycle of software evals.



Sunday, May 04, 2008

Microsoft drops Yahoo bid - is this good for me?

Every once in a while we witness huge deals, the equivalent of tectonic movements, that shake the industry. The Microsoft/Yahoo deal would've been the biggest financial transaction of the Internet era. This deal would've impacted not only the industry and all Yahoo employees, but also everybody in the Microsoft ecosystem. This ecosystem includes a huge number of partners, ISVs and developers outside of Microsoft. This ecosystem is where I live, so naturally I asked myself: "What is the impact of this deal/no deal on my life as a professional and as a consumer?"

My first reaction is that I feel really relieved that somebody wise stopped the ".Net Bubble 2" from inflating at a higher rate. Let's face it - search and ads are big, but please, somebody has to keep things real, and I am glad that Steve Ballmer did not cave in to the sheer racketeering that was going on. When irrational investors make mistakes, they lose their own money; when irrational executives make mistakes, they get the boot, and the way it looks, Yahoo's executives will be in the hot seat pretty soon.

True, the giant from Redmond needs Yahoo to buy market share and expand at a faster rate in the search and ad market, but do they need Yahoo for its technology? Heck, even Yahoo is bailing out on its own technology, trying to outsource to Google vital services in which it has invested billions of dollars. Considering that Microsoft also has a formidable search technology stack, this is probably not a big selling point.

But is this really the way to beat Google? History shows that real winners emerge by opening completely new markets and expanding existing markets, rather than simply trying to conquer market share. The main reason is that market share with no innovation and authenticity to back it up is lost in a jiffy. Think of what happened to Lotus in the hands of IBM a decade ago and you'll see what I mean.

I'd rather see Microsoft taking a slower and more agile approach, with smaller acquisitions, leading the way to reshaping its vision of personal computing and letting Yahoo gradually withdraw into the Internet pantheon next to companies such as Netscape. If this is what Microsoft's withdrawal from the Yahoo bid means, I applaud it!

Let's assume that the Yahoo deal is completely dead. Since Google did not deliver on the promise to create the next wave of Internet-based office and personal productivity applications (c'mon, how many years of fluffy Google Docs BETA!), it is about time for Microsoft not only to step up to the plate, but also to reshape our perception of personal computing. With so many devices in our hands and information that we need at our fingertips, both business people and consumers need a virtual desktop experience, which includes search of the web, but also of personal data and applications such as Excel and Word, in the context of this connected network of devices. With their latest announcement of Live Mesh, Microsoft is doing exactly this. For me personally, this is the huge deal. I am really tired of juggling personal and work computers and data on several devices, and I couldn't care less about ads. They are not interesting anymore. I learned to ignore them, the same way I fast-forward through the ads on my DVR. If Live Mesh is Microsoft's answer to the challenges of the market, I hope they pull it off, and also figure out a way to open it to the community so that it becomes a truly open platform with support for third-party devices and operating systems.


This Deal is dead! Long live innovation and authenticity...



Monday, April 21, 2008

MVP Summit 2008 simply a different type of conference

After a week in Seattle attending the 2008 MVP Summit I needed a break of a couple of days before I could get back to normal. The Summit is certainly a very different type of conference. Here are a couple of differences I would like to point out.

Regular conferences dive directly from day one into regular sessions: slide decks, presentations, code samples, etc. At the MVP Summit, on the contrary (I can speak only for the SharePoint track), we started with some rigorous physical, tactical and strategic training. The master of ceremonies <Lawrence/> got us on a bus and into the woods (just short of traveling blindfolded), and whoever did not pass the test of running and shooting paint potions for hours could not go to the next level. The good news is that everybody was fit enough to make it :) . Due to the strict NDA some call it "Paintball", but believe me, it was more than that. Thinking about it, this will be the first conference ever where I may lose weight rather than put on some pounds. Here is me (right) with John Holliday and Kit Kai loading potions into special buckets:

Another huge difference is that the MVP Summit has several keynotes: one to begin with and two at closing. One of the closing keynotes came from the Boss (Ray Ozzie) and the other from the Big Boss (Steve Ballmer). Ray Ozzie spoke about the value of building and supporting product communities once a star product reaches critical mass. Steve Ballmer electrified the audience with his energetic performance. More about what they said here.

The third difference is that what happens in between the agility training (aka Paintball) and the keynotes is very interesting and intensive, but also cannot be shared freely. The event itself was organized flawlessly and served its purpose of bringing the product teams and the community closer together.

Talking about sharing, the best part of all is that I met a lot of talented and opinionated people with a mindset of sharing their knowledge with others. Some contribute primarily by answering questions in MSDN groups, others by speaking at conferences and code camps, writing books, or supporting community software projects.

Thanks to the organizers and the sponsors, who made this a truly remarkable experience.

Different conference indeed...


Saturday, March 22, 2008

Orlando Code Camp - another great community event

Three central Florida .net user groups pulled together an awesome event today. Great sessions, good and healthy food, excellent facilities, price tag -- zero!

I would like to thank all of the attendees that came to my sessions. Follow the link at the end of the post to download the presentation slide deck and the sample code.

Slide deck 1

Slide deck 2

Code Sample


Friday, March 21, 2008

Excellent introduction to Silverlight

This is an excellent book for developers starting to get more involved in Silverlight. I already had some exposure to Silverlight from presentations, demos and screencasts, and this book helped me get the bigger picture of Silverlight by describing from A to Z the essential components and techniques of this RIA platform. Even though it is dedicated to Silverlight 1.0, IMO about 80% of the book is applicable to the second beta of the platform. The XAML chapters worked for me on Silverlight 2 Beta, and with some patience I was able to translate most of the JavaScript code samples to their C# equivalents.

The book contains detailed information about all the moving parts in Silverlight. Without being a boring reference book, it presents the content in clean technical language with good examples. The code can be downloaded from the site of the publisher, but for the XAML part I preferred to type it myself using IntelliSense, so I could play with different options.

I particularly like how the author presents functionally similar components, such as transformations, brushes or animations, starting with the simplest variation and building up to the most complex. This not only shows the logical gradation in their functionality, but also helps the developer find the optimal control for a given task.

The author frequently points out the differences between WPF, Silverlight 1 and Silverlight 2, which helps to distinguish between seemingly similar features in these three presentation-framework flavors.

The color print was a pleasant surprise and certainly makes the content easier to comprehend. The color also helps to better demonstrate some of the more compelling visual effects in Silverlight.

Since the author, Adam Nathan, is a Microsoft developer on the Silverlight team, I really hope that he'll write a second edition of this book updated for Silverlight 2.


ISBN: 0672330075
ISBN-13: 9780672330070

Thursday, March 20, 2008


This Month: Intro to SharePoint Designer

When / Where?
Wednesday, April 2, 2008   - 6:30 PM EST

Orlando Public Schools Administrative Offices
445 West Amelia Street
Orlando, FL 32801 – 1129

How to sign up?

Who Should Attend?

Developers, designers, power users, and architects

What will be covered?
In this session we will dive into SharePoint Designer. 

Who will be speaking?
Scott Schwarze

Disappearing web.config entries

How many times have you experienced a chilling moment when something goes terribly wrong with the system you just touched and you don't have a clue what could have caused it?

In a SharePoint installation with multiple web applications and several custom solutions there may be a lot of action going on in web.config files. Even the slightest validation error in these files will bring the web application to a halt. This and the fact that the SPWebConfigModification class has a will of its own make the task of coordinating web.config modifications a very touchy business.

Recently one of my colleagues reported that after installing one of the SharePoint solutions, entries installed by another solution were disappearing, leaving the web application in chaos. Logically I started poking at the features in the SharePoint solution that was "causing" the issue, but this led me nowhere. I only learned that when you call:

SPWebService.ContentService.ApplyWebConfigModifications();

the web.config files for all web applications get rewritten, regardless of which web application is being updated. This turned out to be a "feature" of SharePoint. Then I started investigating what other web.config modifications were being created by the rest of the solutions on this server. Luckily most of them belong to our company, so I was able to pull up the code. All features worked correctly when executed separately, but in a particular sequence some of the web.config modifications were still disappearing. And there it was ... one of the features was adding the modifications correctly:

SPWebConfigModification modification = new SPWebConfigModification();
modification.Path = "...some path...";
modification.Name = "Example";
modification.Value = value;
modification.Owner = "Owner";
modification.Sequence = 0;
modification.Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode;

then applying the changes to update the web.config:

webApp.WebConfigModifications.Add(modification);
SPWebService.ContentService.ApplyWebConfigModifications();
But there was no webApp.Update() to persist the changes in the SP database!

It turns out it is very easy to omit this part, because when you develop or debug such a feature everything will work fine until something recycles the application pool, disposing of the newly created SPWebConfigModification. The next solution or feature that calls ApplyWebConfigModifications will force all modifications to be reapplied, pulling them from the SharePoint database; since the modification was never persisted there, it simply disappears. For some features this might actually be a welcome side effect, but if that is not the case you need to call webApp.Update() to permanently save the modifications to the SharePoint database.

One mystery solved. Next, please!


Orlando Code Camp - Sold Out!

Just noticed that the Orlando Code Camp is sold out. This is going to be another super-charged and totally free event organized by our friends at ONETUG.org. I signed up as a speaker with my two sessions from South Florida Code Camp. They were very well received in South Florida, so after some adjustments I decided to give them one more run. Come with your experience and ideas and let's talk about how we can avoid some common frustrations in SharePoint development.

I'll be carpooling with some Brevard developers, so if you need a ride or you want to save on gas, contact me today or tomorrow and I'll give you the details.

For details: http://www.orlandocodecamp.com/


Tuesday, March 04, 2008

Unable to add selected web part(s).

Every once in a while when you create a web part, upgrade a web part, or do some of the things developers do when developing web parts, there comes the chilling moment when you see an error message such as the one below:

So the question is what do we do in this case? Before you hit the discussion boards, or worse, start pulling your hair out, here are a couple of tips you can use to troubleshoot the issue:

  1. Make sure the control is registered as safe in the web.config (duh... the error actually says what it means, right?)
  2. Make sure the assembly is accessible and in the web application's [port]\bin folder. (obvious, but worth mentioning)
  3. Make sure the assembly name in the *.webpart definition file matches the assembly name in the SafeControl element in web.config.
  4. Make sure you don't have more than one *.webpart file for the same web part in the web part catalog. This may happen if you changed the name of the *.webpart file.
  5. Restart IIS to start clean. Attach the debugger to the w3wp.exe process and try to load the page with the rogue web part. This way you can determine the exact location of the assembly being loaded.
  6. Check if the web part class exists by opening the assembly with Reflector. This might sound funny, but in a bigger team, when different versions of assemblies are flying around, it is very easy to overlook something and use the wrong version, which, as it happens, does not contain the web part class at all.
  7. If you have other tips or suggestions, please add them as comments.
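For reference, a registration from tip #1 lives in the SafeControls section of web.config and looks roughly like this (the assembly name, public key token and namespace below are placeholders, not from a real project):

```xml
<SafeControls>
  <!-- Placeholder values: substitute your assembly's strong name and namespace -->
  <SafeControl
      Assembly="MyWebParts, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef"
      Namespace="MyCompany.WebParts"
      TypeName="*"
      Safe="True" />
</SafeControls>
```

The Assembly attribute here is exactly what must match the assembly name in the *.webpart file (tip #3).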

Phew, I think I dodged that one... It turned out I got an older version of the assembly and my web part class was not even there.


Unable to add selected webpart(s). A Web Part or Web Form Control on this page cannot be displayed or imported. The type could not be found or it is not registered as safe.

Monday, February 18, 2008

Create Shared Resource Assemblies in Visual Studio 2008

A couple of months ago, in this post, I described how to overcome the fact that resources in Visual Studio 2005 are compiled by default as internal. The need for public resource classes and shared resource assemblies is obvious in larger projects that span multiple Visual Studio projects.

A new feature in Visual Studio 2008 allows developers to set the access modifier of the resource classes directly from the resource designer.

Another small but good reason to move forward to Visual Studio 2008.


Monday, February 04, 2008

South Florida Code Camp 2008 - Recap

What a great event that was! The FlaDotNet user groups put together their 4th code camp with absolute ease (or at least they made it look like this) and with lots of sessions to choose from.

For those who came to my sessions, one big "Thank You!". Please follow the link at the end of the post to download the presentation slide deck and the sample code.

After I finished with my "work", which was in the first two time slots, I stayed in the SharePoint track for a presentation from Michael Lotter - InfoPath 2007 and Visual Studio 2008. This was a very well prepared and presented session and it gave me a good understanding of the moving parts involved in an InfoPath based solution.

In the afternoon I ventured into non-SharePoint waters. First, Mark Miller got me hooked on CodeRush and Refactor with High Speed Development in Visual Studio with CodeRush and Refactor. Then Bill Reiss did a great preview of Silverlight 2.0 with some cool videos. I finished with a passionate presentation by Larry Port on Continuous Integration with CruiseControl.NET and NAnt.

The after-party was a hit. Lots of good food, drinks and smart people to talk to.

So, here are the links:

Slide deck 1

Slide deck 2

Code Sample


Friday, January 25, 2008

South Florida Code Camp 2008

Yet another totally free event organized by developers for developers in South Florida will take place next Saturday, February 2nd, 2008. Dave Noderer and crew filled up the agenda with 72 sessions (that's right, 72!) divided into 12 tracks. For full information go to http://codecamp08.fladotnet.com/.

At this code camp I have two sessions: one topic, split in two parts. The somewhat clunky name Utilizing Visual Studio 2008 capabilities for better SharePoint Development comes as a result of my work on several projects in the last months and some of the exciting new features of Visual Studio 2008. I tried to find answers to questions such as:

  • How to use Visual Studio Web Designer to create certain types of SharePoint UI elements?
  • How to structure my projects, so that I can easily test the components outside of SharePoint?
  • How to structure my projects and what community tools to use, so that I have to think less about the process of creating SharePoint solution files?

These are all big questions when it comes to the transformation of SharePoint into an actual development platform. To answer these and other challenges of SharePoint development, I am going to demonstrate how to integrate an existing information system with SharePoint without compromising quality or scalability. The four topics I am going to address are:

Part 1

  • SharePoint infrastructure, or how to reduce the time and maintenance of SharePoint specific deployment and plumbing.
  • UI design, or how to use the Visual Studio 2008 web designer and new CSS features to easily create SharePoint layout pages and web parts.

Part 2

  • Testing SharePoint solutions, or how to take the most out of the newly added testing capabilities in Visual Studio 2008 Professional.
  • ASP.NET AJAX Extensions in SharePoint, or how to use it and how to automate the configuration of this ASP.NET extension.

To spice things up Apress provided several copies of Workflow in the 2007 Microsoft Office System by David Mann.

So if you are in the area come and join the geek crowd. Here is the location:

DeVry University
2300 SW 145th Avenue
Miramar, FL 33027


Thursday, January 17, 2008

Using ASP.NET development server for testing with WatiN

For many ASP.NET and SharePoint developers, WatiN (http://watin.sourceforge.net/) has become a frequently used tool for creating functional tests and eliminating manual testing as much as possible. In my environment I use WatiN from within NUnit tests. To integrate all of these components in my Visual Studio setup, I also use http://www.testdriven.net/, a plug-in that supports most of the commonly used testing frameworks and, among other things, makes it possible to right-click within the code of a test and execute it with or without debugging.

This is all great, but how do I make sure that (1) the tests run independently and there are no leftovers from previous test sessions, and (2) the tests can run with minimum configuration, preferably no configuration at all: simply build and run?

The ASP.NET development server (Cassini) comes to the rescue. This great application is always available and requires no configuration at all to run any ASP.NET site. If only I could start Cassini at the beginning of each test run, then open a new browser session for each individual test and close it after the test completes!

Jesus Jimenez has a great article on Code Project (http://www.codeproject.com/KB/aspnet/WatiN.aspx), but the way he structured SetUp() starts and stops the dev server for each test. Since I prefer to have many small tests rather than only a couple of big ones, stopping and starting the ASP.NET development server seems unnecessary and inefficient.

The key here is that in the test setup ([SetUp]) I detect whether Cassini is already running on the predefined port used for testing. If it is, we just keep going; if it is not, we start a new instance of the development web server.
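As an aside, instead of loading the directory listing in the browser, a plain socket probe can also tell you whether Cassini (or anything else) is already listening on the test port. This is just a sketch; the class and method names are mine, not part of WatiN or NUnit:

```csharp
using System.Net.Sockets;

public static class PortProbe
{
    // Returns true if something is already listening on host:port,
    // false if the connection is refused.
    public static bool IsPortInUse(string host, int port)
    {
        try
        {
            using (TcpClient client = new TcpClient())
            {
                client.Connect(host, port); // throws SocketException if nothing is listening
                return true;
            }
        }
        catch (SocketException)
        {
            return false;
        }
    }
}
```

If the probe returns false, start the development server; otherwise reuse the running instance.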

Other than that the test structure remains the same. For each test we open a new IE session and close it in the test tear down. Here is some code:

using System;
using System.Diagnostics;
using System.IO;
using System.Threading;
using NUnit.Framework;
using WatiN.Core;

[TestFixture]
public class TestTemplate
{
    private const string devServerPort = "12345";
    private const string homePage = "default.aspx";
    private IE ie;
    private string rootUrl;
    private string homePageUrl;

    [SetUp]
    public void SetUp()
    {
        rootUrl = string.Format("http://localhost:{0}", devServerPort);
        homePageUrl = string.Format("{0}/{1}", rootUrl, homePage);

        // Check if the dev web server is already running
        ie = new IE(rootUrl);
        bool isWebStarted = ie.ContainsText("Directory Listing -- /");

        if (!isWebStarted)
        {
            // If not, start it (adjust the path to WebDev.WebServer.exe for your setup)
            string command = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.CommonProgramFiles),
                @"Microsoft Shared\DevServer\9.0\WebDev.WebServer.exe");
            string rootPath = Environment.CurrentDirectory.Substring(
                0, Environment.CurrentDirectory.LastIndexOf('\\'));
            string commandArgs = string.Format(
                " /path:\"{0}\" /port:{1} /vpath:/", rootPath, devServerPort);

            Process cmdProcess = new Process();
            cmdProcess.StartInfo.Arguments = commandArgs;
            cmdProcess.StartInfo.CreateNoWindow = true;
            cmdProcess.StartInfo.FileName = command;
            cmdProcess.StartInfo.UseShellExecute = false;
            cmdProcess.StartInfo.WorkingDirectory =
                command.Substring(0, command.LastIndexOf('\\'));
            cmdProcess.Start();

            // Give the server some time to crank up...
            Thread.Sleep(2000);

            // ...and try one more time to see if it is up
            ie.GoTo(rootUrl);
            Assert.IsTrue(ie.ContainsText("Directory Listing -- /"));
        }
    }

    [TearDown]
    public void TearDown()
    {
        // Close the browser session after each test
        ie.Close();
    }

    [Test]
    public void TestInstantiate()
    {
        ie.GoTo(homePageUrl);
        // insert test logic here
    }
}

This shaves some time from the testing process, which for longer test sequences might be a significant gain.


Joe Healy and Visual Studio 2008 come to Melbourne

If you are a software developer or IT pro, don't miss the opportunity to learn about the latest improvements in Visual Studio 2008 from the source. Joe Healy is the regional Microsoft developer evangelist and editor of the regional MSDN newsletter.

When: January, 22nd @ 6:30PM

Where: Charlie and Jake's on Wickham Road in Melbourne

To register visit http://scdnug.org/default.aspx.

If you are interested in putting the announcement on a bulletin board in your company, university or college, you can download the flyer from http://scdnug.org/files/folders/flyers/entry205.aspx

We'll be raffling off a couple of new books, and you can also get your copy of Code magazine while they last.

See y'all next Tuesday.


Wednesday, January 09, 2008

Impressions from Visual Studio 2008

Last week I was able to finally use VS 2008 in a real project. I was working on some web parts and ASP.NET controls and it is great to be able to use some of the improvements in the web development environment.

#1 on my list is the CSS management and the improved design time support. Now we have the split view, so you can actually see code and design at the same time. The design view also has a notably faster rendering speed, which is just great.

On the CSS side... How many times have you tried to create a fully CSS-compliant page using style sheets for positioning instead of tables in VS 2005? It was a real pain. The editor would not update the style for the object, but would instead insert a style attribute, defeating the whole purpose of using CSS. Positioning objects using relative or absolute positioning required a lot of switching and tweaking.

Not anymore! Once you assign the class or the CssClass attribute (to an HTML or ASP.NET control, respectively), the editor will automatically apply style changes even if they are in a separate file. It is also very easy to apply multiple styles or track down the order in which they are applied.

Now this is an environment that really encourages developers to finally drop the archaic table positioning and easily create CSS compliant controls and pages.


Thursday, January 03, 2008

Microsoft MVP Award

A day ago I received an e-mail from Microsoft informing me that I have received the 2008 MVP Award for Microsoft Office SharePoint Server. What a great start to the new year! This award is a recognition of my SharePoint sessions at several code camps in Florida, my blog, and other work I have done in the developer community. It is also great motivation for what I have in mind for the upcoming 2008.

I would like to thank Ken Tucker, Joe Healy and the Florida developer community for organizing many great events and ultimately giving local developers like me an opportunity to grow professionally and have fun doing it.

In the next couple of months I plan to build upon my experience from 2007 and attend several code camps as both attendee and speaker. In addition, I'll be looking into opportunities to write more extensively as a book reviewer, technical editor or author. Apress and other publishers have great user group programs, which I hope will bring me closer to this goal. I also plan to work on the much needed makeover of my site. I've been postponing this for a while, but finally I am going to take Verio's generous web hosting offer for Microsoft developers and create a site that better meets the needs of my work.

Looks like a lot of fun, doesn't it!

For more information about the MVP Award visit: http://mvp.support.microsoft.com/gp/mvpintro