Archive for the ‘Performance’ Category

Android Best Practices Book

As of this week I am a published technical author.

As a result of my speaking at Michigan Google Developers DevFest in April of 2013, I met Godfrey Nolan, who was also speaking at the event. In June he invited me to contribute to his upcoming book. The result is Android Best Practices, published by Apress. I contributed the chapter on Web Services. I am honored that Godfrey chose me, and I am pleased with the result. I got my physical copy of the book yesterday; it was officially published on the 14th.

Living in a Mobile Bubble

I’ve been so entrenched in learning mobile paradigms over the past few months that I suddenly realized I’m living in a mobile bubble. I have come to the stark realization that most companies don’t get mobile. I don’t think getting there is that hard, or that far off, but I can see it isn’t happening yet.

Lunch with my wife yesterday was a perfect example. It was a sub shop (that I love), and they had advertising cubes on the table. I picked one up and was briefly excited to find a QR code on the bottom. I immediately thought, “Wow, these guys are addressing mobile!”, until I actually scanned the code. The URL was a link straight to their home page. OK. I followed the link on my phone. The page opened, and the main content was one single graphic, though at least the navigation and header were not. If you have a smartphone, you know that you can’t easily scroll a page by swiping a big graphic, especially while the page/graphic is loading. So to scroll, I had to carefully swipe the obligatory “social media” box on the right, only to find out there was nothing below the huge picture. The whole experience was disappointing. So let’s enumerate the mistakes here:

  1. The QR Code
    1. It contained only the link to the home page. Huge opportunity lost. How are they tracking users coming from the code? They are not. No query string parameters, no redirection/click-through.
    2. No reward for scanning the code. Does this company use email promotions? Why yes they do. Did they give me some incentive to scan any of their QR codes again later? Nope.
  2. The web site
    1. The site is not mobile friendly, yet they made an effort to get me there on a mobile device.
    2. The navigation menus on the web site rely on hover. There is no hover in mobile.
    3. Big graphics load really slowly on mobile networks. Abandon the 1990s and come to the present. One big picture might look good, but it’s not functional. If pretty pictures sell your food, why bother with mobile devices and QR codes?
    4. Of course the site isn’t formatted for mobile devices. It’s a fallacy to think that because the site renders OK in newer mobile browsers, forcing the user to pinch-zoom to read anything is acceptable.

This brings me back to my bubble. I probably have unrealistic expectations of how companies are using or embracing mobile. The information on how to do this well is out there. Why aren’t companies finding it? Is it that the technical audience knows these things, but the marketing audience does not? Or is the mobile customer base really that small, and the companies just don’t care at this point? I think in this case they must care if they made the effort to put QR codes on their tables. So maybe I’m in a bubble, expecting these things because it’s what I’ve been immersed in for the past few months. Maybe normal mobile users don’t have that expectation. Maybe next year companies will get mobile.

Creating an Extraction Rule for VSTS 2008 Web Tests

Extraction rules exist to find data in the HTTP response and place it in the output context of the web test. There are a few built-in rules, but they mostly focus on the HTML tags themselves and their attributes. In my case I really needed the data between span tags. I think this could probably be done with the existing rules and some regular expressions, but I couldn’t resist the chance to write some code and learn something new.

All you need to do is place the class file in the Test Project and compile it. The rule automatically becomes available to the tests in the project. Here is my class that finds a span for a given ClientId. It overrides the Extract method of the ExtractionRule base class and attempts to find a span with the given ID. If the span is found, it parses the HTML string to find the content of the span tag.

namespace WebTestDemo
{
    using System;
    using System.Linq;
    using Microsoft.VisualStudio.TestTools.WebTesting;

    [System.ComponentModel.DisplayName("Span Extractor")]
    public class ExtractSpan : ExtractionRule
    {
        // The ClientId of the span tag to find
        private string nameValue;

        public string ClientId
        {
            get { return nameValue; }
            set { nameValue = value; }
        }

        public override void Extract(object sender, ExtractionEventArgs e)
        {
            string[] tagTypeFilter = new string[] { "span" };

            //Fail the test if nothing is found (this may need to be modified)
            e.Success = false;

            if (e.Response.HtmlDocument != null && e.Response.IsHtml)
            {
                try
                {
                    //Find the span tag based on ID. Exception if none found
                    HtmlTag result = e.Response.HtmlDocument
                        .GetFilteredHtmlTags(tagTypeFilter)
                        .First(t => string.Equals(t.GetAttributeValueAsString("ID"), this.nameValue, StringComparison.OrdinalIgnoreCase));

                    //The span was found (no exception), now find the data

                    //Get the location of the ID in the span tag
                    int startPosition = e.Response.BodyString.IndexOf(this.nameValue);

                    //Find the position of the data immediately following the closing angle bracket
                    //  of the span, accounting for the > character as well
                    startPosition = e.Response.BodyString.IndexOf(">", startPosition) + 1;

                    //Get the position of the closing tag for the span
                    int endPosition = e.Response.BodyString.IndexOf("</span>", startPosition);

                    //Fetch the content
                    string content = e.Response.BodyString.Substring(startPosition, endPosition - startPosition);

                    //Add the value to the context output. This could just as easily go to a file or a DB;
                    //  this step is not necessary for the extraction to succeed
                    e.WebTest.Context.Add(this.ContextParameterName, content);

                    //Mark the extraction as successful
                    e.Success = true;
                }
                catch (Exception)
                {
                    e.Success = false;
                    e.WebTest.Context.Add(this.ContextParameterName, string.Format("span tag id={0} not found", this.nameValue));
                }
            }
        }
    }
}
Now that I have the class, I need to wire it up to a URL in a web test. Right-click the URL and choose Add Extraction Rule…


I need to set two properties:

  1. Context Parameter Name. This comes from the base ExtractionRule class, and is the name for the data that ends up in the output.
  2. ClientId. This is a custom property from my class. It is the ClientId of the rendered control in the HTML output. The class finds the span with this name and returns the data.

Now when I run the test, if the ClientId I specified was found, it shows up in the Context output after running the test. The Context Parameter Name was “SpanData” in this case:


This could be made more robust by not coding specifically for spans. There could certainly be issues if the tag ID is used more than once, or if there is significant nesting of spans within the tag you are trying to find. This code is intended to prove out the concept; it could certainly be made stronger.
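For comparison, the regular-expression approach mentioned at the top of this post might look something like the sketch below. This is a rough illustration, not production code: the class, method name, and sample markup are invented, and like the rule above it assumes the target span is not nested inside another span.

```csharp
using System;
using System.Text.RegularExpressions;

public static class SpanRegexDemo
{
    // Extracts the inner text of the span with the given id.
    // Assumes well-formed markup and no nested spans.
    public static string ExtractSpanText(string html, string clientId)
    {
        // Match <span ... id="clientId" ...> and lazily capture everything
        // up to the first closing </span> tag.
        string pattern = "<span[^>]*id=\"" + Regex.Escape(clientId) + "\"[^>]*>(.*?)</span>";
        Match m = Regex.Match(html, pattern, RegexOptions.IgnoreCase | RegexOptions.Singleline);
        return m.Success ? m.Groups[1].Value : null;
    }

    public static void Main()
    {
        string body = "<html><body><span id=\"lblTotal\">42.50</span></body></html>";
        Console.WriteLine(ExtractSpanText(body, "lblTotal")); // prints 42.50
    }
}
```

Inside a custom rule, the same pattern could run against e.Response.BodyString, which would avoid the separate IndexOf calls entirely.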

One thing I want to mention is that all the code samples I have run across (MSDN included) show the RuleName property as the way to display the extraction rule name in the Visual Studio UI. But this property is flagged as obsolete at compile time. I found the answer on Ed Glas’s blog. The obsolete message mentioned using attributes, but the details were not easy to discover, so I was quite grateful for that post for getting the syntax correct.

Helpful Links

Must Read VSTS – Testing Related Blogs and Introductory Articles

How to: Create a Custom Extraction Rule

Custom Extraction Rule and Generating a Code Test from VSTS

Useful LogParser Queries

If you deal with production servers and you don’t use LogParser, you should. It gives you SQL-like abilities to query web server logs and other log types (like Event Logs). Coding Horror has a great article about the tool as well, with a link to a good article about forensic log parsing. Here are my most-used web queries:

Find Pages with 500 errors
logparser "SELECT cs-uri-stem as Url, sc-status as code, COUNT(cs-uri-stem) AS Hits FROM C:\ProdLogFiles\ex*.log WHERE (sc-status >= 500) GROUP BY cs-uri-stem, code ORDER BY Hits DESC" -o:CSV >> C:\Data\ErrorPages.csv

Find 404 Requests
logparser "SELECT cs-uri-stem as Url, sc-status as code, COUNT(cs-uri-stem) AS Hits FROM C:\ProdLogFiles\ex*.log WHERE (sc-status = 404) GROUP BY cs-uri-stem, code ORDER BY Hits DESC" -o:CSV >> C:\Data\404Pages.csv

Find the Slowest Pages
logparser "SELECT TOP 100 cs-uri-stem AS Url, MIN(time-taken) as [Min], AVG(time-taken) AS [Avg], max(time-taken) AS [Max], count(time-taken) AS Hits FROM C:\ProdLogFiles\ex*.log GROUP BY Url ORDER BY [Avg] DESC" -o:CSV >> C:\Data\SlowPages.csv

DataSets and Calculated Columns

Ran into a performance issue in a .Net remoting situation. A Winforms app calls an application server asking for data, and a relatively large DataSet (>10,000 rows, <6 columns) being passed over the wire was causing a performance problem. The database and application servers processed it quickly, and examining the transfer with Wireshark showed that the transfer wasn’t so bad either. There was a flurry of data passed, then a bunch of waiting on the client side, with client CPU usage around 50% for the entire duration of the wait. It turns out there is a calculated column in one of the data tables. The column is not calculated on the application server side, to avoid passing unnecessary data across the wire; the calculation happens on the client. That was the source of the slowdown and the CPU usage. In the end we stopped using the calculated column and found a different way to solve the business problem. You could also perform the calculation in the SQL statement that ultimately fills the DataSet. That might take longer to transfer, but it won’t slow down the client app.
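For anyone who hasn’t hit this before, the client-side cost comes from DataColumn.Expression: ADO.NET evaluates the expression in-process, row by row, on whichever machine the column is populated. Here is a minimal sketch; the table and column names are made up for illustration:

```csharp
using System;
using System.Data;

public static class CalcColumnDemo
{
    // Builds a tiny DataTable with an expression column and returns the
    // computed value for a single row.
    public static decimal ComputeLineTotal(int qty, decimal unitPrice)
    {
        var table = new DataTable("Orders");
        table.Columns.Add("Qty", typeof(int));
        table.Columns.Add("UnitPrice", typeof(decimal));

        // The calculated column: ADO.NET evaluates this expression for
        // every row, in-process, wherever the DataTable is populated.
        table.Columns.Add("LineTotal", typeof(decimal), "Qty * UnitPrice");

        table.Rows.Add(qty, unitPrice);
        return (decimal)table.Rows[0]["LineTotal"];
    }

    public static void Main()
    {
        Console.WriteLine(ComputeLineTotal(3, 2.50m)); // 3 * 2.50 = 7.50
    }
}
```

The SQL-side alternative is to compute the value in the SELECT statement itself (for example, SELECT Qty * UnitPrice AS LineTotal), trading a slightly larger transfer for zero client CPU.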

Caching Images in IIS 6.0

Yahoo posted a list of rules for improving the performance of your web site, along with a new Firefox-based tool for diagnosing your site’s performance, called YSlow.

Their number one rule is to reduce the number of HTTP requests, and this only makes sense. I’ll bet most of us ASP.NET developers are well aware of output caching and how to do it in code. But what about those static files, like images and scripts? Well, there is an IIS setting for that. It’s easy to do, and the payoff can be big if you have a very graphics-intensive site. Here’s what you do for IIS 6:

  1. Open the IIS Management console
  2. Find the directory containing your images (static content only)
  3. Right click the directory, and choose Properties.
  4. Click the HTTP Headers tab.
  5. Check the Enable Content Expiration check box.
  6. Click the Expire After radio button, and choose an interval.
  7. Click the OK button. Done!

The downside is that you won’t get the payoff for the first time a user visits the site, but other pages using the same resources will be much snappier. Be aware that caching dynamically created content this way can cause some strange issues, so take care as to what you cache. As always, test it well before you release it and you will be rewarded.

Posted in IIS, Performance | Comments Off on Caching Images in IIS 6.0

Read and Learn

An awesome post by Jessica Fosler about finding memory leaks.

Posted in Performance | Comments Off on Read and Learn

List Databinding Performance With DisplayMembers and ValueMembers

My copy of MSDN magazine arrived last night, and I read the article Practical Tips For Boosting The Performance Of Windows Forms Apps. Good read. Anyway, I was shocked to find out that I have been databinding lists improperly ever since I have been using .Net. I frequently wrote my code like this:

//Bad Code
combobox.DataSource = datatable;
combobox.DisplayMember = "State";
combobox.ValueMember = "Id";

//Good Code
combobox.DisplayMember = "State";
combobox.ValueMember = "Id";
combobox.DataSource = datatable;

Apparently, order matters very much. In the first example, the combobox binds as soon as the DataSource is set, then rebinds when the DisplayMember and ValueMember are updated. In the second example, the binding only happens once.

In our current app, we have two lists that contain thousands of items that need to be bound, so we are binding them during the startup process so the user won’t wait when requesting that data. The startup time was reduced by just under 40% by changing the order of the code for binding.

Posted in .Net 2.0, Performance, Winforms | Comments Off on List Databinding Performance With DisplayMembers and ValueMembers