Basic SEO tips for ASP.NET websites


SEO has become one of the most important considerations for any website, and as ASP.NET grows, SEO has become a hot topic among ASP.NET developers.
The world of Search Engine Optimization (SEO) changes continuously. For example, can you imagine that last year Google published 450 updates to its search engine algorithms?
Even though SEO trends keep changing, some key points have remained fundamental and unchanged so far.
Here are a few points to keep in mind when implementing proper SEO in your ASP.NET site:

1. Page titles

One of the most important things is the page title (the text between the <title> tags in the page head). When somebody searches, these titles are shown as links in the search results. A common mistake is using the same title for all pages. Imagine how worthless the following search results are for the user.

Hello World
Some fragments from page that may not be useful for user.

Hello World
Some other fragments from page as description.

Hello World
Almost no content, let’s show what we have. Date Modified 01/19/2008 Titles by Marc and Lewis Custom services.

Hello World
Some fragments from page that may not be useful for user.

Hello World
Some fragments from page that may not be useful for user.

By using page titles carefully you give searchers a reason to visit your site, if your site offers something they are looking for. They can easily see from the results whether a page may offer something they are interested in.

Adding titles to your pages is not complex. If you have built a CMS, then you already have a title stored for each page. If you have built a product catalog, then use the product names as page titles. If you think further, you will discover that your databases already contain even more data you can use in page titles.
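In ASP.NET you can set the title from the code-behind through the Page.Title property, which is rendered between the <title> tags of the page. A minimal sketch (the query parameter and the GetProductName helper are hypothetical):

```csharp
// Code-behind of a hypothetical product page.
// Requires a <head runat="server"> in the page or master page.
protected void Page_Load(object sender, EventArgs e)
{
    // GetProductName is an illustrative helper that looks the
    // product name up from the database by its id.
    string productName = GetProductName(Request.QueryString["id"]);

    // Rendered into the <title> tag of the page head.
    Page.Title = productName + " - Example Shop";
}
```

This way every product page gets a unique, descriptive title with no extra editorial work.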

2. Use meaningful URLs

“Nice” URLs are the other important point. Instead of using long URLs that contain many query parameters, you should use URLs that are formatted like the URLs of static pages. Some experts say that this alone is enough; I don’t agree. Look at the following two example URLs.

http://example.com/catalog.aspx?cid=4&content=100&session=73
http://example.com/content/100

The first one looks like some alien language. The other one is better on one point: it is shown correctly by almost every mail client in the world (except the buggy ones, of course). But for the visitor this URL still has little value. The visitor may get some idea that he or she is looking at content, and then there is a hundred. What does this hundred stand for? Who knows.

Now let’s go one step further and use meaningful URLs instead of merely nice ones. A meaningful URL is a nice URL that also carries some meaning for the visitor: if I give you a meaningful URL, you can figure out what you will find on the page behind it. Here is a meaningful version of the previous URL.

http://example.com/products/hp-laserjet-1200

Now you have a URL that tells you what you will find when you follow it. In this case it hopefully opens a page that introduces the HP LaserJet 1200.

You can use components like UrlRewritingNet.UrlRewrite for nice URLs. Also note that IIS 7 has URL rewriting support that provides mod_rewrite-like features (mod_rewrite is a very popular Apache HTTP Server module).
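With the IIS 7 URL Rewrite module, for example, a rule in web.config can map a nice product URL onto the real .aspx page. A sketch (the pattern, page name and query parameter are illustrative):

```xml
<!-- web.config fragment; assumes the IIS 7 URL Rewrite module is installed. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Nice product URLs">
        <!-- /products/hp-laserjet-1200 is rewritten internally -->
        <match url="^products/([a-z0-9-]+)$" />
        <action type="Rewrite" url="Product.aspx?name={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

The rewrite happens on the server, so visitors and robots only ever see the meaningful URL.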

3. Tell about the structure of your content

Every page has some structure: a title, maybe a teaser, paragraphs, headings and so on. Maybe some citations and quotes, and why not some important points you want to emphasize. Most robots are not very strong at analyzing CSS. To make your page semantically correct, follow these steps.

  • Use headings to divide longer stories into parts that make sense to readers. You can use the
    <h1>…</h1> to <h6>…</h6> tags, for example. These tags are made for this.
  • If you need to emphasize a sentence of your text, put it between <strong>…</strong> or <em>…</em> tags.
  • Use <cite> for citations, <blockquote> for quotes and so on. Let robots know what the parts of your pages contain.
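Put together, a correctly structured fragment might look like this (the content is made up):

```html
<h1>HP LaserJet 1200 review</h1>
<p>This printer is <strong>cheap to run</strong> and <em>quiet</em>.</p>
<blockquote cite="http://example.com/review">
  <p>The best budget laser printer we have tested this year.</p>
</blockquote>
<p>Source: <cite>Example Printer Magazine</cite></p>
```

Robots can read the heading, the emphasized phrases and the quote without interpreting any CSS at all.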

This way you also make your pages easier for your visitors to read. I am sure they will be happy if you provide correctly structured content that is easy to read.

You can also make your users’ lives easy by using WYSIWYG editors like FCKeditor, so your users can create structured content by themselves. FCKeditor also supports ASP.NET and offers a free ASP.NET control you can use on your pages. Integrating these editors into your site is not complex.

4. Test your site under extreme conditions

What happens when your site is slow and requests often time out? Well, some of these requests are made by robots, and if they repeatedly cannot connect to your site, they will drop it from their indexes. Make sure your site responds acceptably fast to requests, even under heavy load.

As visitors, we don’t like slow sites. We want pages to open fast, and while a page is opening, every second feels like a decade.

You can use tools to run automatic stress tests against your site. Just record your paths, configure the load, and run the tests. If you also watch performance counters, you can find almost all of the weak parts of your site before it goes to production.

5. Test your AJAX site in terms of SEO

When you use AJAX on your site, you are usually in trouble: search engine spiders will see only a few parts of your site, because they don’t run JavaScript. Yes, they can analyze it, but they don’t run it. So most of the content of AJAX-intensive sites is invisible to robots and will never get indexed.

To make your AJAX site search engine friendly, avoid loading the initial content over JavaScript, at least for those parts of your page that you want indexed. You also have to make it easy for robots to navigate through your site.

If you want to see how robots see your AJAX site, use a simple trick: turn off JavaScript support in your browser and visit your AJAX site. Now you will see what really gets indexed when robots crawl your site.
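One way to keep the indexable parts visible is to render the initial content on the server as plain HTML links and let JavaScript enhance the page afterwards. A sketch (the element id and URLs are illustrative):

```html
<!-- Rendered by the server, so robots and no-JavaScript browsers see it. -->
<div id="article-list">
  <a href="/articles/nice-urls">Use meaningful URLs</a>
  <a href="/articles/page-titles">Write good page titles</a>
</div>
<script>
  // JavaScript may later refresh or extend the list over AJAX;
  // the plain links above keep working without it.
</script>
```

The page behaves the same for AJAX users, but robots get real, crawlable links to follow.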

6. Clean up your source code

It may be hard to believe, but it is still true: clean up your source code and try to minimize it as much as possible. Follow these rules:

  • don’t use inline CSS; use external stylesheets whenever possible
  • don’t use inline JavaScript; use external .js files instead
  • don’t leave HTML comments in
  • don’t use massive line breaking (twenty lines containing only a line break, or similar)
  • don’t use viewstate when it is not necessary
  • don’t use a <form runat="server"> when it is not necessary (it comes with hidden fields)

The better the ratio between the content (the text) and the (HTML/CSS/JavaScript) code, the better your ranking will be. The smaller the source code, the better this ratio will be.
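Viewstate, for instance, can be switched off per page or per control in the ASPX markup. A sketch (the control name is illustrative):

```aspx
<%@ Page Language="C#" EnableViewState="false" %>

<%-- Or per control: a label that is never posted back
     does not need viewstate at all. --%>
<asp:Label ID="FooterLabel" runat="server" EnableViewState="false"
           Text="Example Shop" />
```

Disabling viewstate where it is not needed shrinks the hidden __VIEWSTATE field and thus the page source that robots have to wade through.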

7. Make your site crawlable

Robots can only index content they can actually reach, so:

  • don’t use Flash/Silverlight to show information
  • don’t use Flash/Silverlight for menus
  • don’t use JavaScript-based menus
  • don’t use button-based menus
  • don’t use intro pages


8. Validate your source code

Use the wonderful W3C HTML validator at validator.w3.org to validate your site. If your markup is valid, there will be no punishment for a “bad technical solution”: use the validator to find the problems, get rid of them, and earn a better ranking instead.

9. Get to know your users

I don’t know about you, but I build websites for users in the first place. So my intention is to learn how I can attract more interest to my site. The first step is to find my audience: the better my ranking is, the more visitors I will have. So I use keyword tools to find the good keywords and keyword combinations to optimize for (because those are what potential visitors will search with):

Google’s external Keyword Tool:
Find the good keywords, look at your competitors, and start optimizing.

10. Keyword density

Keyword density is very important for a site’s ranking on a specific keyword. Imagine you have a site about “free source code”: you should definitely try to mention this keyword combination as often as you can, as long as it makes sense, of course. Try to reach a keyword density greater than 3.5%, that is, mention “free source code” about 3.5 times per 100 words.
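The calculation itself is simple: count how many times the phrase occurs per 100 words of text. A quick C# sketch (the sample text is made up):

```csharp
using System;
using System.Text.RegularExpressions;

class KeywordDensity
{
    static void Main()
    {
        string text = "Our free source code archive offers free source " +
                      "code samples for download.";
        string keyword = "free source code";

        // \w+ counts the words in the text (12 here).
        int wordCount = Regex.Matches(text, @"\w+").Count;

        // Case-insensitive count of the whole phrase (2 here).
        int occurrences = Regex.Matches(text.ToLowerInvariant(),
                                        Regex.Escape(keyword)).Count;

        // Occurrences of the phrase per 100 words.
        double density = 100.0 * occurrences / wordCount;
        Console.WriteLine("Keyword density: {0:0.0}", density);
    }
}
```

Run it over your real page text to check whether you are anywhere near the target density before tweaking the copy.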