Why my SEO campaign failed? Part 2: Common Web Development mistakes

In a previous article we focused on the common On-page Optimization mistakes that can affect an SEO campaign. In this blog post we will focus on some technical principles that web developers should keep in mind when they develop SEO-friendly static or dynamic websites.

Here are some common mistakes that can put your SEO campaign at risk:

1) Pass 1 million parameters in the URL

When a website is dynamic, web developers need a way to identify which page, product or category the visitor wants to see. Usually an id is enough to retrieve the data from the database. In other cases, due to the complexity of the project or due to bad programming techniques, more variables are needed to identify a particular page. Here is an example of such a dynamic URL (hypothetical, for illustration):

www.example.com/index.php?page=product&category=5&id=123&sort=price&lang=en

Unfortunately such URLs are not considered friendly for either users or search engines:

“If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.” – Google Webmaster guidelines

So it is highly recommended to use URL-rewriting software such as mod_rewrite to convert your dynamic URLs into search-engine-friendly ones. What’s the risk if you don’t? If your URLs carry too many parameters, the search engines might not index those particular pages. Additionally, if none of those URLs contain important keywords in the path, you might rank lower than similar pages with optimized paths.

Finally, keep in mind that sites with such a link structure usually face more duplicate-content issues.
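A minimal mod_rewrite sketch for Apache, assuming a hypothetical product.php script that expects an id parameter (the file names and URL pattern are illustrative, not a definitive implementation):

```apache
# Requires Apache with mod_rewrite enabled (e.g. in an .htaccess file).
RewriteEngine On

# Map the friendly URL /product/123/blue-widget to the real dynamic page.
# The keyword-rich slug is there for users and search engines; only the
# numeric id is actually passed to the script.
RewriteRule ^product/([0-9]+)/([a-z0-9-]+)/?$ product.php?id=$1 [L,QSA]
```

With a rule like this you can link to www.example.com/product/123/blue-widget everywhere on the site, while the server still executes the same dynamic script behind the scenes.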

2) Use tons of JavaScript, Frames/iframes, AJAX, Flash and Silverlight

All the above are useful tools and some of them enhance the user experience. But none of them are search engine friendly.

“Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.” – Google Webmaster guidelines

As you can see, even though the official Google blog has informed us from time to time that they do execute JavaScript, that they try to index AJAX, that they use forms to find more quality content on a website, and that they are doing their best to improve Flash indexing, their guidelines still strongly recommend that we check our sites with a simple text browser like Lynx if we want them to be search engine friendly.

Thus, make sure that you use the above technologies only in parts of the page that you do not need indexed.
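One safe pattern is progressive enhancement: keep the important content in plain HTML and let JavaScript only enhance it. A minimal sketch (the element id and class are hypothetical):

```html
<!-- The crawlable content lives in plain HTML, visible to text browsers
     and search engine spiders alike. -->
<div id="product-description">
  <h2>Blue Widget</h2>
  <p>A durable blue widget, available in three sizes.</p>
</div>

<script type="text/javascript">
  // JavaScript only decorates what is already there (e.g. turning the
  // block into a tabbed view); spiders that skip scripts still see the
  // text above.
  document.getElementById('product-description').className = 'enhanced';
</script>
```

If the scripted version fails or is ignored, nothing important is lost.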

3) Don’t use image alts & don’t optimize the image paths

Search engines, and computers in general, are not very good at identifying what is represented in an image. So in order to understand what a picture is all about, they analyze its nearby text, its filename and its alt attribute.

Google Image Search can bring you a significant amount of traffic. If you don’t support mechanisms that permit image optimization, you might lose a good source of traffic. So make sure that you use both the alts and the image paths correctly, and build a CMS that permits their optimization.
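In practice this means markup like the following, with a descriptive filename and alt text instead of a camera default such as DSC00123.jpg with an empty alt (the paths here are hypothetical):

```html
<!-- Descriptive path and alt text give the search engine something to
     index; width/height also help the page render without reflows. -->
<img src="/images/blue-widget-large.jpg"
     alt="Blue widget, large size"
     width="300" height="200" />
```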

Matt Cutts explains the importance of these in his videos, so make sure you watch them.



4) Use incorrect methods to support different languages

When you build a multilanguage site, make sure you use the correct architecture. There is no single accepted way to do it; mainly there are 3 correct ways to support a multilanguage site, and each of the following methods has some pros and cons:

i) Use sub domains. Example: fr.example.com, gr.example.com, etc

ii) Use subfolders. Example: www.example.com/fr/, www.example.com/gr/, etc

iii) Use different TLD domains. Example: www.example.fr, www.example.gr, etc

Our tip is to avoid serving content based on IP without making the same version of the site available to everyone. Also, do not implement bad solutions such as passing the language as an HTTP GET variable, for example: www.example.com/?lang=fr

5) Don’t care about the loading speed

Recently Google informed us that they use website speed as one of the signals in their ranking algorithms. There is a debate on whether this signal should be included, since it has nothing to do with the relevance of a website. Nevertheless, speed improves the user experience. Additionally, a light website that loads fast can improve your ROI and lower your server and bandwidth needs, leading to reduced costs.

Google suggests a bunch of useful tools that can help you speed up your site by reducing the size of your CSS, JS and HTML files. Make sure you check them and use them during development.

A note for the fellow Software Geeks:

As you know, loading time is affected both by the server and by the code that is executed. Thus make sure that you optimize your code before you launch a site. This means that you need to close the DB connections whenever you don’t need them, reduce the web requests, and optimize your code and your database. Finally, make sure you fine-tune the web server that hosts your website.
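On the server-tuning side, two common Apache tweaks are compressing text resources and letting browsers cache static files. A sketch, assuming mod_deflate and mod_expires are enabled (the cache lifetimes are illustrative):

```apache
# Compress text resources before sending them over the wire.
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Let browsers cache static files so repeat visits skip the download.
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css   "access plus 1 week"
```

Combined with minified CSS/JS and fewer web requests, changes like these can noticeably cut page load times.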

8 Tips for web development

Here are some tips that can help you during web development:

    i) Place menus, headers and footers in separate files in order to make fast global changes (use include() in PHP or Master Pages in ASP.net)

    ii) Try using subdomains to support multiple languages. We will cover this topic in a different article.

    iii) Use absolute paths for every page, image, CSS or JS file that you have on the site.

    iv) Use canonicals to avoid duplicates.

    v) Use webseoanalyticsLinker.zip to avoid the PageRank losses caused by the use of nofollow. For more info read “PageRank Sculpting: You can still use rel=nofollow”.

    vi) Use a 301 redirection instead of a 302 when the path of a page has changed, and never delete pages.

    vii) Make sure you redirect the non www to www version or vice versa.

    viii) Don’t add session Ids in the URLs.
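Tips vi and vii above can be sketched in an Apache .htaccess file as follows (the domain and paths are hypothetical):

```apache
RewriteEngine On

# vii) Permanently redirect the non-www host to the www version.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# vi) A page whose path has changed: 301 the old path to the new one
# instead of deleting the page or using a temporary 302.
RewriteRule ^old-page\.html$ /new-page.html [R=301,L]
```

For tip iv, the canonical is a link element placed in the head of each page, e.g. `<link rel="canonical" href="http://www.example.com/product/123/" />`, telling search engines which URL is the preferred version of the content.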

That was the 2nd part of the “Why my SEO campaign failed?” guide. Feel free to add your comments and propose your ideas on the subject. The best suggestions will be added to the list.

Stay tuned, because in the next article we will focus on the Common Link Structure mistakes.

Image by jeannab

