Oct 29, 2008

Choosing an appropriate SEO web site design company

Whenever a client approaches us about optimising their site and says “we have had xyz.com optimise it but we still don’t have placement”, I always have a look at xyz.com’s website to see whether they have optimised their own site.

Whatever their site may offer human readers (the fancy graphics, Flash animations, eye-catching design and so on), it never ceases to amaze me how many of the so-called specialist SEO companies have failed to comply with even the basic tenets of search engine optimisation when designing their own web sites.

The basic rules Retiarius suggests you use when testing an SEO web site design company are:

  • Always make sure the candidate company’s web site is itself fully W3C (World Wide Web Consortium) code compliant. Put the URL of their site, and any of their recent clients’ sites, into the W3C Validator, select ‘Show Outline’ and see what it shows.
  • Check that the CSS (Cascading Style Sheet) commands on their sites are compliant. The W3C CSS validator will do this for you.
  • Always spell-check the candidate’s site’s content. Search engines need to understand the context of the content and poor spelling negates that, so they will penalise badly spelled sites. Spell checking by a web site producer should be an integral part of the design function.
  • Ensure that the candidate’s sites use the <h1>-<h6> HTML tag structure in a manner that optimises their visibility to search engines and enhances the structure of the site so that the text is read in context. The ‘Show Outline’ option on the W3C Validator shows this, as does using the W3C Semantic Checker.
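As a rough local approximation of the validator’s ‘Show Outline’ view, a short script can extract the h1–h6 structure of a page. This is a hedged sketch using only Python’s standard library (the class and function names are our own); it is no substitute for the W3C Validator itself:

```python
from html.parser import HTMLParser

class OutlineParser(HTMLParser):
    """Collects <h1>-<h6> headings in document order."""
    def __init__(self):
        super().__init__()
        self.outline = []   # list of (level, heading-text) tuples
        self._level = None  # heading level we are currently inside
        self._buffer = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            self._level = int(tag[1])
            self._buffer = []

    def handle_data(self, data):
        if self._level is not None:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if self._level is not None and tag == f"h{self._level}":
            self.outline.append((self._level, "".join(self._buffer).strip()))
            self._level = None

def heading_outline(html):
    """Return the page's heading outline as (level, text) pairs."""
    parser = OutlineParser()
    parser.feed(html)
    return parser.outline
```

For example, `heading_outline("<h1>SEO</h1><h2>Keywords</h2>")` yields `[(1, "SEO"), (2, "Keywords")]` — a quick way to see whether a candidate’s pages have any heading structure at all.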

It is therefore important for potential SEO clients to check that the candidate SEO companies themselves have compliant and correctly optimised sites, because if they can’t be bothered, or lack the skills, to optimise their own site correctly, how can you expect them to do it for you?


Oct 13, 2008

The holistic approach to designing search engine optimised sites

We at Retiarius are firm believers that you cannot ‘bolt on’ good SEO performance once a site has been made. You have to design the site from the ground up to maximise its search engine optimisation potential.

We call this our ‘holistic’ site design philosophy. All the elements of the site have to work in a harmonious manner to maximise the SEO potential.

These elements include:

  • Excluding Shockwave and Flash animations and Flash-based navigation from the site – you won’t stand a chance of having a search engine spider your site if you use Flash navigation.
  • Choosing keywords for each individual page, rather than a generic set of keywords for the site used on each page, so that you maximise the keyword density on that page. This takes longer but really improves the SEO performance for that page.
  • Ensuring that the main keywords are reflected in the
    • page title
    • file name
    • meta description
    • first 128 words on the page
    • image titles and alt statements
    • link tags
  • Ensuring the page is spelled correctly (see article)
  • Ensuring the page fully meets W3C standards
  • Checking the site with the W3C semantic checker, as this simulates how the search engines extract data
  • Using SEO add-ons if designing a blog-based site. WordPress has some very good free SEO plugins for its blogging system.
  • Effective use of subdomains (see article)
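The per-page keyword point above can be made concrete with a simple density calculation. This is a minimal sketch under our own simplifying definition (occurrences of the phrase, times its word count, divided by total words); real SEO tools may count differently:

```python
import re

def keyword_density(text, keyword):
    """Fraction of the page's words taken up by the keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    # slide a window over the page, counting exact phrase matches
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return hits * n / len(words)
```

Running this per page, with that page’s own keywords, shows quickly whether a generic site-wide keyword set is diluting any individual page.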

We used this holistic approach on our latest site, for Buxton Press, which will use a blog on a subdomain to provide linkbait for its business and a news and job vacancies system for the company.

Effective search engine optimisation is a painstaking process but we at Retiarius believe it is worth it.


Oct 08, 2008

The benefits of using subdomains to enhance SEO

There are many advantages to using subdomains (e.g. searchengineoptimising.retiarius.com is a subdomain of the retiarius.com primary domain) to enhance the ranking of the primary domain.

Google and MSN both class a subdomain as a separate site from its primary domain (see this article).

Barry Schwartz points to a thread at Search Engine Watch Forums discussing the subject:

“Subdomains work very well at the moment. No doubt about that. I can take a prominent, old, domain, set up a brand new subdomain, add one link from the original domains front page, throw up whatever content I want and within days have plenty of traffic. These days it seems that almost all linkpop value from the original domain is transfered – and I see this happening in both MSN and Google.”

Another advantage is that subdomains are usually free to implement on commercial servers – you own the domain and can therefore create unlimited subdomains under it.
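As an illustration, on a typical Apache server a subdomain needs nothing more than a DNS record and one extra virtual host. The domain and paths below are placeholders, not taken from any real configuration:

```apache
# /etc/apache2/sites-available/blog.example.com.conf
# Assumes a DNS A (or wildcard) record already points blog.example.com at this server.
<VirtualHost *:80>
    ServerName   blog.example.com
    DocumentRoot /var/www/blog
    ErrorLog     ${APACHE_LOG_DIR}/blog-error.log
    CustomLog    ${APACHE_LOG_DIR}/blog-access.log combined
</VirtualHost>
```

Enable the site and reload Apache and the subdomain is live, at no extra hosting cost.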

I think search engines will have to continue treating a subdomain separately from its primary domain, because free blogging services such as Blogger issue each of their clients a subdomain such as yourname.blogspot.com; the engines cannot exclude subdomains from their search results when ALL the blogs on Blogger are owned by separate individuals.

By using a subdomain for a blog, and cross-linking a new primary site to it, you automatically set up some linking value for the new primary site. If the blog is used as ‘link bait’, with articles likely to interest a lot of readers, you can increase the cross-traffic to the (probably less frequently updated) primary site.

But you must not go mad producing endless referring subdomains, because Google has set a bar on the number of subdomains it will reference.

I would like to quote Vanessa Fox, an ex-Googler and contributor to Search Engine Land:

“Google is no longer treating subdomains (blog.widgets.com versus widgets.com) independently, instead attaching some association between them. The ranking algorithms have been tweaked so that pages from multiple subdomains have a much higher relevance bar to clear in order to be shown.”

So, used with care, subdomains will enhance your site at low cost, but don’t over-egg the pudding … it could backfire on your search engine ranking.


Oct 06, 2008

W3C compliance – is it important for SEO?

The W3C (World Wide Web Consortium) sets the internationally agreed standards for the languages used to construct web sites. The primary coding language still in use is HTML (HyperText Markup Language), which was invented by Tim Berners-Lee, the ‘father’ of the World Wide Web.

HTML was developed through several versions until it reached 4.01, at which point XHTML (Extensible Hypertext Markup Language) became the standard to follow.

All browsers are backwards compatible, i.e. they can read and process all the variants of HTML and XHTML up to the browser’s implementation date.

By creating an international standard for web coding, the W3C has enabled browser designers to build systems capable of reading (almost) any website.

The problems arise when certain browser producers have tried to introduce proprietary coded operations into their browsers. A prime example is Microsoft, which has consistently tried to impose its ASP (Active Server Pages) system by making Internet Explorer capable of working outside W3C standards in an attempt to monopolise the server market. Microsoft attempted the same by modifying the Java programming language with proprietary extensions until it was successfully sued by Sun Microsystems (the originators of Java). At that point Microsoft unilaterally, under the guise of a security update and without users’ permission, modified all existing copies of Internet Explorer to remove any form of Java support, immediately crippling many web sites until Sun could implement a rescue strategy.

Unfortunately for Microsoft, and very fortunately for all other web designers, the majority of web sites are hosted on Linux/Unix systems running the free Apache web server. This has effectively minimised the ‘damage’ Microsoft has done to the concept of a free and open web design language standard, as ASP can only run on Microsoft servers.

What has all this to do with search engine optimisation, you may ask?

The answer is that search engine designers are moving more and more towards implementing search criteria which mimic human search behaviour. They use semantic checkers to examine pages and try to ‘understand’ their true content. To do this they need to be able to read, unambiguously, every statement on the page.

The use of non-standard coding (we shall deal with Javascript, Java, Flash and Shockwave in a later article) makes this difficult for the search engine algorithms. So, to save processing power and time, and to encourage standards-compliant coding, if they detect non-standard or poorly written code they tend to rank your site lower, because a ‘sloppily’ coded site implies a lack of quality.

A well written, W3C-compliant site can be fully read by the search engine systems, so if two sites of equal stature are to be ranked, one compliant and one not, the compliant site will rank higher. Which would you choose if you were employing a designer?

So always check that your sites are compliant. W3C have an excellent free HTML/XHTML Validator you can use.
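The validator checks far more than nesting, but even a crude tag-balance check catches the sort of sloppy coding described above. A hedged, stdlib-only sketch (the void-element list is abbreviated, and this is nowhere near full W3C validation):

```python
from html.parser import HTMLParser

# Void elements never take a closing tag, so they stay off the stack.
VOID = {"br", "hr", "img", "input", "meta", "link", "area",
        "base", "col", "embed", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Flags end tags that don't match the most recently opened tag."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

def check_balance(html):
    """Return a list of tag-balance problems; empty means no problems found."""
    checker = TagBalanceChecker()
    checker.feed(html)
    checker.errors.extend(f"unclosed <{t}>" for t in reversed(checker.stack))
    return checker.errors
```

For instance, `check_balance("<div><p>bad</div>")` reports both the unexpected `</div>` and the unclosed `<p>` — exactly the kind of mistake the full validator would flag.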

We at Retiarius Internet Design understand the importance of this, and we produce fully W3C-compliant code for our clients as standard.


Oct 04, 2008

Google spell checking – and how it affects SEO

Our experience and research have shown that Google’s spidering and indexing systems set great store by the way a site has been constructed. Google appear to penalise sites with spelling errors and sites with non-compliant HTML/XHTML coding.

It can be advantageous to check the spelling of some words using Google’s own spell checker. This uses the same lexicon as the search engine itself, so any misspelled words or word variants that Google accepts as accurate could be used as keywords on your site.

Accurate spelling prevents your site from being penalised and increases your keyword density, because a misspelled word obviously will not be recognised.
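A rough local version of this check can be scripted by comparing a page’s words against a reference word list. The tiny lexicon below is a placeholder of our own; in practice you would load a full dictionary file:

```python
import re

# Placeholder lexicon; a real check would load a full dictionary file.
LEXICON = {"search", "engine", "optimisation", "keyword", "density",
           "site", "content", "page", "the", "a", "of", "and", "for"}

def misspelled_words(text, lexicon=LEXICON):
    """Return page words absent from the lexicon, in alphabetical order."""
    words = re.findall(r"[a-z']+", text.lower())
    return sorted({w for w in words if w not in lexicon})
```

Anything the script flags can then be double-checked against Google’s own spell checker before publishing.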


Oct 03, 2008

Retiarius Internet produce a fully search engine optimised web site for Buxton Press

Retiarius Internet Design’s latest project has just gone online. Buxton Press, a leading, environmental-award-winning printing company in Derbyshire, UK, employed Retiarius to produce a cost-effective, SEO-enabled web site that will improve its position on search engines relative to its competitors’ sites.

Retiarius were chosen because they were able to demonstrate that the sites run by Buxton Press’s competitors, though some looked good, were search engine unfriendly: their use of Flash and Shockwave includes slowed down page loading and left the search engines virtually no searchable content to find.

In this very competitive marketplace, search engine placement is critical for acquiring new ‘off the street’ customers. By ensuring that the Buxton Press site fully met World Wide Web Consortium (W3C) coding criteria, and that the text within the site was optimised for search engine semantic checkers, it is expected that Buxton Press will soon have a top Google ranking.

The new site incorporates its own on-line help system and a blog set up on its own subdomain to create an automatic link back to the parent site. This effective use of subdomains (they don’t cost anything!) enables the setting up of SEO ‘child’ sites to act as feeders to the parent site.

