Nov 11, 2008

Use and misuse of metadata statements for SEO

META tags, officially referred to as metadata elements, are found within the head section of your web pages. Meta tags are still relevant to some indexing search engines. Your site should utilise its meta tags in accordance with the W3C (World Wide Web Consortium) Metadata Specifications.
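
As a minimal sketch of where these elements live (the title and description text here are purely illustrative, not taken from any real page), the metadata sits inside the head element:

    <head>
      <title>Cost-effective web site design in Buxton, Derbyshire</title>
      <!-- The description summarises the page for search engine results -->
      <meta name="description" content="W3C-compliant, search engine optimised web site design." />
      <!-- The keywords element lists the page's main keywords -->
      <meta name="keywords" content="web design, SEO, W3C, Buxton, Derbyshire" />
    </head>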

Here is a good introductory article about META tags and metadata statements.

Search engines use algorithms to determine the most relevant content on the Internet. They crawl through websites, looking for occurrences of keywords. Websites usually have metadata tags describing their content but, due to misuse, sole reliance on metadata has diminished considerably. A search engine will also check the occurrences of keywords within the body text, along with the time the page was last updated. All these factors contribute to what is known as the website’s page rank, which in turn determines how high the website appears in the results.

Retiarius design sites that comply with the W3C standards, thus enhancing your search engine visibility.

Nov 02, 2008

SEO companies that use link farms to gain placement – a good or bad idea?

Google started the trend, but more and more search engines are following suit and using link popularity as an important part of their ranking algorithms. Many webmasters have responded by joining link farms and stuffing their sites with as many links as possible. This worked for a while regarding site placement, but now the search engine designers are getting wise to the tactic. The fact is that not all links are created equal; bad linking strategies may even get you banned from some engines.

Popularity Doesn’t Grow On Farms

A link farm consists of sites that link to other sites for the sole purpose of increasing their link popularity score. Unlike perfectly valid links to web sites with related information, link farm links point to sites totally unrelated to your business. This practice is also referred to as link stuffing.

Google’s programmers hate link farms, and their search engine’s algorithms label the links they generate as spam. In fact, Google hates them so much that some sites get removed from the index if they’re affiliated with link farms. Some webmasters are getting so paranoid that they are considering removing all outbound links from their sites – something which would negate the very essence of using the web: hyperlinks to related information.

Search Engines Take Hard Line On Link Stuffing

Search engines have become wise to the ‘link spamming’ techniques used by link farms and are now taking a hard line against them.

AllTheWeb’s spam policy is to:

disregard Link Stuffing when building the index and computing static rank, and to reduce the static rank of the documents containing it.

Google warns against “artificial linkage” (translation: link farms and link stuffing) and cautions that if you link to spam sites, its algorithm may penalise your site.

AltaVista’s spam policy warns that sites may be deleted from the index if they “contain only links to other pages.”

Inktomi’s editorial guidelines warn against “excessively cross-linking sites to inflate a site’s apparent popularity.”

Search engines won’t penalise you for good links, but they have become adept at recognising bogus ones and are quick to punish sites that try to spam them with unrelated links.

Removing all outbound links, however, is an overreaction that decreases a site’s value to visitors and hurts the Web in general, because cross-linking is a basic tenet of the Internet. Links are fine – even encouraged – if they are related to your topic, but link farms rarely provide useful content to visitors.

If your site is devoted to your favourite pop band and you include links to the band members’ personal sites, other fan sites, and links to stores that sell the band’s music, that’s not a link farm. You’re providing access to other sites that will probably interest your visitors.

But if you signed up with a so-called SEO service that promises to generate two hundred inbound links to your site only if you agree to add two hundred outbound links in return, then you’ve created a link farm. Instead of linking to related information of value to your visitors, you’re sending them to sites about herbal supplements, children’s clothes, pet rats, and other totally unrelated (possibly X-rated) topics.

Use Effective Linking Strategies

We can sum up a good linking strategy in a single sentence: “Link to related sites that offer content, products, and services that will help and interest your visitors.”

It’s really easy to do:

  • Stay with your theme. Keep your links tightly focused on your site’s topic and you’ll avoid diluting the theme of your site. A coherent theme increases your search engine rank and helps visitors find the information they need.
  • Hang out with the popular crowd. Search engines don’t just count the number of links; they evaluate the relative importance of inbound and outbound links. That’s why link farms don’t help you! It’s far better to have 100 good inbound links from popular pages than 1000 links from a link farm stuffed with spam sites.
  • Avoid automated link generators. Automated programs that email other webmasters offering to trade links are rarely effective. A good linking strategy has to be personalised, so messages that just say “Hi! Want to trade links?” seldom pay off. If you want to trade links with a site, send the webmaster a personalised email that briefly outlines why you like the site and how a link to your site will enhance it.
  • Put links in context. Links that are included in page content and contain the page’s keywords are more valuable than links exiled to a site’s “Related Links” page. Content links help search engines rank the importance of a link and put it in the proper context. Here are two different ways to link to the same site: Retiarius.com, or cost-effective web site designer in Buxton, Derbyshire. The second link describes the company and contains the company’s important keywords (see the markup sketch after this list). Try to identify a particular page where a link to your site would fit right into the page’s content, and note the exact location when you ask for a link. That shows the webmaster that you actually took the time to look at and evaluate the site.
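
As a sketch of the markup behind those two alternatives (using retiarius.com purely as the example target):

    <!-- A bare link: the anchor text tells the search engine nothing about the target -->
    <a href="http://retiarius.com/">Retiarius.com</a>

    <!-- A contextual link: the anchor text carries the target's keywords -->
    <a href="http://retiarius.com/">cost-effective web site designer in Buxton, Derbyshire</a>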

A good linking strategy will help turn your Web site into an authority on a particular topic. Visitors will tend to come to you for information and search engine spiders will gobble up your content. You won’t have to spend time chasing after good links; they’ll come looking for you.

Worried about other search engine promotion techniques that might hurt more than they help? Retiarius Internet Design can help you sort through various search engine do’s and don’ts.  We provide a cost-effective, personalised, friendly service to help boost your site to the top of the rankings.

Oct 29, 2008

Choosing an appropriate SEO web site design company

Whenever a client approaches us about optimising their site and says “we have had xyz.com optimise it but we still don’t have placement”, I always have a look at xyz.com’s own website to see if they have optimised it.

Despite the fancy graphics, Flash animations and eye-catching design aimed at human readers, it never ceases to amaze me how many so-called specialist SEO companies fail to comply with even the basic tenets of search engine optimisation when designing their own web sites.

The basic rules Retiarius suggests you use to test a candidate SEO web site design company are:

  • Always make sure the candidate company’s web site is itself fully W3C (World Wide Web Consortium) code compliant. Put the URL of their site, and any of their recent clients’ sites, into the W3C Validator, select ‘Show Outline’ and see what it shows.
  • Check that the CSS (Cascading Style Sheet) commands on their sites are compliant. The W3C CSS validator will do this for you.
  • Always spell-check the candidate’s site’s content. Search engines need to understand the context of the content and poor spelling negates that, so they will penalise badly spelled sites. Spell checking by a web site producer should be an integral part of the design function.
  • Ensure that the candidate’s sites use the <h1>-<h6> HTML tag structure in a manner that optimises their visibility to search engines and enhances the structure of the site so that the text is read in context (see the sketch after this list). The ‘Show Outline’ option on the W3C Validator shows this, as does using the W3C Semantic Checker.
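
As a minimal sketch, assuming purely illustrative heading text, this is the kind of well-nested outline the ‘Show Outline’ option should reveal:

    <h1>Web Site Design in Buxton</h1>        <!-- a single h1 states the page's main topic -->
    <h2>Search Engine Optimisation</h2>       <!-- a major section of the page -->
    <h3>W3C Compliant Coding</h3>             <!-- a subsection nested under the h2 above -->
    <h2>Contact Us</h2>                       <!-- another major section -->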

It is therefore important for potential SEO clients to check that the candidate SEO companies themselves have compliant and correctly optimised sites, because if they can’t be bothered, or don’t have the skills, to optimise their own site correctly, how can you expect them to do it for you?

Oct 13, 2008

The holistic approach to designing search engine optimised sites

We at Retiarius are firm believers that you cannot ‘bolt on’ good SEO performance once a site has been made. You have to design the site from the ground up to maximise its search engine optimisation potential.

We call this our ‘holistic’ site design philosophy. All the elements of the site have to work in a harmonious manner to maximise the SEO potential.

These elements include:

  • Excluding Shockwave and Flash animations and Flash-based navigation from the site – you won’t stand a chance of having a search engine spider your site if you use Flash navigation.
  • Choosing keywords for each individual page, rather than a generic set of keywords for the site used on each page, so that you maximise the keyword density on that page. This takes longer but really improves the SEO performance for that page.
  • Ensuring that the main keywords are reflected in the following (a markup sketch appears after this list):
    • page title
    • file name
    • meta description
    • first 128 words on the page
    • image titles and alt statements
    • link tags
  • Ensuring the page is spelled correctly (see article)
  • Ensuring the page fully meets W3C standards
  • Ensuring the site is readable by the W3C semantic checker, as this simulates how the search engines extract data.
  • Using SEO add-ons if designing a blog-based site. WordPress has some very good free SEO plugins for its blogging system.
  • Effective use of subdomains (see article)
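
As a minimal sketch of those keyword placements, with an invented keyword phrase (‘commercial printing’) and invented file name purely for illustration:

    <!-- invented file name: commercial-printing.html -->
    <head>
      <title>Commercial Printing in Buxton | Buxton Press</title>
      <meta name="description" content="Award-winning commercial printing from Buxton Press." />
    </head>
    <body>
      <!-- the keyword phrase appears in the heading and within the first 128 words -->
      <h1>Commercial Printing</h1>
      <p>Buxton Press provides award-winning commercial printing for publishers...</p>
      <!-- ...and in the image title and alt statement -->
      <img src="press-hall.jpg" alt="Commercial printing press hall" title="Commercial printing at Buxton Press" />
      <!-- ...and in the link (anchor) tag -->
      <a href="quote.html" title="Request a commercial printing quote">Request a quote</a>
    </body>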

We used this holistic approach for our latest site for Buxton Press, which uses a blog on a subdomain to provide link bait for the business and a news and job vacancies system for the company.

Effective search engine optimisation is a painstaking process but we at Retiarius believe it is worth it.

Oct 08, 2008

The benefits of using subdomains to enhance SEO

There are many advantages in using subdomains (e.g. searchengineoptimising.retiarius.com is a subdomain of the retiarius.com primary domain) to enhance the ranking of the primary domain.

Google and MSN both class a subdomain as a separate site from its primary domain (see this article).

Barry Schwartz points to a thread at Search Engine Watch Forums discussing the subject:

“Subdomains work very well at the moment. No doubt about that. I can take a prominent, old, domain, set up a brand new subdomain, add one link from the original domains front page, throw up whatever content I want and within days have plenty of traffic. These days it seems that almost all linkpop value from the original domain is transfered – and I see this happening in both MSN and Google.”

Another advantage is that subdomains are usually free to implement on commercial servers – you own the domain and can therefore apply unlimited subdomains to it.

I think this separation of a subdomain from its primary domain by search engines will have to continue, because free blogging services such as Blogger issue each of their clients a subdomain such as yourname.blogspot.com. The engines cannot simply exclude subdomains from their search results, because ALL the blogs on Blogger are owned by separate individuals.

By using a subdomain for a blog and cross-linking a new primary site to it, you automatically set up some linking value for the new primary site. If the blog is used as ‘link bait’, with articles likely to interest a lot of readers, then you can increase the cross-traffic to the (probably less frequently updated) primary site.
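
A minimal sketch of such a cross-link, reusing the subdomain example from above (the anchor and title text are illustrative):

    <!-- placed on every page of the blog at searchengineoptimising.retiarius.com -->
    <a href="http://retiarius.com/" title="Cost-effective web site design in Buxton, Derbyshire">
      Retiarius Internet Design
    </a>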

But you must not go mad producing endless referring subdomains, because Google has set a bar on the number of subdomains it will reference.

I would like to quote Vanessa Fox, an ex-Googler and contributor to Search Engine Land:

“Google is no longer treating subdomains (blog.widgets.com versus widgets.com) independently, instead attaching some association between them. The ranking algorithms have been tweaked so that pages from multiple subdomains have a much higher relevance bar to clear in order to be shown.”

So, used with care, subdomains will enhance your site at low cost, but don’t over-egg the pudding … it could backfire on your search engine ranking.

Oct 06, 2008

W3C compliance – is it important for SEO?

The W3C (World Wide Web Consortium) sets the internationally agreed standards for the languages used to construct web sites. The primary coding language still in use is HTML (HyperText Markup Language), which was invented by Tim Berners-Lee, the ‘father’ of the World Wide Web.

HTML was developed through several versions until it reached 4.01, at which point XHTML (Extensible Hypertext Markup Language) became the standard to follow.

All browsers are backwards compatible, i.e. they can read and process all the variants of HTML and XHTML up to the browser’s implementation date.
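
A browser, or a validator, tells which variant a page follows from its document type declaration. As a minimal sketch, an XHTML 1.0 Strict page begins like this:

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
      <head><title>Example page</title></head>
      <body><p>The declaration above tells browsers and validators which standard to apply.</p></body>
    </html>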

By creating an International standard for web coding the W3C have enabled browser designers to build systems capable of (almost) reading any website.

The problems arise when certain browser producers try to introduce proprietary coded operations into their browsers. A prime example has been Microsoft, which has consistently tried to impose its ASP (Active Server Pages) system by making Internet Explorer capable of working outside W3C standards, in an attempt to monopolise the server market. Microsoft attempted the same by modifying the Java programming language with proprietary code until it was successfully sued by Sun Microsystems (the originators of Java), at which point Microsoft unilaterally, under the guise of a security update and without users’ permission, modified all existing copies of Internet Explorer to remove any form of Java support, immediately crippling many web sites until Sun could implement a rescue strategy.

Unfortunately for Microsoft, and very fortunately for all other web designers, the majority of web sites run on Linux/Unix machines with the free Apache web server. This has effectively minimised the ‘damage’ Microsoft has done to the concept of a free and open web design language standard, as ASP can only run on Microsoft servers.

What has all this to do with search engine optimisation, you may ask?

The answer is that search engine designers are moving more and more towards implementing search criteria which mimic human search behaviour. They use semantic checkers to look within pages and try to ‘understand’ the true content of each page. To do this they need to be able to read, unambiguously, every statement on the page.

The use of non-standard coding (we shall deal with the use of JavaScript, Java, Flash and Shockwave in a later article) makes this difficult for the search engine algorithms, so, to save processing power and time and to encourage standards-compliant coding, if they detect non-standard or poorly written code they tend to rank your site lower, because a ‘sloppily’ coded site implies a lack of quality.

A well written, W3C-compliant site can be fully read by the search engine systems, and as such, if two sites of equal stature are to be ranked, one compliant and one not, the compliant site will rank higher. Which would you choose if you were employing a designer?

So always check that your sites are compliant. The W3C has an excellent free HTML/XHTML Validator you can use.

We at Retiarius Internet Design understand the importance of this and produce fully W3C-compliant code for our clients as standard.

Oct 04, 2008

Google spell checking – and how it affects SEO

Our experience and research have shown that Google’s spidering and indexing systems place great store on the way a site has been constructed. Google appears to penalise sites with spelling errors and sites with non-compliant HTML/XHTML coding.

It can be advantageous to check the spelling of some words using Google’s own spell checker. This uses the same lexicon as the search engine, so any misspelled words or word variants that Google thinks are accurate could be used as keywords in your site.

Accurate spelling prevents your site from being penalised and increases your keyword density, because a misspelled word obviously will not be recognised.

Oct 03, 2008

Retiarius Internet produce a fully search engine optimised web site for Buxton Press

Retiarius Internet Design’s latest project has just gone online. Buxton Press, a leading environmental-award-winning printing company based in Derbyshire, UK, employed Retiarius to produce a cost-effective, SEO-enabled web site that will promote its position on search engines relative to its competitors’ sites.

Retiarius were chosen because they were able to demonstrate that the sites run by Buxton Press’s competitors, though some looked good, were search engine unfriendly and had virtually no searchable content, due to the use of Flash and Shockwave includes that slowed page loading and hid the content from the search engines.

In this very competitive marketplace, search engine placement is critical for acquiring new ‘off the street’ customers. Because the Buxton Press site fully meets World Wide Web Consortium (W3C) coding criteria, and its text is optimised for search engine semantic checkers, it is expected that Buxton Press will soon have a top Google ranking.

The new site incorporates its own on-line help system and a blog set on its own subdomain to automatically create a link back to the parent site.  This effective use of subdomains (they don’t cost anything!) enables the setting up of SEO ‘child’ sites to act as feeder sites to the parent site.

Sep 03, 2008

Welcome to search engine optimisation

Welcome to this blog on search engine optimisation techniques. Retiarius Internet Design has been designing web sites since 2001. Owned by Andrew Jeffcock, a former UK Government research scientist, explosives engineer (!) and computer programmer, Retiarius develops sites for small to medium businesses and individual clients, specialising in gaining enhanced search engine placement by combining World Wide Web Consortium (W3C) compliant coding with Google-friendly SEO techniques.

Based in Buxton, Derbyshire, Retiarius provides a personal and rapid-response service for its clients.

We intend this blog to provide an insight into some of the SEO techniques we employ.
