Aug 22, 2013

Rags for Gypsies, a new SEO-optimised business website


Rags for Gypsies is a new site developed by Retiarius Internet Design to take advantage of the growth in the use of social media such as Twitter and Facebook to drive traffic to a business.

Rags for Gypsies utilises the power of the WordPress platform to provide an easy-to-use and easy-to-update website that the site owners can maintain and expand themselves.

Their website is linked to both Twitter (@ragsforgypsies) and Facebook (http://www.facebook.com/rags4gypsiesdesigns), and also to their marketing account on Etsy.com (http://www.etsy.com/shop/RagsForGypsies), to provide comprehensive online exposure for the unique designs produced by Poppy Fields.

Retiarius set up the client’s Etsy account to automatically announce new designs simultaneously on both their Facebook and Twitter accounts, and any new posts on the main site are also automatically ‘Tweeted’.

This site shows what can be done to integrate social media sites into a search engine optimised WordPress site.


Jun 13, 2012

LeaCFD like their site revamp and search engine optimisation

LeaCFD.com, the website of the UK’s leading independent computational fluid dynamics specialist, Dr. Chris Lea, was recently updated and brought up to current W3C and SEO standards by Retiarius.

Dr. Lea has reported that, within days, his site was listed on page 1 of Google for the terms “expert witness CFD” and “CFD expert witness” – the terms that he requested we optimise his site for.

Retiarius took his existing site, written in 2004 in HTML 4.0, revamped it to use fully compliant XHTML, and used our several years of SEO experience to tailor the site to reflect best SEO practice.

Gaining a page 1 Google ranking for the target search terms within a few days demonstrates that taking care over code compliance and the correct use of meta statements, <h1>-<h6> tags etc. can make a significant difference to your Google ranking.

Jun 25, 2011

Flash! and their site was gone!

Further to my earlier post on not using frames, a company of my acquaintance had a nicely search engine optimised site.

It then seems that someone offered to update their site with a new, experimental, ‘jazzy and upbeat’ look. As this was free of charge they accepted, and the new ‘designer’ immediately dumped the existing site and replaced it with their new front-end, which was basically a ‘new site coming soon’ page and a contact page. They then did not complete the new site but left it unfinished for 4 months.

To compound this mistake, the new ‘designer’ used Flash/Shockwave to build the entire new site and provided no text that a search engine could find.

The result: it is now almost impossible to find this company on Google even if you know the company’s name! Total wipe-out!

I have lost count of how many times companies are taken in by graphics artists proclaiming to be web designers when the artist in question has not a clue as to how web search engines work.

Established, well-known companies can use Flash and Shockwave because they are household names and can afford the best programmers and designers, who know how to optimise even a Flash site for SEO. These companies are found by their trading names – Rolls-Royce, Amazon, Ford etc. – and not by searching for some product such as ‘luxury car’.

Unfortunately smaller companies need effective search engine optimised sites so they can be found ahead of their competitors for their keywords.

Using Flash and Shockwave for site design provides a desert for search engine bots, and I strongly recommend you do not use them for site design unless you are a household name.

And as a postscript, Apple iPhones and iPads do not support Flash-driven sites – so why alienate a significant section of the market?


Written by Retiarius in: Web site design techniques
Jul 19, 2010

The Rose and Crown at Allgreave

Retiarius Internet Design have just been commissioned to produce a search engine optimised site for the Rose and Crown public house at Allgreave, on the edge of the Peak District.

The original site for the pub had been cleared of content by the previous landlord when he left the business, and the original domain name (http://www.theroseandcrown.net) was offered for sale at what I believed to be a very reasonable price for such an applicable, already Google page 1 ranked domain with several years of registration behind it.

On our advice, the new owners were fortunate, via a third party, to quickly purchase this existing domain name from its previous owner, and they are now having Retiarius produce a brand new site to take advantage of Google’s preference for domain names of some age.

As speed was of the essence to prevent Google losing, or reducing, the domain name’s position in its rankings following its closure by the previous owner, a temporary, fully optimised index page was put in place on the same day we received the commission.

This page will be a placeholder whilst the final design is discussed, agreed and implemented.

It is important in SEO to try to weight the chances of getting a good ranking in your client’s favour. The purchase of an existing, applicable domain name which already has a decent ranking can significantly accelerate the SEO process.

Retiarius Internet Design’s experience in such matters can be a valuable resource for its clients.

May 28, 2009

Understanding NoFollow tags and what they can do to improve PageRank

Google announced in early 2005 that hyperlinks with the rel="nofollow" attribute would not influence the link target’s PageRank. In addition, the Yahoo and Windows Live search engines also respect this attribute.

How the attribute is being interpreted differs between the search engines. While some take it literally and do not follow the link to the page being linked to, others still “follow” the link to find new web pages for indexing. In the latter case rel="nofollow" actually tells a search engine “Don’t score this link” rather than “Don’t follow this link.” This differs from the meaning of nofollow as used within a robots meta tag, which does tell a search engine: “Do not follow any of the hyperlinks in the body of this document.”

The effective use of the rel="nofollow" attribute can improve selected pages’ Google PageRank by preventing the leakage of ‘link juice’ to non-essential (for site PageRank purposes) pages such as Terms and Conditions and Privacy Policy pages, which do not need to be indexed by search engines but which can, if "nofollow" is not used, dilute the optimum PageRank for your essential pages such as product descriptions.
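As a simple sketch of the two forms described above (the page names here are invented for illustration), the attribute is applied per link, while the robots meta tag applies to a whole page:

```html
<!-- Link-level: this link passes no PageRank to the target page -->
<a href="/privacy-policy/" rel="nofollow">Privacy Policy</a>

<!-- Page-level: tells a search engine not to follow any link in this document -->
<meta name="robots" content="nofollow" />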

Learning to implement “nofollow” tags is fairly easy. Learning how to apply them in the proper way does require some skill.

The use of “nofollow” tags can serve many different purposes. They can be used to limit the amount of ‘link juice’ that flows out of a page to external pages of different domains, or they can be used to control where the ‘link juice’ will flow to within a site and its internal pages.

This post is about the use of “nofollow” tags to control the amount of link juice flowing within a site and its internal pages. To better explain this I came up with an illustration that should help the “not so technical” crowd understand this process.

If you would, please visualize your homepage as a bucket, and the subpages as sub-buckets. See the image below:

[Image: SEO juice leakage]

Your total search engine authority can be represented by what I’m going to call “SEO Juice”.

Now, imagine that every link you have, in every page of your site, is a hole in the bucket. Once the different Search Engines pour their “SEO Juice” into your homepage bucket, the juice leaks out to your sub-pages, and external pages, through every link you have.

[Image: link juice leakage]

The problem is that some of your sub-buckets (sub-pages) don’t need that “SEO Juice” as much, while others need a lot of it. A good example is having those “Privacy Policy”, “Shipping Info” types of pages that really don’t need to rank highly in any SERP. So, instead of spreading your “SEO Juice” thin, you’d direct it to where it is most needed. Your site could have an extremely relevant, high-converting sub-page that you want to boost – this would be a good place to start.

The “nofollow” tags help you plug the holes of different buckets and let most of the juice flow where you want more Search Engine authority. See the image below:

[Image: link juice leakage with holes plugged]

Once you’ve drawn the “nofollow” strategy map for your site and decided which pages need more search engine authority, the implementation part is quite simple.

Now that you understand what “nofollow” tags can do for your site, make sure you look into taking advantage of this awesome tool, and take control of where your “SEO Juice” is flowing!

Still confused? Let us help you out! Check out our full list of search engine optimization services.


Oct 29, 2008

Choosing an appropriate SEO web site design company

Whenever a client approaches us about optimising their site and they say “we have had xyz.com optimise it but we still don’t have placement” I always have a look at xyz.com’s website to see if they have optimised their own site.

Despite what their site may say to human readers – the fancy graphics, Flash animations, eye-catching design etc. – it never ceases to amaze me that many of the so-called specialist SEO companies have failed to comply with even the basic tenets of search engine optimisation when designing their own web sites.

The basic SEO web site design company testing rules Retiarius suggest you use are:

  • Always make sure the candidate company’s web site is itself fully W3C (World Wide Web Consortium) code compliant. Put the URL of their site, and any of their recent clients’ sites, into the W3C Validator, select ‘Show Outline’ and see what it shows.
  • Check that the CSS (Cascading Style Sheet) commands on their sites are compliant. The W3C CSS validator will do this for you.
  • Always spell-check the candidate’s site’s content. Search engines need to understand the context of the content and poor spelling negates that, so they will penalise badly spelled sites. Spell checking by a web site producer should be an integral part of the design function.
  • Ensure that the candidate’s sites use the <h1>-<h6> HTML tag structure in a manner that optimises their visibility to search engines and enhances the structure of the site so that the text is read in context. The ‘Show Outline’ option on the W3C Validator shows this, as does using the W3C Semantic Checker.

It is therefore important for potential SEO clients to check that the candidate SEO companies themselves have compliant and correctly optimised sites, because if they can’t be bothered, or lack the skills, to correctly optimise their own site, how can you expect them to know how to do it for you?


Oct 13, 2008

The holistic approach to designing search engine optimised sites

We at Retiarius are firm believers that you cannot ‘bolt on’ good SEO performance once a site has been made. You have to design the site from the ground up to maximise its search engine optimisation potential.

We call this our ‘holistic’ site design philosophy. All the elements of the site have to work in a harmonious manner to maximise the SEO potential.

These elements include:

  • Excluding Shockwave and Flash animations and Flash-based navigation from the site – you won’t stand a chance of having a search engine spider your site if you use Flash navigation.
  • Choosing keywords for each individual page, rather than a generic set of keywords for the site used on each page, so that you maximise the keyword density on that page. This takes longer but really improves the SEO performance for that page.
  • Ensuring that the main keywords are reflected in the
    • page title
    • file name
    • meta description
    • first 128 words on the page
    • image titles and alt statements
    • link tags
  • The page is spelled correctly (see article)
  • The page fully meets W3C standards
  • The site is searchable using the W3C semantic checker, as this simulates how the search engines extract data.
  • Using SEO add-ons if designing a blog-based site. WordPress has some very good free SEO plugins for its blogging system.
  • Effective use of subdomains (see article)
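As a hedged sketch of the keyword placement points above (the keyword, file names and business name are invented for the example), a page targeting ‘handmade silver jewellery’ might carry the keyword through its markup like this:

```html
<!-- File name: handmade-silver-jewellery.html -->
<head>
  <title>Handmade Silver Jewellery | Example Crafts</title>
  <meta name="description" content="Handmade silver jewellery, designed and made in the UK." />
</head>
<body>
  <h1>Handmade Silver Jewellery</h1>
  <p>Our handmade silver jewellery is designed and made by hand in our Peak District workshop...</p>
  <img src="handmade-silver-ring.jpg" alt="Handmade silver ring" title="Handmade silver ring" />
</body>
```

The same phrase appears in the title, file name, meta description, opening text, and image alt and title attributes, without being stuffed unnaturally into the copy.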

We used this holistic approach for our latest site, for Buxton Press, which will use a blog on a subdomain to provide linkbait for the business and a news and job vacancies system for the company.

Effective search engine optimisation is a painstaking process but we at Retiarius believe it is worth it.


Oct 8, 2008

The benefits of using subdomains to enhance SEO

There are many advantages in using subdomains (e.g. searchengineoptimising.retiarius.com is a subdomain of the retiarius.com primary domain) to enhance the ranking of the primary domain.

Google and MSN both class a subdomain as a separate site from its primary domain (see this article).

Barry Schwartz points to a thread at Search Engine Watch Forums discussing the subject:

“Subdomains work very well at the moment. No doubt about that. I can take a prominent, old, domain, set up a brand new subdomain, add one link from the original domains front page, throw up whatever content I want and within days have plenty of traffic. These days it seems that almost all linkpop value from the original domain is transfered – and I see this happening in both MSN and Google.”

Another advantage is that subdomains are usually free to implement on commercial servers – you own the domain and can therefore apply unlimited subdomains to it.

I think that this separation of a subdomain from its primary domain by search engines will have to continue because free blogging sites such as Blogger issue each of their clients a subdomain such as yourname.blogspot.com; the search engines cannot exclude subdomains from their results because ALL the blogs on Blogger are owned by separate individuals.

By using a subdomain for a blog, and cross-linking a new primary site to it, you automatically set up some linking value for the new primary site. If the blog is used as ‘link bait’, with articles likely to interest a lot of readers, then you can increase the cross-traffic to the, probably less frequently updated, primary site.
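The cross-linking can be as simple as the following sketch (the domain names are invented examples, not real sites):

```html
<!-- On the primary site's front page: a followed link to the blog subdomain -->
<a href="http://blog.example.com/">Read our news blog</a>

<!-- On each blog post: a link back to the primary site -->
<a href="http://www.example.com/">Visit our main site</a>
```

One link from the established front page to the subdomain, and one back from each post, is enough to pass linking value in both directions.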

But you must not go mad producing endless referring subdomains, because Google has set a bar on the number of subdomains they will reference.

I would like to quote Vanessa Fox, an ex-Googler and contributor to Search Engine Land:

“Google is no longer treating subdomains (blog.widgets.com versus widgets.com) independently, instead attaching some association between them. The ranking algorithms have been tweaked so that pages from multiple subdomains have a much higher relevance bar to clear in order to be shown.”

So, used with care, subdomains will enhance your site at low cost – but don’t over-egg the pudding … it could backfire on your search engine ranking.


Oct 6, 2008

W3C compliance – is it important for SEO?

The W3C (World Wide Web Consortium) sets the internationally agreed standards for the languages used to construct web sites. The primary coding language still in use is HTML (HyperText Markup Language), invented by Tim Berners-Lee, the ‘father’ of the World Wide Web.

HTML was developed through several versions until it reached 4.01, at which point the language called XHTML (Extensible HyperText Markup Language) became the standard to follow.
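For reference, a minimal document declaring the XHTML 1.0 Strict standard looks like this (the page content is just a placeholder):

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
  <title>Example page</title>
</head>
<body>
  <p>Hello, world.</p>
</body>
</html>
```

The DOCTYPE line tells both browsers and the W3C Validator exactly which standard the page claims to follow, so the code can be checked against it.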

All browsers are backwards compatible, i.e. they can read and process all the variants of HTML and XHTML up to the browser’s implementation date.

By creating an international standard for web coding the W3C have enabled browser designers to build systems capable of reading (almost) any website.

The problems arise when certain browser producers have tried to introduce proprietary coded operations into their browsers. A prime example has been Microsoft, which consistently tried to impose its ASP (Active Server Pages) system by making Internet Explorer capable of working outside W3C standards, in an attempt to monopolise the server market. They attempted the same by modifying the Java programming language with proprietary extensions until they were successfully sued by Sun Microsystems (the originators of Java). At that point Microsoft unilaterally, under the guise of a security update and without users’ permission, modified all existing copies of Internet Explorer to remove any form of Java support, immediately crippling many web sites until Sun could implement a rescue strategy.

Unfortunately for Microsoft, and very fortunately for all other web designers, the majority of web sites are run on Linux/Unix servers and the free Apache web server. This has effectively minimised the ‘damage’ Microsoft has done to the concept of a free and open web design language standard, as ASP can only run on Microsoft servers.

What has all this to do with search engine optimisation, you may ask?

The answer is that search engine designers are moving more and more towards implementing search criteria which mimic human search behaviour. They use semantic checkers within pages to try to ‘understand’ the true content of the page. To do this they need to be able to read, unambiguously, every statement on the page.

The use of non-standard coding (we shall be dealing with the use of JavaScript, Java, Flash and Shockwave in a later article) makes this difficult for the search engine algorithms to do. So, to save processing power and time, and to encourage standards-compliant coding, if they detect non-standard or poorly written code they tend to rank your site lower, because a ‘sloppily’ coded site implies a lack of quality.

A well-written, W3C-compliant coded site can be fully read by the search engine systems and, as such, if two sites of equal stature are to be ranked, one compliant and one not, the compliant site will rank higher. Which would you choose if you were employing a designer?

So always check that your sites are compliant. W3C have an excellent free HTML/XHTML Validator you can use.

We at Retiarius Internet Design understand the importance of this and we produce fully W3C-compliant code for our clients as standard.
