Jun 04, 2009

Top 10 must have add-ons and plugins for WordPress

WordPress is one of the most popular blogging platforms in use today. It has many advantages for site owners who want a search engine optimised web site.

  • It is free and open source, so security problems tend to be found and fixed quickly. Just download WordPress free from here
  • It is easy to use and maintain
  • It runs on Apache web servers
  • It has many fantastic free themes that can be used and modified to produce a look and feel for your site that suits your business.
  • It is almost infinitely customisable by web site designers, using the extensive range of free plug-ins and widgets to add functionality and ‘eye-candy’.

This post is about the plug-ins and widgets that I think are the most useful. So here they are, in roughly alphabetical order.

  1. AddThis Social Bookmark
    Help your visitor promote your site! The AddThis Social Bookmarking Widget allows any visitor to bookmark your site easily with many popular services. Sign up for an AddThis.com account to see how your visitors are sharing your content–which services they’re using for sharing, which content is shared the most, and more. It’s all free–even the pretty charts and graphs. By The AddThis Team.
  2. All in One SEO Pack
    This plugin will optimise your WordPress blog so that your posts and pages are indexed more efficiently by the search engines. It is fully featured and lets you single out individual pages for optimisation.
  3. Akismet: the best comment spam protection available. It fortunately comes pre-installed, though you’ll still need an API key from WordPress.com. This system will prevent your site from being deluged with comment spam. If some should get through, it makes it easy to kill the offending comments and also to notify the Akismet system so that others are protected, just as they are protecting you.
  4. Events Calendar
    This plugin and widget makes it easy to embed appointments and events into your blog. So if you are organising exhibitions, conferences or training days, this plugin is for you. It can automatically produce a blog post from the data you input for an event, saving you time when working on your blog.
  5. FeedBurner FeedSmith: Steve Smith originally wrote this plugin to make it easy to use Google FeedBurner for syndicating your blog. Google FeedBurner has taken over its maintenance. Very easy to use.
  6. Google Analytics for WordPress
    This plugin makes it simple to add Google Analytics tracking to your WordPress blog. Google Analytics is a very powerful incoming click tracker, allowing you to analyse the performance of your site in depth and see where your traffic is coming from.
  7. Google XML Sitemaps
    If you don’t know how beneficial Google Sitemaps can be to your blog then read my articles on SEO. Once set up this plugin will inform Google of any changes or additions made to your site automatically. You can download Google XML Sitemaps from here
  8. Share This: Alex King creates incredibly useful plugins and this is one of them. If you want to make it easy for your visitors to share your posts on bookmarking or social network sites, this is the one plugin you need.
  9. NextGen Image Gallery
    This plugin by Alex Rabe makes the production of in-line image galleries in your blog posts easy and effective. The built-in slideshow widget (helper application) allows you to embed an auto-rotating image slideshow into the side panels of your blog.
  10. WP-Cumulus
    A Flash-based tag cloud for WordPress. A bit of eye-candy which displays a graphical representation of post tags.
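To give a flavour of what the Google XML Sitemaps plugin (item 7) generates behind the scenes, here is a minimal sitemap file following the sitemaps.org protocol. The URL and dates are placeholders for illustration only:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry is generated per post or page -->
  <url>
    <loc>http://www.example.com/my-first-post/</loc>
    <lastmod>2009-06-04</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The plugin rebuilds this file and pings the search engines each time you publish, which is what makes the automatic notification possible.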

So there you have my current top 10 plugins and additions to WordPress.


May 28, 2009

Understanding NoFollow tags and what they can do to improve PageRank

Google announced in early 2005 that hyperlinks with the rel="nofollow" attribute would not influence the link target’s PageRank. In addition, the Yahoo and Windows Live search engines also respect this attribute.

How the attribute is being interpreted differs between the search engines. While some take it literally and do not follow the link to the page being linked to, others still “follow” the link to find new web pages for indexing. In the latter case rel="nofollow" actually tells a search engine “Don’t score this link” rather than “Don’t follow this link.” This differs from the meaning of nofollow as used within a robots meta tag, which does tell a search engine: “Do not follow any of the hyperlinks in the body of this document.”

The effective use of the rel="nofollow" attribute can improve selected pages’ Google PageRank by preventing the leakage of ‘link juice’ to non-essential (for site PageRank purposes) pages such as Terms and Conditions or Privacy Policy pages, which do not need to be indexed by search engines but which can, if “nofollow” is not used, dilute the optimum PageRank of your essential pages such as product descriptions.
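In markup terms the attribute is simply an extra rel value on an anchor tag. For example (the file names here are illustrative):

```html
<!-- A normal link: search engines score it and pass PageRank to the target -->
<a href="/products.html">Our product range</a>

<!-- A nofollow link: search engines will not score this link -->
<a href="/privacy-policy.html" rel="nofollow">Privacy Policy</a>
```

The visitor sees no difference between the two links; only the search engine spiders treat them differently.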

Learning to implement “nofollow” tags is fairly easy. Learning how to apply them in the proper way does require some skill.

The use of “nofollow” tags can serve many different purposes. They can be used to limit the amount of ‘link juice’ that flows out of a page to external pages of different domains, or they can be used to control where the ‘link juice’ will flow to within a site and its internal pages.

This post is about the use of “nofollow” tags to control the amount of link juice flowing within a site and its internal pages. To better explain this I came up with an illustration that should help the “not so technical” crowd understand this process.

If you would, please visualize your homepage as a bucket, and the subpages as sub-buckets. See the image below:

[Image: SEO-juice-leakage.jpg]

Your total search engine authority can be represented by what I’m going to call “SEO Juice”.

Now, imagine that every link you have, in every page of your site, is a hole in the bucket. Once the different Search Engines pour their “SEO Juice” into your homepage bucket, the juice leaks out to your sub-pages, and external pages, through every link you have.

[Image: link-juice-leakage2.jpg]

The problem is that some of your sub-buckets (sub-pages) don’t need that “SEO Juice” as much, while others need a lot of it. A good example is those “Privacy Policy” and “Shipping Info” types of pages that really don’t need to rank highly in any SERP. So, instead of spreading your “SEO Juice” thin, you’d direct it to where it is most needed. Your site could have an extremely relevant, high-converting sub-page that you want to boost; that would be a good place to start.

The “nofollow” tags help you plug the holes of different buckets and let most of the juice flow where you want more Search Engine authority. See the image below:

[Image: link-juice-leakage-plugged.jpg]

Once you’ve drawn the “nofollow” strategy map for your site and decided which pages need more search engine authority, the implementation is quite simple.

Now that you understand what “nofollow” tags can do for your site, make sure you look into taking advantage of this awesome tool, and take control of where your “SEO Juice” is flowing!

Still confused? Let us help you out! Check out our full list of search engine optimization services.


May 09, 2009

Fully SEO PHP and blog script-driven site for Elanga International

Elaine Walker of Elanga International Language Services already had a web site for her language translation business but wanted it fully redesigned to be search engine optimised. The new site had to be easy for her to maintain herself and to use Web 2.0 techniques. Her prime concern was effective search engine optimisation of the new site.

As with Ford & Kidd, an optimised front page and blog solution was used, as this provided the most cost-effective approach. The front page incorporates PHP code that posts SEO-optimised résumés of posts made in the WordPress blog, which comprises the majority of the site. The PHP code produces fully W3C-compliant XHTML, the result being a site index page with automatically updated content that search engines cannot detect as being produced by a script.


May 01, 2009

Retiarius produce an SEO blog-driven site for Ford & Kidd Truck Bodies

Retiarius Internet Design were approached by local custom truck body builder Ford & Kidd to produce their first web site. Ford & Kidd specialise in producing one-off truck body solutions for both new-build and vehicle conversion projects.

Ford & Kidd wanted a web site that they could easily maintain themselves but one that also would be fully search engine optimised.

We decided that the best, and most cost-effective, solution was to build a front end to the site that was fully optimised for search engines, incorporating ‘hidden’ PHP code into that page that posts résumés of posts made in the WordPress blog, which comprises the majority of the site.

The blog software was set up to include effective SEO and also to permit the loading of individual photographs, and galleries of photographs, of vehicles constructed by Ford & Kidd, for display within the posts.

Ford and Kidd Truck Bodies


Nov 17, 2008

The use of blogs for traffic generation

Blogs Attract More Search Engine Traffic

Your favourite thing about having a blog may soon be this: blogs naturally attract search engine traffic. Blogs already have optimised site architecture. Most are set up with clear navigation, where every page links back to the other main pages. They also have the inherent potential to be well linked.

Retiarius Internet Design have been successfully using blogs to generate sustained traffic to clients’ sites (e.g. for a French villa holiday company), and by installing a blog on a subdomain of your domain you automatically get some cross-site linking. The effective use of blogs generates ‘link bait’ that makes other blog owners want to link to you.

Blog Directories and Site Submission

If you haven’t already submitted to blog directories, you are missing out on some great one-way links. Many of the top directories can be found on Robin Good’s Top 55 list. But before you head over there and start submitting, you should know a little about how to optimize your blog. Then your new listings can help your site get the best keyword placement in the major search engines.

Tip One: Blogs and SEO – Keywords

You have a choice. You can target a general high traffic keyword you have little chance of ranking well for and get barely any traffic. Or you can shoot for a keyword that gets a moderate level of targeted traffic resulting in more subscribers and sales. I like to call this a “lucrative keyword”. Whatever you call them, here’s the most important thing: They may not get you the most traffic, but they often bring the most profit.

More Web Site Traffic and More Sales? Not Always

You may be surprised to learn that there isn’t always a correlation between high traffic and high sales. Many of the most profitable sites in the world get moderate traffic because their lucrative keywords result in a much higher ratio of visitors to buyers.

Length of Search Query is a Factor

A recent article in Information Week stated that the highest conversion rates from search engine traffic come from people who do four-word queries. The great thing about your blog is that it can get so well indexed that you have the potential to show up for any number of four-word phrases relevant to your industry.

Target Your Blog for More Traffic and Sales

It isn’t just the four-word phrases that get converting traffic – there are two- and three-word phrases that can bring you traffic and sales. Targeting your blog discussion to a two- or three-word phrase that has a high yield of traffic, and yet has little competition, is not a dream of past Internet days. Another recent study revealed that a surprisingly high percentage of search engine queries debuted as late as 2004. As long as there are new developments, new products, services and trends, you’ll never have a shortage of these terms if you learn how to discover them.



Written by admin in: Search Engine Optimisation Techniques
Nov 11, 2008

Use and misuse of metadata statements for SEO

META tags, or what are officially referred to as metadata elements, are found within the head section of your web pages. Meta tags are still relevant to some indexing search engines. Your site should use its meta tags in accordance with the W3C (World Wide Web Consortium) metadata specifications.

Here is a good introductory article about META tags and metadata statements.

Search engines use an algorithm to determine the most relevant content on the Internet. They crawl through websites, looking for occurrences of keywords. Websites usually have metadata tags describing the content of the site. However, due to misuse, sole reliance on metadata has diminished considerably. A search engine will also check the occurrences of keywords within the body of the text, and the time the page was last updated. All these factors contribute to the website’s ranking, which in turn determines how high up the results the site appears.
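For reference, the metadata elements in question sit inside the page’s head section. A typical example might look like this (the titles and content values here are purely illustrative):

```html
<head>
  <title>Cost-effective web site design in Buxton, Derbyshire</title>
  <!-- The description is often displayed in the search results listing -->
  <meta name="description"
        content="Search engine optimised web site design for small businesses.">
  <!-- Keywords are now largely discounted by search engines due to past misuse -->
  <meta name="keywords" content="web design, SEO, Buxton, Derbyshire">
</head>
```

Note how the description reads as a sentence for humans: search engines may show it verbatim in their listings, so it does double duty as advertising copy.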

Retiarius design sites that comply with the W3C standards, thus enhancing your search engine visibility.


Nov 02, 2008

SEO companies that use link farms to gain placement – a good or bad idea?

Google started the trend, and more and more search engines are following suit, using link popularity as an important part of their ranking algorithms. Many webmasters have responded by joining link farms and stuffing their sites with as many links as possible. This worked for a while as far as site placement goes, but search engine designers are getting wise to the tactic. The fact is that not all links are created equal. Indeed, bad linking strategies may get you banned from some engines.

Popularity Doesn’t Grow On Farms

A link farm consists of sites that link to other sites for the sole purpose of increasing their link popularity score. Unlike perfectly valid links to web sites with related information, sites that utilise link farming contain links to sites totally unrelated to your business. This practice is also referred to as link stuffing.

Google’s programmers hate link farms, and their search engine’s algorithms label the links they generate as spam. In fact, Google hates them so much that some sites get removed from the index if they’re affiliated with link farms. Some webmasters are getting so paranoid that they are considering removing all outbound links from their sites – something which tends to negate the very essence of using the web: hyperlinks to related information.

Search Engines Take Hard Line On Link Stuffing

Search engines have got wise to the ‘link spamming’ techniques used by link farms and are now taking a hard line against them.

AllTheWeb’s spam policy is to:

disregard Link Stuffing when building the index and computing static rank, and to reduce the static rank of the documents containing it.

Google warns against “artificial linkage” (translation: link farms and link stuffing) and warns that if you link to spam sites, Google’s algorithm may penalize your site.

AltaVista’s spam policy warns that sites may be deleted from the index if they “contain only links to other pages.”

Inktomi’s editorial guidelines warn against “excessively cross-linking sites to inflate a site’s apparent popularity.”

Search engines won’t penalize you for good links, but they’ve gotten pretty good at recognizing bogus ones and are quick to punish sites that try to spam them with unrelated links.

This is an overreaction that decreases the site value to visitors and hurts the Web in general because cross-linking is a basic tenet of the Internet. Links are fine – even encouraged – if they are related to your topic, but link farms rarely provide useful content to visitors.

If your site is devoted to your favourite pop band and you include links to the band members’ personal sites, other fan sites, and links to stores that sell the band’s music, that’s not a link farm. You’re providing access to other sites that will probably interest your visitors.

But if you signed up with a so-called SEO service that promises to generate two hundred inbound links to your site only if you agree to add two hundred outbound links in return, then you’ve  created a link farm. Instead of linking to related information of value to your visitors, you’re instead sending them to sites about herbal supplements, children’s clothes, pet rats, and other totally unrelated (possibly X-rated) topics.

Use Effective Linking Strategies

We can sum up a good linking strategy in a single sentence. “Link to related sites that offer content, products, and services that will help and interest your visitors“.

It’s really easy to do:

  • Stay with your theme. Keep your links tightly focused on your site’s topic and you’ll avoid diluting the theme of your site. A coherent theme increases your search engine rank and helps visitors find the information they need.
  • Hang out with the popular crowd. Search engines don’t just count the number of links; they evaluate the relative importance of inbound and outbound links. That’s why link farms don’t help you! It’s far better to have 100 good inbound links from popular pages than 1000 links from a link farm stuffed with spam sites.
  • Avoid automated link generators. Automated programs that send emails to other webmasters that offer to trade links are rarely effective. A good linking strategy has to be personalized, so automated email messages that say: “Hi! Want to trade links?” rarely pay off. If you want to trade links with a site, send the webmaster a personalized email that briefly outlines why you like the site and how a link to your site will enhance it.
  • Put links in context. Links that are included in page content and contain the page’s keywords are more valuable than links exiled to a site’s “Related Links” page. Content links help search engines rank the importance of a link and put it in the proper context. Here are two different ways to link to the same site: Retiarius.com or cost-effective web site designer in Buxton, Derbyshire. The second link describes the company and contains the company’s important keywords. Try to identify a particular page where a link to your site would fit right into the page’s content, and note the exact location when you ask for a link. That shows the webmaster that you actually took the time to look at and evaluate the site.

A good linking strategy will help turn your web site into an authority on a particular topic. Visitors will tend to come to you for information and search engine spiders will gobble up your content. You won’t have to spend time chasing after good links; they’ll come looking for you.

Worried about other search engine promotion techniques that might hurt more than they help? Retiarius Internet Design can help you sort through various search engine do’s and don’ts.  We provide a cost-effective, personalised, friendly service to help boost your site to the top of the rankings.


Oct 29, 2008

Choosing an appropriate SEO web site design company

Whenever a client approaches us about optimising their site and they say “we have had xyz.com optimise it but we still don’t have placement” I always have a look at xyz.com’s website to see if they have optimised their own site.

Despite what their site may say to human readers – the fancy graphics, Flash animations, eye-catching design and so on – it never ceases to amaze me that many of the so-called specialist SEO companies have failed to comply with even the basic tenets of search engine optimisation when designing their own web sites.

The basic SEO web site design company testing rules Retiarius suggest you use are:

  • Always make sure the candidate company’s web site is itself fully W3C (World Wide Web Consortium) code compliant. Put the URL of their site, and any of their recent clients’ sites, into the W3C Validator, select ‘Show Outline’ and see what it shows.
  • Check that the CSS (Cascading Style Sheet) commands on their sites are compliant. The W3C CSS validator will do this for you.
  • Always spell-check the candidate’s site’s content. Search engines need to understand the context of the content and poor spelling negates that, so they will penalise badly spelled sites. Spell checking by a web site producer should be an integral part of the design function.
  • Ensure that the candidate’s sites use the <h1>-<h6> HTML tag structure in a manner that optimises their visibility to search engines and enhances the structure of the site so that the text is read in context. The ‘Show Outline’ option on the W3C Validator shows this, as does using the W3C Semantic Checker.

It is therefore important for potential SEO clients to check that candidate SEO companies themselves have compliant and correctly optimised sites, because if they can’t be bothered, or lack the skills, to optimise their own site correctly, how can you expect them to do it for you?


Oct 13, 2008

The holistic approach to designing search engine optimised sites

We at Retiarius are firm believers that you cannot ‘bolt on’ good SEO performance once a site has been made. You have to design the site from the ground up to maximise its search engine optimisation potential.

We call this our ‘holistic’ site design philosophy. All the elements of the site have to work in a harmonious manner to maximise the SEO potential.

These elements include:

  • Excluding Shockwave and Flash animations and Flash-based navigation from the site – you won’t stand a chance of having a search engine spider your site if you use Flash navigation.
  • Choosing keywords for each individual page, rather than a generic set of keywords for the site used on each page, so that you maximise the keyword density on that page. This takes longer but really improves the SEO performance for that page.
  • Ensuring that the main keywords are reflected in the
    • page title
    • file name
    • meta description
    • first 128 words on the page
    • image titles and alt statements
    • link tags
  • The page is spelled correctly (see article)
  • The page fully meets W3C standards
  • The site is searchable using the W3C semantic checker, as this simulates how the search engines extract data.
  • Using SEO add-ons if designing a blog-based site. WordPress has some very good free SEO plugins for its blogging system.
  • Effective use of subdomains (see article)
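To pull those elements together, here is a sketch of how they might appear in a single page optimised for the keyword phrase ‘custom truck bodies’ (the file names, titles and text are illustrative only):

```html
<!-- File name reflects the keywords: custom-truck-bodies.html -->
<head>
  <title>Custom Truck Bodies in Derbyshire</title>
  <meta name="description"
        content="One-off custom truck bodies for new-build and conversion projects.">
</head>
<body>
  <h1>Custom Truck Bodies</h1>
  <!-- The keywords appear within the first 128 words of body text -->
  <p>We design and build custom truck bodies for new-build vehicles...</p>
  <img src="custom-truck-body.jpg" title="Custom truck body"
       alt="A one-off custom truck body on a vehicle conversion">
  <a href="gallery.html" title="Custom truck body gallery">View our gallery</a>
</body>
```

The point of the holistic approach is that every one of these elements reinforces the same keyword phrase, rather than each page element being written in isolation.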

We used this holistic approach for our latest site for Buxton Press which will utilise a blog on a subdomain to provide linkbait for its business and provide a news and job vacancies system for the company.

Effective search engine optimisation is a painstaking process but we at Retiarius believe it is worth it.


Oct 08, 2008

The benefits of using subdomains to enhance SEO

There are many advantages in using sub domains (e.g. searchengineoptimising.retiarius.com is a subdomain of the retiarius.com primary domain) to enhance the ranking of the primary domain.

Google and MSN both class a subdomain as a separate site from its primary domain (see this article).

Barry Schwartz points to a thread at Search Engine Watch Forums discussing the subject:

“Subdomains work very well at the moment. No doubt about that. I can take a prominent, old, domain, set up a brand new subdomain, add one link from the original domains front page, throw up whatever content I want and within days have plenty of traffic. These days it seems that almost all linkpop value from the original domain is transfered – and I see this happening in both MSN and Google.”

Another advantage is that subdomains are usually free to implement on commercial servers – you own the domain and can therefore apply unlimited subdomains to it.
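On an Apache server, for instance, adding a subdomain is typically just a DNS entry plus one extra virtual host definition. A minimal sketch (the domain name and paths are placeholders) might be:

```apache
# Virtual host serving a blog subdomain alongside the primary domain
<VirtualHost *:80>
    ServerName blog.example.com
    DocumentRoot /var/www/blog
</VirtualHost>
```

Most shared hosting control panels wrap exactly this step behind a “create subdomain” button, which is why the cost is effectively nil.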

I think that this separation of a subdomain from its primary domain by search engines will have to continue, because free blogging services such as Blogger issue each of their clients a subdomain such as yourname.blogspot.com; search engines cannot simply exclude subdomains from their results, since all the blogs on Blogger are owned by separate individuals.

By using a subdomain for a blog, and cross-linking a new primary site to it, you automatically set up some linking value to the new primary site. If the blog is used as ‘link bait’, with articles likely to interest a lot of readers, then you can increase the cross-traffic to the (probably less frequently updated) primary site.

But you must not go mad producing endless referring subdomains, because Google has set a bar on the number of subdomains it will reference.

I would like to quote Vanessa Fox, an ex-Googler and contributor to Search Engine Land:

“Google is no longer treating subdomains (blog.widgets.com versus widgets.com) independently, instead attaching some association between them. The ranking algorithms have been tweaked so that pages from multiple subdomains have a much higher relevance bar to clear in order to be shown.”

So, used with care, subdomains will enhance your site at low cost – but don’t over-egg the pudding… it could backfire on your search engine ranking.

