The W3C (World Wide Web Consortium) sets the internationally agreed standards for the languages used to construct websites. The primary coding language still in use is HTML (HyperText Markup Language), invented by Tim Berners-Lee, the ‘father’ of the World Wide Web.
HTML was developed through several versions until it reached 4.01, at which point XHTML (Extensible HyperText Markup Language) became the standard to follow.
All browsers are backwards compatible, i.e. they can read and process all the variants of HTML and XHTML up to the browser’s implementation date.
By creating an international standard for web coding, the W3C has enabled browser designers to build systems capable of reading (almost) any website.
The problems arise when certain browser producers have tried to introduce proprietary coded operations into their browsers. A prime example is Microsoft, which has consistently tried to impose its ASP (Active Server Pages) system by making Internet Explorer capable of working outside W3C standards in an attempt to monopolise the server market. Microsoft attempted the same by modifying the Java programming language with proprietary code until it was successfully sued by Sun Microsystems (the originators of Java). At that point Microsoft unilaterally, under the guise of a security update and without users’ permission, modified all existing copies of Internet Explorer to remove any form of Java support, immediately crippling many websites until Sun could implement a rescue strategy.
Unfortunately for Microsoft, and very fortunately for all other web designers, the majority of websites are hosted on Linux/Unix machines running the free Apache web server. This has effectively minimised the ‘damage’ Microsoft has done to the concept of a free and open web design language standard, as ASP can only run on Microsoft servers.
What has all this to do with search engine optimisation, you may ask?
The answer is that search engine designers are moving more and more towards implementing search criteria which mimic human search behaviour. They use semantic checkers within pages to try to ‘understand’ the true content of the page. To do this they need to be able to read, unambiguously, every statement on the page.
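For illustration only (the page content in this snippet is invented, not taken from any particular site), compare presentational markup, where a ‘heading’ is merely styled text, with semantic markup, where the role of each element is explicit and machine-readable:

    <!-- Presentational: the heading is just bold, enlarged text; its role is ambiguous -->
    <font size="6"><b>Hand-made Oak Furniture</b></font>

    <!-- Semantic: a search engine can see this is the page's main heading and its body copy -->
    <h1>Hand-made Oak Furniture</h1>
    <p>We build solid oak tables and chairs to order in our workshop.</p>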
The use of non-standard coding (we shall be dealing with the use of JavaScript, Java, Flash and Shockwave in a later article) makes such unambiguous reading difficult for the search engine algorithms. So, to save processing power and time, and to encourage standards-compliant coding, if they detect non-standard or poorly written code they tend to rank your site lower, because a ‘sloppily’ coded site suggests a lack of quality.
A well written, W3C-compliant site can be fully read by the search engine systems, and as such, if two sites of equal stature are to be ranked, one compliant and one not, the compliant site will rank higher. Which would you choose if you were employing a designer?
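As a rough sketch of what compliant code looks like (the title and body text here are placeholders), a minimal XHTML 1.0 Strict page that passes validation takes this form:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
    <head>
        <title>Example page</title>
        <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
    </head>
    <body>
        <h1>Example page</h1>
        <p>All tags are lower case, correctly nested and explicitly closed.</p>
    </body>
    </html>

Note the points the validator checks for: a correct DOCTYPE, the XHTML namespace on the html element, and every element properly closed (including the self-closing meta tag).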
So always check that your sites are compliant. The W3C has an excellent free HTML/XHTML validator you can use at https://validator.w3.org/.
We at Retiarius Internet Design understand the importance of this, and we produce fully W3C-compliant code for our clients as standard.