W3C Validation – Is It Still Relevant?
W3C Validation Tool
The W3C (World Wide Web Consortium), the internet standards organisation, provides a tool to validate web pages against the prescribed document types, e.g. XHTML 1.0 Transitional, HTML 4.01 Strict and others. The validation is, of course, voluntary.
Website builders often use the validation tool to check that the pages they are creating meet the required standard and are free of code errors.
Modern Code Does Not Validate
Many of the functions we include in web pages these days do not pass the validation test for the document types (DTDs) listed in the tool's options.
As an example, I cite a page (Voice Recording Service) from the website I am currently working on – a re-styling of an existing client's site using HTML/CSS (for the techs: document type XHTML 1.0 Transitional, CSS 2.1). I regularly use the validation tool to make certain the code is correct – it's like spell-checking a blog article for common typos before posting.
The pages all get full marks for code (after the typos are fixed), but, and here's the but… the Open Graph meta tags, the Facebook Like button and the Google +1 button do not pass validation.
The same basic page code is used for the terms of business page on this same website. The only difference is that all the Open Graph meta tags, as well as all the sharing service links, have been removed. As can be seen, this page validates as XHTML 1.0 perfectly.
How many webpages these days do not at least have a like button for Facebook? Not many.
Open Graph Meta and Facebook Like
There are compelling reasons to add the Open Graph metadata used by many of the social network sharing buttons. It is quite likely that, in the not too distant future, major search engines like Google will look for these tags as part of the basic requirements for a page. Pages lacking Open Graph tags may then be penalised.
This is just one of the new web technologies that doesn't pass validation. There is no point in trying to force the page to validate: the XHTML 1.0 and HTML 4.01 DTDs simply do not define the attributes that Open Graph meta tags require.
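For comparison, HTML5 with RDFa does offer a way to make these tags legitimate: a `prefix` attribute on the `html` element declares the `og:` vocabulary. A minimal sketch – the page title is taken from the example above, but the URLs are placeholders:

```html
<!-- Sketch: Open Graph tags under HTML5 + RDFa. The prefix attribute
     declares the og: vocabulary, so "property" becomes a recognised
     attribute here; under the XHTML 1.0 Transitional DTD it simply
     does not exist, hence the validator errors. -->
<!DOCTYPE html>
<html prefix="og: http://ogp.me/ns#">
<head>
  <meta charset="utf-8" />
  <title>Recording Service</title>
  <meta property="og:title" content="Recording Service" />
  <meta property="og:type" content="website" />
  <meta property="og:url" content="http://example.com/recording-service" />
  <meta property="og:image" content="http://example.com/recorder.jpg" />
</head>
<body></body>
</html>
```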
So we have a useful tool that helps to check the basic code, but leaves us with a report page full of errors when it comes to the other things required for modern web construction. Some webmasters use the report to certify that the work done is of a minimum quality. I would not want to show my customers – unless they were literate in web technology – a report containing 10 errors such as:
og:title = the title used in Facebook sharing
Line 14, Column 16: there is no attribute "property"
<meta property="og:title" content="Recording Service" />
Line 113, Column 75: there is no attribute "layout"
…k'));</script><fb:like href="" layout="box_count" width="55" font="segoe ui">
These samples are two of the ten 'errors' highlighted in the report. The tags are as provided by Facebook, with no closing slash (/>). Note: this version of the Like button is not the latest HTML5 code.
The Open Graph tags used are exactly as provided by Facebook (see my comment below for a link to the Facebook page).
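For those who want the button without the validator noise, Facebook's HTML5 variant swaps the proprietary `<fb:like>` element for a plain `<div>` carrying `data-` attributes, which are valid HTML5. A sketch, with a placeholder URL and the SDK loader shown in its simplest form:

```html
<!-- Sketch: the HTML5 version of the Like button. Plain div elements
     with data- attributes are valid HTML5, unlike the <fb:like>
     XFBML element from the report above. -->
<div id="fb-root"></div>
<script src="http://connect.facebook.net/en_US/all.js#xfbml=1"></script>
<div class="fb-like"
     data-href="http://example.com/recording-service"
     data-layout="box_count"
     data-width="55"></div>
```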
So is the requirement for W3C validation still relevant?
There will always be reasons to ensure the code used in web pages is clean and free from errors. Faulty code can, to some extent, prevent search engines from parsing content, which impacts SEO, although modern search algorithms seem able to parse code even with minor errors. One need only look at the ranking Google gives WordPress.com – 9/10 – yet the page does not validate with W3C.
This blog, GraphiclineWeb, also does not validate; this particular article fails with 43 errors and 4 warnings, and has no Document Type Declaration other than "DOCTYPE html", which causes the validation tool to attempt validation against the HTML5 standard. Yet posts from this blog regularly reach page one of the Google SERPs, often in the top 3 or 4 positions.
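For reference, the two declarations in question – the full XHTML 1.0 Transitional doctype used on the client site above, versus the minimal declaration that makes the validator switch to HTML5 rules:

```html
<!-- The full XHTML 1.0 Transitional declaration, which selects
     DTD-based validation: -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

<!-- The minimal HTML5 declaration; on its own it causes the validator
     to apply the (still experimental) HTML5 rules: -->
<!DOCTYPE html>
```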
No – validation errors caused by these new technologies (Open Graph and similar), which are being used everywhere, have no bearing on SEO. If anything, the presence of such inclusions benefits the overall page rating.
Even pages without a DTD are able to achieve good page ranking. Yes, it is bad practice not to include a doctype declaration at the head of the page, but the question must arise: is it really essential? Modern browsers manage to display web pages correctly even where the standards are not adhered to. And if browsers like IE6/7 are still being used… Chrome and Firefox are free, and don't insist on falling back to compatibility mode when they encounter something unusual or new. Enough said…
A New W3C Validation Standard Is Required
There seems little point to me in continuing to work to archaic standards. What is needed is a validation tool that takes modern requirements into account. HTML5 already has an experimental standard. Is it not time a new version of each of the existing standards was created to include such code as Open Graph meta and OAuth, let alone the sharing code used by the world's most popular social networks?
In some instances it may be possible to get past the problem and achieve valid HTML by using CSS, or by redeveloping the code, but this is not always possible, and moving code into CSS is not always the best option. There is a strong tendency these days to create additional style sheets for everything… Sometimes a simple style statement in the HTML element does the same job more efficiently, with lower server overhead. The use of CSS is often overdone – occasionally simpler IS better.
Quite clearly there are a large number of large, busy websites where little or no regard has been given to validation. From what I have seen, sites that achieve validation are very much in the minority – only the most basic and boring pages do so (those I have seen, anyway). Is it possible that many site builders no longer find the standard of practical use? Or is it simply impossible to validate an INTERESTING website against these standards?
How many web builders will take the time to test their code when they know in advance there will be numerous errors reported – not with the basic code, but with the features and functions they include?
A set of standards that includes the mentioned web features would benefit web users and developers alike. I for one would really like to be able to use the W3C validation tools to test my work (including Open Graph and sharing code).
Back to the Future
Backward compatibility with outdated internet browsers like IE 5/6 and old Netscape Navigator versions has little value in the world today. I am fully aware of the call for accessibility for all internet users, but this should be a matter of modern technological development, not of reliance on applications which should have gone the way of the dinosaurs and become extinct.
The W3C organisation relies on donated funding; should the call not be for support of the organisation by the industry, rather than a demand for backward compatibility? Any such support can only succeed if the web industry finds genuine value in supporting the organisation.
The call for legislated accessibility for all users of the net does not adequately address the needs of the very users it intends to benefit. Instead it stigmatises them and puts them into a ‘class’ of their own – which amounts to discrimination.
Industry support for the W3C, and standards updated to reflect modern practice, would be in the interest of all net users, not only site builders.
Finally, from my personal point of view: it would be a great pity if the W3C validation tool, along with the standards themselves, became just another unneeded and unused service. If nothing else, the tool itself has been useful. To remain relevant, however, it needs to take modern code into account.