Does Google take all the meta tags into account? The question is certainly a classic one, but this time the answer comes directly from Google… A good opportunity to go back over each tag and clarify what it brings to SEO.

So here is the complete list of meta tags and their impact on SEO, with indications taken from Google's article (written by John Mueller, a Google engineer in Zurich), supplemented by remarks based on my own experience.

The title tag

Even though it is not a meta tag, Google confirms that the title tag is used in its ranking algorithm. In short, you should choose its content very carefully so that it describes the page as well as possible, with the most important words at the beginning.

To my knowledge, a meta title tag does not exist and is therefore not taken into account by the engines:
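
For illustration, here is the difference between the two (the page title used here is just an example):

```html
<!-- The title tag: not a meta tag, but used by Google in its ranking algorithm -->
<title>Meta tags and their impact on SEO</title>

<!-- A "meta title" like this is not a standard tag and is ignored by the engines -->
<meta name="title" content="Meta tags and their impact on SEO">
```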



The meta description tag

The meta description tag is taken into account by Google, but only for displaying the results. According to my tests (officially confirmed by Google in the article), this tag really has no impact on ranking. On the other hand, Google sometimes reuses it as the result's description (what is called the snippet), but that depends on the query (it is only displayed if Google considers that it matches the query well).
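
As a reminder, the tag goes in the head of the page; the description below is only an illustration:

```html
<!-- Possibly reused by Google as the snippet, but with no impact on ranking -->
<meta name="description" content="Complete list of meta tags and their real impact on SEO, based on Google's explanations.">
```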

The meta keywords tag

It is no longer used by Google, which in fact does not even mention it in its article… If you took the trouble to fill it in carefully, leave it in place: it can be useful for other engines or directories (that is what John Mueller says in the comments).
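
For the record, its syntax (with example keywords):

```html
<!-- Ignored by Google, but possibly still read by other engines or directories -->
<meta name="keywords" content="meta tags, SEO, Google">
```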

The meta revisit-after tag

Even if there are still many webmasters who believe in it, the meta revisit-after tag is not taken into account by Google (nor by the other main engines, according to Google). You can therefore remove it from your pages… For the record, it is supposed to tell robots how long they should wait before coming back to crawl the page. Google is perfectly capable of tuning its robot so that it adapts its visit frequency to how often the page is updated; if this really matters to you, you can always define this kind of information in the sitemap file.
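
For example, a sitemap file following the sitemaps.org protocol can carry this hint through the changefreq element (the URL and values below are only an illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/article.html</loc>
    <!-- Hint about how often the page changes; engines treat it as a hint, not an order -->
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```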

The meta robots tag

This tag is taken into account by Google (and by other engines). It is used to set restrictions for the robot that comes to crawl the page. The main search engines also let webmasters use an engine-specific tag (googlebot for Google, slurp for Yahoo, etc.); in that case, the restrictions only apply to the specified engine.

Here are the various possible values for the meta robots tag, and what they mean:

* noindex: tells the robot that the page must not be indexed. That does not mean the robot will not crawl it: for that, you need to use the robots.txt file

* nofollow: tells the robot not to follow the links on the page. That means Google will not crawl the pages linked from the page containing this meta robots tag. Even if Google does not say so in its article, the engines also ignore the links present on the page in their algorithms (for example PageRank).

* index: tells the robot that it may index the page. Since this is the default value, it is completely pointless to specify it.

* follow: tells the robot that it may follow the links on the page. Since this is the default value, it is completely pointless to specify it.

* all: this value is equivalent to index, follow. Since this is the default, it is completely pointless to specify it.

* none: this value is equivalent to noindex, nofollow.

* nosnippet: tells the engine not to display a description (snippet) for the page in the results. I have trouble seeing the point for a webmaster of using this option, since the description helps encourage the user to click on the result (although there may be cases where the description generated by the engine is not relevant enough in the webmaster's eyes).

* noarchive: tells the engine not to give access to the cached version. The "Cached" link in the results page will therefore not be displayed. This can be useful for sites that move their content from a freely accessible public version to a paid archived version (newspaper sites, for example).

* noodp: tells the engine not to use the data associated with the site by the editors of the DMOZ directory (Open Directory Project, ODP). This is useful if the title or description of the site in DMOZ does not match reality very well. To find out more, read the article on the meta noodp tag.

* unavailable_after: [date]: tells the engine that the page should no longer appear in the results after the given date. To find out more, read the discussion on the meta unavailable_after tag.

Yahoo also supports the noydir value, which, like noodp, tells the engine that you do not want the data from the Yahoo Directory to be used. To find out more, read the article on the meta noydir tag.

It is possible to combine several values within a single meta robots tag: just separate the values with commas, for example:
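
A possible combination (the values shown are just an illustration), along with the engine-specific variant mentioned above:

```html
<!-- Applies to all robots: do not index the page and do not show a cached link -->
<meta name="robots" content="noindex,noarchive">

<!-- Applies only to Google's robot -->
<meta name="googlebot" content="noarchive,noodp">
```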



The meta google notranslate tag

This tag lets you tell Google that you do not want a "Translate this page" link to be displayed next to the search result. Google sometimes displays this link, which gives access to an automatic translation of the page. Google states in its article that this meta tag "generally" has no impact on ranking.
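
For the record, it looks like this:

```html
<!-- Asks Google not to offer a "Translate this page" link for this page -->
<meta name="google" content="notranslate">
```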

On the subject of languages, John Mueller points out (in the blog comments) that Google does not use meta tags to determine which language a page is written in (it works this out through statistical analysis of the words found in the content).

The meta verify-v1 tag

This tag is only useful for webmasters who use the Google Webmaster Central tools; it is not essential for good SEO. It is used to prove that you really are the webmaster of the site for which the Webmaster Tools account is set up.
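
For the record, it looks like this (the content value is a token generated by Google for your account; the one below is only a placeholder):

```html
<!-- Verification tag for Google Webmaster Tools; the token is account-specific -->
<meta name="verify-v1" content="YOUR_VERIFICATION_TOKEN">
```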


The meta HTTP-equiv charset tag

This tag is used to specify the character set used to encode the page.
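
For example, for a page encoded in UTF-8 (adjust the charset to your own encoding):

```html
<!-- Declares the character encoding of the page -->
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
```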


The meta HTTP-equiv refresh tag

Originally, this tag was meant to reload the page after a delay specified in seconds. News sites, for example, use it to force the browser to refresh the display. That said, not all browsers necessarily handle it the same way, and the W3C recommends no longer using it.

In SEO, this tag is sometimes used by webmasters to redirect the visitor to another page after a certain delay (for example 10 seconds). There has been quite a bit of abuse with meta refresh set to a 0-second delay, which made it possible to hide doorway-page content from visitors.

If you need to make a permanent redirect, use a server-side redirect with a 301 code rather than a meta refresh redirect with a 0-second delay (which Google does not treat as a true 301 redirect).
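
For the record, here is what a meta refresh looks like (the delay and destination URL are only an illustration); a 301, on the other hand, is configured on the server side, not in the HTML:

```html
<!-- Reloads or redirects the browser after 10 seconds; not treated by Google as a 301 redirect -->
<meta http-equiv="refresh" content="10; url=http://www.example.com/new-page.html">
```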

Conclusion on meta tags

Let me simply repeat once more that those who still think SEO is won or lost at the level of meta tag optimization are way behind the times… There are plenty of other things to work on for a site before the meta tags, in particular link building and the creation of original content!

All the same, the key takeaway from the article is that Google really does ignore the meta keywords and revisit-after tags, and that the meta description really has no impact on ranking.

