Last modified: 2013-04-22 16:17:14 UTC
The pages generated by DoubleWiki are indexed by search engines like Google, so there are duplicate entries for each page. It may be good to add noindex,nofollow to the pages generated by DoubleWiki.
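For context, a page excluded this way would carry a robots meta tag in its HTML head. A minimal sketch of what that tag looks like (not necessarily the extension's exact output):

```html
<!-- tells search-engine crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex,nofollow">
```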
I've pushed a patch here: https://gerrit.wikimedia.org/r/#/c/12883/
*** Bug 41256 has been marked as a duplicate of this bug. ***
Fix merged.
If you search Google for سایمون رتل ویکیپدیا ("Simon Rattle Wikipedia" in Persian), it shows http://fa.wikipedia.org/wiki/%D9%81%DB%8C%D9%84%D8%A7%D8%B1%D9%85%D9%88%D9%86%DB%8C%DA%A9_%D8%A8%D8%B1%D9%84%DB%8C%D9%86?match=en — I think it doesn't work properly.
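For reference, the percent-encoded path in that URL decodes to the Persian article title. A quick standard-library Python check (a sketch for illustration only):

```python
from urllib.parse import unquote

# Decode the percent-encoded article title from the reported URL.
url = ("http://fa.wikipedia.org/wiki/"
       "%D9%81%DB%8C%D9%84%D8%A7%D8%B1%D9%85%D9%88%D9%86%DB%8C%DA%A9"
       "_%D8%A8%D8%B1%D9%84%DB%8C%D9%86?match=en")
# Take the path segment after /wiki/ and drop the query string.
title = unquote(url.split("/wiki/")[1].split("?")[0])
print(title)  # → فیلارمونیک_برلین ("Berlin Philharmonic")
```

So the indexed result is the Persian "Berlin Philharmonic" article with the DoubleWiki `?match=en` parameter appended, which is exactly the kind of duplicate this bug is about.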
Created attachment 11515 [details] google search
The robot policy set up in the previous commit is now overridden by the Article class, so the "noindex, nofollow" isn't output. I have made a new patch that fixes this issue: https://gerrit.wikimedia.org/r/#/c/38785/ Thanks for the report!
(In reply to comment #6)
> The robot policy setup in the past commit is now override by the Article class, so the "noindex, nofollow" isn't outputted. I have made a new patch that fix this issue: https://gerrit.wikimedia.org/r/#/c/38785/
>
> Thanks for the report!

Merged on the 21st.