[SAC] Trac robots.txt
Frank Warmerdam
warmerdam at pobox.com
Mon Jan 28 16:44:36 PST 2013
Folks,
I did a bit of checking, and while Trac was working fine for me this
afternoon, I did find that spiders (notably Baidu) were hitting Trac
fairly hard. I see the postgis project did not get added to the
robots.txt. I have written a /var/www/trac/mkrobots.sh script that
produces a robots.txt for *all* Trac projects, and I have run it.
I'll check back in a few days to see how things are going.
Don't hesitate to poke me on IRC if Trac is slow.
I really only want the spiders indexing the wiki and bugs from Trac -
not all the other stuff like views onto svn and timelines.
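For anyone curious, a script along those lines could look roughly like the
sketch below. This is only an illustration, not the actual mkrobots.sh: the
directory layout (one subdirectory per Trac project under a common parent)
and the particular views blocked (svn browser, timeline, changesets, logs)
are assumptions; wiki and ticket pages are left crawlable.

```shell
#!/bin/sh
# Hypothetical sketch of a mkrobots.sh-style generator.
# Writes a robots.txt covering every project directory found
# under the given Trac parent directory.
mkrobots() {
    trac_dir=$1   # parent dir holding one subdir per Trac project
    out=$2        # path of the robots.txt to write
    {
        echo "User-agent: *"
        for proj in "$trac_dir"/*/; do
            [ -d "$proj" ] || continue
            name=$(basename "$proj")
            # Block the crawl-heavy views; wiki and tickets stay indexable.
            echo "Disallow: /$name/browser"
            echo "Disallow: /$name/timeline"
            echo "Disallow: /$name/changeset"
            echo "Disallow: /$name/log"
        done
    } > "$out"
}

# On the server this might be invoked as:
# mkrobots /var/www/trac /var/www/trac/robots.txt
```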
Best regards,
--
---------------------------------------+--------------------------------------
I set the clouds in motion - turn up | Frank Warmerdam, warmerdam at pobox.com
light and sound - activate the windows | http://pobox.com/~warmerdam
and watch the world go round - Rush | Geospatial Software Developer