[gopher] New Gopher Proxy

Cameron Kaiser spectre at floodgap.com
Sat Dec 5 23:21:49 UTC 2009


> > If those people are really interested in
> > gopher, then they need to be spidering gopherspace directly.
> 
> Why bother with an HTTP proxy to gopherspace if we want to prevent
> HTTP-based spiders from accessing it?

Actual users, perhaps? ;-)

> What might be useful is to figure out a way to serve up robots.txt to
> HTTP-based spiders to prevent crawling of certain gopherspace pages
> (as you brought up in your previous e-mail).  

I wouldn't be opposed to that idea, but I can't think of an easy way to do
that given the way most proxies are configured and the way most HTTP bots
actually fetch robots.txt.

The other alternative is to have the proxies fetch a robots.txt from the
remote server and then expose it in <meta> tags. But this is certainly not
efficient without caching, and adds complexity.
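For illustration only, here is a minimal sketch (not from the original post; the function names and the noindex policy are invented for the example) of how a proxy might turn Disallow rules from a fetched robots.txt into a <meta> robots tag for the translated page:

```python
# Hypothetical sketch of the robots.txt-to-<meta> idea discussed above.
# Assumes the proxy has already fetched robots.txt from the gopher server
# (e.g. by sending the selector "robots.txt" per RFC 1436 framing).

def robots_meta_tag(robots_txt: str, selector: str) -> str:
    """Return a <meta name="robots"> tag if any Disallow rule covers selector,
    otherwise an empty string. Very simplified: ignores User-agent sections."""
    disallowed = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:                           # empty Disallow means allow all
                disallowed.append(path)
    if any(selector.startswith(p) for p in disallowed):
        return '<meta name="robots" content="noindex, nofollow">'
    return ""

# Example: a selector under a disallowed prefix gets the tag.
print(robots_meta_tag("Disallow: /private", "/private/docs"))
```

Even this toy version shows the cost: every proxied page would need a robots.txt round trip to the origin server unless the result is cached.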

-- 
------------------------------------ personal: http://www.cameronkaiser.com/ --
  Cameron Kaiser * Floodgap Systems * www.floodgap.com * ckaiser at floodgap.com
-- Predestination was doomed from the start. ----------------------------------