Received: with ECARTIS (v1.0.0; list gopher); Thu, 19 Jun 2003 16:34:39 -0500 (CDT)
Return-Path:
X-Original-To: gopher@complete.org
Delivered-To: gopher@complete.org
Received: by gesundheit.complete.org (Postfix, from userid 108)
	id 80518183208E; Thu, 19 Jun 2003 16:34:22 -0500 (CDT)
X-Scanned-By: clamscan at complete.org
Received: from heinrich.complete.org (gatekeeper.excelhustler.com [68.99.114.105])
	(using TLSv1 with cipher DHE-RSA-AES256-SHA (256/256 bits))
	(Client CN "christoph.complete.org", Issuer "John Goerzen -- Root CA" (verified OK))
	by gesundheit.complete.org (Postfix) with ESMTP id 871361832095;
	Thu, 19 Jun 2003 16:33:16 -0500 (CDT)
Received: by heinrich.complete.org (Postfix, from userid 1000)
	id C4E5F339; Thu, 19 Jun 2003 14:53:26 -0500 (CDT)
Date: Thu, 19 Jun 2003 14:53:26 -0500
From: John Goerzen
To: gopher@complete.org
Subject: [gopher] Re: Veronica-2 and robot exclusion
Message-ID: <20030619195326.GM6851@complete.org>
References: <200306181324.GAA12136@floodgap.com>
Mime-Version: 1.0
Content-type: text/plain; charset=us-ascii
Content-Disposition: inline
In-Reply-To: <200306181324.GAA12136@floodgap.com>
User-Agent: Mutt/1.5.4i
Content-Transfer-Encoding: 8bit
X-archive-position: 769
X-ecartis-version: Ecartis v1.0.0
Sender: gopher-bounce@complete.org
Errors-to: gopher-bounce@complete.org
X-original-sender: jgoerzen@complete.org
Precedence: bulk
Reply-to: gopher@complete.org
List-help:
List-unsubscribe:
List-software: Ecartis version 1.0.0
List-Id: Gopher
X-List-ID: Gopher
List-subscribe:
List-owner:
List-post:
List-archive:
X-list: gopher

I'm on a train just now and away from the net (horrors!) so I can't check
your URLs, but a quick question:

Is this the same robots.txt format that is used for the Web?  If so, could
one potentially use a single robots.txt file in the root of a site that is
served up both as Gopher and over HTTP?  (As quux.org is, for instance)
That would be a very nice feature.

On Wed, Jun 18, 2003 at 06:24:18AM -0700, Cameron Kaiser wrote:
> Also, I was thinking of making the file ".robots.txt" since many Unix
> gophers don't serve dot-files, although there are a growing number of
> Windows-hosted gophers and I don't know if it will break these (I don't
> do x86 myself).

AFAIK, all Unix gophers *will* serve dot files, many just won't include
them in directory listings.  (Which I think is what you meant anyway)
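
For illustration, a minimal sketch of what such a single shared file might
look like, assuming Veronica-2 honors the same User-agent/Disallow syntax
as web robots.  The agent name "veronica" and the paths here are made-up
examples, not anything defined by the proposal:

    # One robots.txt at the document root, reachable both as
    # http://example.org/robots.txt and as Gopher selector "robots.txt".
    User-agent: veronica
    Disallow: /cgi-bin/
    Disallow: /tmp/

    User-agent: *
    Disallow: /private/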
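
And a minimal sketch of how a crawler could fetch that file over the Gopher
protocol and hand it to an ordinary web-style robots.txt parser.  This is
just an illustration in Python, not how Veronica-2 itself works; the host
name, robot name, and selector are placeholders:

    import socket
    import urllib.robotparser

    def gopher_fetch(host, selector, port=70, timeout=10):
        """Send one selector to a Gopher server and return the raw reply."""
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(selector.encode("ascii") + b"\r\n")
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks)

    raw = gopher_fetch("gopher.example.org", "robots.txt")
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(raw.decode("ascii", "replace").splitlines())
    # May a robot calling itself "veronica" crawl this selector?
    print(parser.can_fetch("veronica", "/archives/"))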