~sircmpwn/gmni-devel

kineto: Serve robots.txt disallowing all robots v1 APPLIED


Copy & paste the following snippet into your terminal to import this patchset into git:

curl -s https://lists.sr.ht/~sircmpwn/gmni-devel/patches/24630/mbox | git am -3

[PATCH kineto] Serve robots.txt disallowing all robots

mbays
This overrides any robots.txt file in the proxied gemini capsule, on the
basis that this is intended for gemini robots (which can be expected to
follow the robots.txt companion spec) rather than web robots.

The main purpose of disallowing web robots, though, is to prevent them
from crawling the proxied cross-site geminispace under /x/, since web
robots won't even know to read the robots.txt files of the other
capsules proxied this way.
---
 main.go | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/main.go b/main.go
index c600985..2b227a0 100644
--- a/main.go
+++ b/main.go
@@ -583,6 +583,12 @@ func main() {
 			return
 		}
 
+		if r.URL.Path == "/robots.txt" {
+			w.WriteHeader(http.StatusOK)
+			w.Write([]byte("User-agent: *\nDisallow: /\n"))
+			return
+		}
+
 		req := gemini.Request{}
 		req.URL = &url.URL{}
 		req.URL.Scheme = root.Scheme
-- 
2.32.0
Thanks!

To git@git.sr.ht:~sircmpwn/kineto
   a8c54c1..988a00f  master -> master