Hello!

Is there a way to override kineto's handling of its own robots.txt so that it uses my own robots.txt instead? On my site, nginx redirects requests to /x/ to "/noextproxy.gmi" anyway, and I want to allow some search engines to index at least my own site. So I would use a robots.txt like this:

```
User-agent: *
Allow: /
Disallow: /x/
```

Thanks for your help,
-fab-

--
WWW: https://redterminal.org/
Gemini: gemini://redterminal.org/
Gopher: gopher://redterminal.org/
Keyoxide: https://keyoxide.org/hkp/fab@redterminal.org
gpg --auto-key-locate hkps://keys.openpgp.org --locate-keys fab@redterminal.org
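For reference, a minimal sketch of the /x/ redirect described above; the exact directives and status code are assumptions, not taken from the actual site configuration:

```
# Hypothetical sketch: send requests for kineto's external-proxy
# path (/x/...) to a local "no external proxying" page instead.
location /x/ {
    return 302 /noextproxy.gmi;
}
```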
* -fab- <fab@redterminal.org> [2024-09-05 22:23 CEST]:
> Hello!
>
> Is there a way to override kineto's handling of its own robots.txt so
> that it uses my own robots.txt instead?

Never mind. I pointed the "location /robots.txt" block in nginx at my own robots.txt. Problem solved.

Have fun,
-fab-

--
WWW: https://redterminal.org/
Gemini: gemini://redterminal.org/
Gopher: gopher://redterminal.org/
Keyoxide: https://keyoxide.org/hkp/fab@redterminal.org
gpg --auto-key-locate hkps://keys.openpgp.org --locate-keys fab@redterminal.org
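A minimal sketch of this kind of override, assuming the custom robots.txt is stored at a path like /var/www/robots.txt (the path and exact directives are assumptions, not quoted from the original mail):

```
# Hypothetical sketch: serve a local robots.txt directly from nginx
# instead of passing the request through to kineto.
location = /robots.txt {
    alias /var/www/robots.txt;
}
```

Because the exact-match location wins over the proxied locations, requests for /robots.txt never reach kineto, so its built-in robots.txt handling is effectively replaced.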