User-agent: *
Disallow: /data
Disallow: /feedback/?page
Disallow: /galleries/add/*.html
Disallow: /journey/*/maplet*.png
Disallow: /journey/*/change.html
Disallow: /journey/*/return.html
Disallow: /journey/*/*.gpx
Disallow: /journey/*/*.kml
Disallow: /journey/help/osmconversion
Disallow: /journey/places/from/
Disallow: /localauthorities/contacts
Disallow: /location/*/update.html
Disallow: /location/*/delete.html
Allow: /location/*/cyclestreets*-size640.jpg
Allow: /location/*/cyclestreets*-size1200.jpg
Disallow: /location/*/cyclestreets*-size*.jpg
Disallow: /photomap/add/?longitude
Disallow: /photomap/1
Disallow: /photomap/2
Disallow: /photomap/3
Disallow: /photomap/4
Disallow: /photomap/5
Disallow: /photomap/6
Disallow: /photomap/7
Disallow: /photomap/8
Disallow: /photomap/9
Disallow: /photomap/tags/*/page*.html
Disallow: /photos/day
Disallow: /photos/page
Disallow: /photos/space
Disallow: /photos/time
Allow: /photos/all/page*.html
Disallow: /photos/*/page*.html
Disallow: /photos/*/*/page*.html
Disallow: /search/?search
Disallow: /search/page
Disallow: /search/id
Disallow: /search/photoId
Disallow: /search/caption
Disallow: /users/*/*/page
Disallow: /users/*/*/id
Disallow: /users/*/*/plan
Disallow: /users/*/*/name
Disallow: /users/*/*/leaving
Disallow: /users/*/*/length
Disallow: /users/*/*/time
Disallow: /users/*/*/quietness
Disallow: /users/*/*/hasVideo
Disallow: /users/*/*/photoId
Disallow: /users/*/*/caption
Disallow: /view
Disallow: /views/tinkles
Disallow: /tiles
Disallow: /signin/?/
#Disallow: /journey/places/from/
#Disallow: /journey/places/to/

# Slow down Yandex
User-agent: Yandex
Crawl-delay: 5

# Unwanted user-agents that just waste bandwidth/CPU
User-agent: TurnitinBot
User-agent: Nutch
Disallow: /

# AI scraping is NOT PERMITTED without express permission: see: https://www.cyclestreets.net/privacy/
# We additionally list these known agents, but this is not exhaustive and other AI bots not listed are also not permitted
# See: https://neil-clarke.com/block-the-bots-that-feed-ai-models-by-scraping-your-website/
# See: https://github.com/ai-robots-txt/ai.robots.txt
User-agent: AI2Bot
User-agent: Ai2Bot-Dolma
User-agent: Amazonbot
User-agent: anthropic-ai
User-agent: Applebot
User-agent: Applebot-Extended
User-agent: Bytespider
User-agent: CCBot
User-agent: ChatGPT-User
User-agent: Claude-Web
User-agent: ClaudeBot
User-agent: cohere-ai
User-agent: cohere-training-data-crawler
User-agent: Crawlspace
User-agent: Diffbot
User-agent: DuckAssistBot
User-agent: FacebookBot
User-agent: FriendlyCrawler
User-agent: Google-Extended
User-agent: GoogleOther
User-agent: GoogleOther-Image
User-agent: GoogleOther-Video
User-agent: GPTBot
User-agent: iaskspider/2.0
User-agent: ICC-Crawler
User-agent: ImagesiftBot
User-agent: img2dataset
User-agent: ISSCyberRiskCrawler
User-agent: Kangaroo Bot
User-agent: Meta-ExternalAgent
User-agent: Meta-ExternalFetcher
User-agent: OAI-SearchBot
User-agent: omgili
User-agent: omgilibot
User-agent: PanguBot
User-agent: PerplexityBot
User-agent: PetalBot
User-agent: Scrapy
User-agent: SemrushBot-OCOB
User-agent: SemrushBot-SWA
User-agent: Sidetrade indexer bot
User-agent: Timpibot
User-agent: VelenPublicWebCrawler
User-agent: Webzio-Extended
User-agent: YouBot
User-agent: AliyunSecBot
Disallow: /