Add robots.txt to block crawling of sign-in page#475

Merged
arkadiykondrashov merged 2 commits into main from
session/agent_61652ab6-22c8-40a1-9a89-464fa40ed174
Feb 24, 2026

Conversation

@kiloconnect (Contributor) commented Feb 23, 2026

Adds a public/robots.txt file that disallows all crawlers from crawling the /users/sign_in page.

Since this is a Next.js app, files in public/ are served at the root — so this will be available at https://app.kilo.ai/robots.txt.
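The file's contents aren't quoted in the thread; based on the review summary below (a valid robots.txt blocking all crawlers from /users/sign_in), it would presumably look something like this sketch:

```txt
# public/robots.txt — served at https://app.kilo.ai/robots.txt
User-agent: *
Disallow: /users/sign_in
```

Note that robots.txt controls crawling, not indexing: a page that is disallowed here can still appear in search results if other pages link to it; a `noindex` meta tag or header would be needed to prevent indexing outright.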


Built for Brendan by Kilo for Slack

@kiloconnect (Contributor, Author) commented Feb 23, 2026

Code Review Summary

Status: No Issues Found | Recommendation: Merge

Files Reviewed (1 file)
  • public/robots.txt — New file; valid robots.txt format blocking crawlers from /users/sign_in

@RSO (Contributor) left a comment


Seems like we could instead move it to app/robots.txt: https://nextjs.org/docs/app/api-reference/file-conventions/metadata/robots — not sure what the difference is, though.
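For reference, the Next.js file convention linked above accepts either a static app/robots.txt or a generated app/robots.ts. A sketch of the dynamic variant, assuming the same single-rule policy as this PR (the rule contents are an assumption, since the actual file isn't shown here):

```typescript
// app/robots.ts — Next.js App Router metadata file convention.
// Next.js serves the returned object as /robots.txt at the site root.
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      disallow: '/users/sign_in',
    },
  }
}
```

The practical difference is small: public/robots.txt is served verbatim as a static asset, while app/robots.ts lets the rules be computed at build or request time (e.g. disallowing everything on a staging deployment). Only one of the two should exist, since both map to the same /robots.txt route.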

@arkadiykondrashov arkadiykondrashov merged commit 75111c0 into main Feb 24, 2026
12 checks passed
@arkadiykondrashov arkadiykondrashov deleted the session/agent_61652ab6-22c8-40a1-9a89-464fa40ed174 branch February 24, 2026 08:19