
Conversation

@l3pr-org

Developer Certificate of Origin

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or

(b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or

(c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.

(d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.

PR

Resolves #221

This PR lets users allow or deny indexing of their OTS instance by search engines (those that abide by robots.txt). The default behavior is to deny indexing for all user agents on all paths. Unless explicitly overridden (`disableSearchIndex: false`), a robots.txt file with the following contents is served at /robots.txt:

User-agent: *
Disallow: /

I pulled the branch and ran it with no issues. Everything compiled without errors, and I was able to run `go run . --config config-file.yaml` with `disableSearchIndex: false` to change /robots.txt back to a 404 error.

The wiki must be updated with the new configuration option if this PR is merged.

Add a DisableSearchIndex configuration option that defaults to true. A bool pointer is used so that the default behavior (serving robots.txt) applies even when no configuration is provided.

Signed-off-by: Stanley <[email protected]>
Add a helper function that creates robots.txt unless it is explicitly disabled with DisableSearchIndex: false, and call the function on server init.

Signed-off-by: Stanley <[email protected]>
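The bool-pointer default described in the commits above can be sketched as follows. This is a minimal sketch, not the PR's actual code; the `Customization` struct name and the `searchIndexDisabled` helper are assumptions for illustration.

```go
package main

import "fmt"

// Customization sketches the relevant part of the config; only the
// DisableSearchIndex field is taken from the PR diff.
type Customization struct {
	DisableSearchIndex *bool `yaml:"disableSearchIndex"`
}

// searchIndexDisabled resolves the *bool: nil (option unset) defaults to
// true, so robots.txt is served unless the user opts out explicitly.
func searchIndexDisabled(c Customization) bool {
	if c.DisableSearchIndex == nil {
		return true
	}
	return *c.DisableSearchIndex
}

func main() {
	f := false
	fmt.Println(searchIndexDisabled(Customization{}))                        // unset: indexing denied
	fmt.Println(searchIndexDisabled(Customization{DisableSearchIndex: &f})) // explicit opt-out
}
```

The pointer lets the code distinguish "option absent" (nil) from "explicitly set to false", which a plain bool cannot do.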
@l3pr-org l3pr-org changed the title Allow disabling of search engine indexing via configuration file Allow to disable search engine indexing via configuration Sep 19, 2025
@l3pr-org l3pr-org changed the title Allow to disable search engine indexing via configuration feat: disable search engine indexing via configuration Sep 25, 2025
DisableAppTitle bool `json:"disableAppTitle,omitempty" yaml:"disableAppTitle"`
DisablePoweredBy bool `json:"disablePoweredBy,omitempty" yaml:"disablePoweredBy"`
DisableQRSupport bool `json:"disableQRSupport,omitempty" yaml:"disableQRSupport"`
DisableSearchIndex *bool `json:"disable-search-index" yaml:"disableSearchIndex" default:"true"`
Owner:

The default here has no effect and should not be there.

Instead of using a double negation, just name the parameter EnableSearchIndex with type bool. That saves code and defaults to false.

Stick to the naming convention of the other parameters; as an aside, the json tag is not needed here since this is not a frontend-facing parameter.
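The owner's suggestion relies on Go's zero value: a plain bool is false unless set, so no pointer or struct-tag default is needed. A minimal sketch (the surrounding struct name is assumed):

```go
package main

import "fmt"

// Customization sketches the owner's suggested field: a plain bool whose
// zero value (false) is already the desired default of "no indexing".
type Customization struct {
	EnableSearchIndex bool `yaml:"enableSearchIndex"`
}

func main() {
	var c Customization // no configuration provided at all
	fmt.Println(c.EnableSearchIndex) // false: indexing stays disabled by default
}
```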

}

w.Header().Set("Content-Type", "text/plain; charset=utf-8")
w.Header().Set("X-Content-Type-Options", "nosniff")
Owner:

Why did you add this?


func handleRobotsTXT(w http.ResponseWriter, _ *http.Request) {
// If explicitly set to false, do not create robots.txt.
if cust.DisableSearchIndex != nil && !*cust.DisableSearchIndex {
Owner:

Simplify: rename the parameter, omit this check, and deliver an Allow: / when enableSearchIndex: true instead of a 404, to keep it consistent.
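The suggested simplification, where /robots.txt always responds and only the directive flips, could look roughly like this. A sketch under the assumption that the option is renamed enableSearchIndex; the `robotsBody` helper is hypothetical.

```go
package main

import (
	"fmt"
	"net/http"
)

// robotsBody builds the robots.txt payload for either setting.
func robotsBody(enableSearchIndex bool) string {
	rule := "Disallow"
	if enableSearchIndex {
		rule = "Allow"
	}
	return fmt.Sprintf("User-agent: *\n%s: /\n", rule)
}

// handleRobotsTXT always answers with a directive, so /robots.txt never
// returns a 404 regardless of the configuration.
func handleRobotsTXT(enableSearchIndex bool) http.HandlerFunc {
	return func(w http.ResponseWriter, _ *http.Request) {
		w.Header().Set("Content-Type", "text/plain; charset=utf-8")
		fmt.Fprint(w, robotsBody(enableSearchIndex))
	}
}

func main() {
	fmt.Print(robotsBody(false)) // default: deny all crawlers on all paths
}
```

Serving an explicit Allow: / avoids the asymmetry of a file that exists in one configuration and 404s in the other.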



Development

Successfully merging this pull request may close these issues.

(Feature) Configurable allow/deny search indexing via robots.txt
