feat: disable search engine indexing via configuration #222
base: master
Conversation
Add DisableSearchIndex configuration option that defaults to true. Using a bool pointer so that we can set the default behavior to place robots.txt even when no configuration is provided. Signed-off-by: Stanley <[email protected]>
Add helper function to create robots.txt unless explicitly disabled with DisableSearchIndex: false and call the function on server init. Signed-off-by: Stanley <[email protected]>
DisableAppTitle    bool  `json:"disableAppTitle,omitempty" yaml:"disableAppTitle"`
DisablePoweredBy   bool  `json:"disablePoweredBy,omitempty" yaml:"disablePoweredBy"`
DisableQRSupport   bool  `json:"disableQRSupport,omitempty" yaml:"disableQRSupport"`
DisableSearchIndex *bool `json:"disable-search-index" yaml:"disableSearchIndex" default:"true"`
The default here has no effect and should not be there.

Instead of a double negation, you should just name the parameter EnableSearchIndex with type bool. That saves code and is false by default.

Stick to the naming convention of the other parameters; also, the json tag is not needed here, as this is not a frontend-facing parameter.
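A minimal sketch of this suggestion (EnableSearchIndex is the reviewer's proposed rename, not the field in the PR; the surrounding struct is abbreviated). Go's zero value already yields the desired "disabled by default" behavior, so neither a pointer type nor a default struct tag is needed:

```go
package main

import "fmt"

// Abbreviated customization struct, following the review comment:
// a plain bool whose zero value (false) is the default, and no json
// tag since the parameter is not frontend-facing.
type customize struct {
	DisableAppTitle   bool `yaml:"disableAppTitle"`
	EnableSearchIndex bool `yaml:"enableSearchIndex"`
}

func main() {
	var cust customize // no configuration provided
	// The zero value gives "indexing disabled" with no extra code.
	fmt.Println(cust.EnableSearchIndex) // prints: false
}
```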
}
w.Header().Set("Content-Type", "text/plain; charset=utf-8")
w.Header().Set("X-Content-Type-Options", "nosniff")
Why did you add this?
func handleRobotsTXT(w http.ResponseWriter, _ *http.Request) {
	// If explicitly set to false, do not create robots.txt.
	if cust.DisableSearchIndex != nil && !*cust.DisableSearchIndex {
Simplify: rename the parameter, omit this check, and deliver an Allow: / when enableSearchIndex: true instead of a 404, to keep the route consistent.
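Under this proposal, the handler could look like the sketch below (the enableSearchIndex naming and the cust placeholder follow the review discussion; this is not the merged implementation). The route always serves a robots.txt, switching only the directive:

```go
package main

import (
	"fmt"
	"net/http"
)

// cust stands in for the server's customization config; the
// EnableSearchIndex field is the reviewer's proposed rename.
var cust = struct{ EnableSearchIndex bool }{}

// robotsBody renders the robots.txt payload: Allow: / when indexing
// is enabled, Disallow: / otherwise.
func robotsBody(enabled bool) string {
	directive := "Disallow: /"
	if enabled {
		directive = "Allow: /"
	}
	return fmt.Sprintf("User-agent: *\n%s\n", directive)
}

// handleRobotsTXT always answers with content, keeping /robots.txt
// consistent instead of switching between a file and a 404.
func handleRobotsTXT(w http.ResponseWriter, _ *http.Request) {
	w.Header().Set("Content-Type", "text/plain; charset=utf-8")
	fmt.Fprint(w, robotsBody(cust.EnableSearchIndex))
}

func main() {
	http.HandleFunc("/robots.txt", handleRobotsTXT)
	fmt.Print(robotsBody(cust.EnableSearchIndex))
}
```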
Developer Certificate of Origin
By making a contribution to this project, I certify that:
(a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or
(b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or
(c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.
(d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.
PR
Resolves #221
This PR allows users to allow/deny their OTS instance being indexed by search engines (those that abide by robots.txt). The default behavior is to deny indexing for all user agents on all paths. Unless explicitly disabled (disableSearchIndex: false), a robots.txt file denying all user agents is served on /robots.txt.

I pulled my repo and ran it with no issues. Everything compiled without errors, and I was able to run go run . --config config-file.yaml with disableSearchIndex: false to change /robots.txt back to a 404 error.

The wiki must be updated with the new configuration option if this PR is merged.
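For reference, a robots.txt matching the described default (deny indexing for all user agents on all paths) would presumably read:

```
User-agent: *
Disallow: /
```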