Add Observatory docs to MDN #33793
This pull request has merge conflicts that must be resolved before it can be merged.

@dipikabh the problem with this is that short titles don't work with this type of sidebar (I think they only work in API sidebars). I tried this out the first time around when I noticed the titles were a bit long in the sidebar.
| [TLS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#tls_configuration) | Medium | Medium | Yes | Use the most secure [Transport Layer Security](/en-US/docs/Glossary/TLS) (TLS) configuration available for your user base. |
| [Resource loading](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#resource_loading) | Maximum | Low | Yes | Load both passive and active resources via HTTPS. |
| [HTTP redirection](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_redirections) | Maximum | Low | Yes | Websites must redirect to HTTPS; API endpoints should disable HTTP entirely. |
| [HSTS implementation](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_strict_transport_security_implementation) | High | Low | Yes | Notify user agents to connect to sites only over HTTPS, even if the original scheme chosen was HTTP, using HTTP Strict Transport Security (HSTS). |
Ah, didn't notice the close proximity of "to"s but doesn't seem too bad to me readability-wise
Preview pages for the last few files did not update, I am guessing because of an incomplete check (check-redirects).
I'm going by your updates/responses and the overall content is looking in good shape. Leaving my +1 here. Let me know if you need me to check anything again. Thanks @chrisdavidmills!
Some links might need to be updated
Use `robots.txt` to reduce website load and stop unsuitable content appearing in search results.

Using `robots.txt` is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent these sites from appearing in search engine results, it does not secure websites against attackers who can still determine such details because `robots.txt` is publicly accessible.
Nit: WDYT about moving the sentences around a bit? The part that it doesn't secure websites should stand out on its own, possibly even highlighted as a note.
Suggested change:

Use `robots.txt` to reduce website load and stop unsuitable content appearing in search results. Using this file is optional and sites should use it only for these purposes. Don't use `robots.txt` as a way to prevent the disclosure of private information or to hide portions of a website.

While using this file can prevent these sites from appearing in search engine results, it does not secure websites against attackers who can still determine such details because `robots.txt` is publicly accessible.
- `Domain`
  - : Cookies should only have a `Domain` set if they need to be accessible on other domains; this should be set to the most restrictive domain possible.
- `Path`
  - : Cookies should be set to the most restrictive `Path` possible; for most applications, this will be set to the root directory.
Suggested change:

- : Cookies should be set to the most restrictive `Path` possible.

This line is a bit confusing, as it starts by saying to set `Path` to the most restrictive value, but then follows by saying that most of the time it's set to the root directory, which is the least restrictive value.
Agreed that this is confusingly phrased. I have made this change in my next commit.
## Problem

Cookies often contain session identifiers or other sensitive information. Unwanted access to cookies, therefore, can cause a host of problems, including [privacy](/en-US/docs/Web/Privacy) issues, {{Glossary("Cross-site_scripting", "Cross-site scripting (XSS)")}} attacks, Cross-site request forgery ([CSRF](/en-US/docs/Glossary/CSRF)) attacks, and more.
This phrase, "Unwanted access to cookies, therefore, can cause a host of problems", is confusing to me. Maybe change "Unwanted" to "Unauthorized"?
This is much clearer; updated. Thanks!
- `Path`
  - : Cookies should be set to the most restrictive `Path` possible; for most applications, this will be set to the root directory.
- `SameSite`
  - : Forbid sending the cookie via cross-origin requests (for example from {{htmlelement("img")}} element), as a strong [anti-CSRF](/en-US/docs/Web/Security/Practical_implementation_guides/CSRF_prevention) measure. `SameSite` is also useful in protecting against [Clickjacking](/en-US/docs/Glossary/Clickjacking) attacks, in cases that rely on the user being authenticated. You should use one of the following two values:
Suggested change:

- : Forbid sending the cookie via cross-origin requests (for example from an {{htmlelement("img")}} element), as a strong [anti-CSRF](/en-US/docs/Web/Security/Practical_implementation_guides/CSRF_prevention) measure by setting `SameSite` to `Strict`. `SameSite` is also useful in protecting against [Clickjacking](/en-US/docs/Glossary/Clickjacking) attacks, in cases that rely on the user being authenticated. You should use one of the following two values:
I've made a couple of updates here, but I've not added "by setting `SameSite` to `Strict`", because the end of the paragraph and the following sub-bullets detail which values to use.

Are you saying you think it should always be set to `Strict`? I thought `Lax` was OK if `Strict` caused problems?
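As a concrete sketch of the restrictive settings discussed in this thread, the attributes combine into a header along these lines (the cookie name and value are hypothetical, not taken from the docs under review):

```http
Set-Cookie: sessionid=38afes7a8; Secure; HttpOnly; Path=/; SameSite=Strict
```

Note that `Domain` is deliberately omitted here, which restricts the cookie to the host that set it.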
Disallow: /
```

Hide certain directories (this is not recommended):
Can we add details as to why it's bad to hide certain directories from a crawler but not the entire site? I don't know the "why" behind this.
I've looked into this a bit more, and improved the content of the "Solution" section to make this clearer:

> Using `robots.txt` is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent pages from appearing in search engine results, it does not secure websites against attackers. In fact, it can help them: `robots.txt` is publicly accessible, and by adding your sensitive page paths to it, you are showing attackers exactly where they are.
>
> Also be aware that some robots will ignore your `robots.txt` file, for example, malware robots and email address harvesters.
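To illustrate the point above with a minimal sketch (the directory names are hypothetical):

```
# Safe: blocks all crawling without naming anything sensitive
User-agent: *
Disallow: /

# Not recommended: robots.txt is public, so listing paths like these
# advertises exactly where the sensitive content lives
User-agent: *
Disallow: /admin/
Disallow: /internal-reports/
```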
## Problem

Attackers can modify the contents of JavaScript libraries hosted on content delivery networks (CDNs), creating vulnerabilities in all websites that use these libraries.
Suggested change:

If an attacker exploited a content delivery network (CDN) and modified the contents of JavaScript libraries hosted on that CDN, it would create vulnerabilities in all websites that use those libraries.
updated
### Solution

HTTP [`Strict-Transport-Security`](/en-US/docs/Web/HTTP/Headers/Strict-Transport-Security) (HSTS) is an HTTP header that notifies browsers to connect to a given site only over HTTPS, even if the originally specified scheme was HTTP. Browsers with HSTS set for a given site will automatically upgrade all requests to HTTPS. HSTS also tells browsers to treat TLS and certificate-related errors more strictly by disabling the ability to bypass the error page.
Suggested change:

HTTP [`Strict-Transport-Security`](/en-US/docs/Web/HTTP/Headers/Strict-Transport-Security) (HSTS) is an HTTP header that notifies browsers to connect to a given site only over HTTPS, even if the originally specified scheme was HTTP. Browsers with HSTS set for a given site will automatically upgrade all requests to HTTPS for that site. HSTS also tells browsers to treat TLS and certificate-related errors more strictly by disabling the ability to bypass the certificate error page.
updated
- `max-age`
  - : Sets the duration, in seconds, for which browsers will redirect to HTTPS.
- `includeSubDomains` {{optional_inline}}
  - : Specifies whether browsers should upgrade requests on all subdomains to HTTPS. For example, setting `includeSubDomains` on `domain.example.com` will ensure that requests to `host1.domain.example.com` and `host2.domain.example.com` are upgraded.
Suggested change:

- : Specifies whether browsers should upgrade requests on all subdomains to HTTPS. For example, setting `includeSubDomains` on `domain.example.com` will ensure that requests to `host1.domain.example.com` and `host2.domain.example.com` are also upgraded in addition to `domain.example.com`.
updated
- `includeSubDomains` {{optional_inline}}
  - : Specifies whether browsers should upgrade requests on all subdomains to HTTPS. For example, setting `includeSubDomains` on `domain.example.com` will ensure that requests to `host1.domain.example.com` and `host2.domain.example.com` are upgraded.
- `preload` {{optional_inline}}
  - : Specifies whether the site should be preloaded. Including this directive means your site will be included in the [HSTS preload list](https://hstspreload.org/).
Suggested change:

- : Specifies whether the site should be preloaded. Including this directive means your site can be included in the [HSTS preload list](https://hstspreload.org/).
updated
1. Set a `max-age` value of at least six months (`15768000`). Longer periods, such as two years (`63072000`), are recommended. Once this value is set, the site must continue to support HTTPS until the expiry time is reached.
2. If possible, set `includeSubDomains` to improve security on all subdomains. Careful testing is needed when setting this directive because it could disable sites on subdomains that don't yet have HTTPS enabled.
3. If possible, set `preload` to include your website in the [HSTS preload list](https://hstspreload.org/). Web browsers will perform HTTPS upgrades to preloaded sites before receiving the initial `Strict-Transport-Security` header. This prevents [downgrade attacks](https://en.wikipedia.org/wiki/Downgrade_attack) upon first use and is recommended for all high-risk websites. Note that being included in the HSTS preload list also requires `includeSubDomains` to be set and `max-age` to be set to a minimum of 1 year (`31536000`).
Suggested change:

3. If possible, set `preload` to make it possible to include your website in the [HSTS preload list](https://hstspreload.org/). Web browsers will perform HTTPS upgrades to preloaded sites before receiving the initial `Strict-Transport-Security` header. This prevents [downgrade attacks](https://en.wikipedia.org/wiki/Downgrade_attack) upon first use and is recommended for all high-risk websites. Note that being included in the HSTS preload list also requires `includeSubDomains` to be set and `max-age` to be set to a minimum of 1 year (`31536000`).

Just adding `preload` to your HSTS header doesn't add you to the preload list; you still have to submit your site via the form.
Thanks for clarifying that; I wanted to make that clear, but the site wasn't exactly clear about what was required. I have implemented your change and updated the bullet further, to:

> - If possible, set `preload` to make it possible to include your website in the HSTS preload list. To add it to the list, visit https://hstspreload.org/ and enter your site URL into the form at the top of the page, fixing any issues that it mentions. Web browsers will perform HTTPS upgrades to preloaded sites before receiving the initial `Strict-Transport-Security` header. This prevents downgrade attacks upon first use and is recommended for all high-risk websites. Note that being included in the HSTS preload list also requires `includeSubDomains` to be set and `max-age` to be set to a minimum of 1 year (`31536000`).
Does this sound OK?
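Putting the recommendations in this thread together, the resulting header would look something like this sketch (using the two-year `max-age` suggested earlier):

```http
Strict-Transport-Security: max-age=63072000; includeSubDomains; preload
```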
{{QuickLinksWithSubpages("/en-US/docs/Web/Security")}}

Users frequently input sensitive data on websites, such as names, addresses, and banking details. As a web developer, it's crucial to protect this information from bad actors who use a wide range of exploits to steal such information and use it for personal gain. The focus of [web security](/en-US/docs/Web/Security) is to help you protect your website against these exploits and secure your users' sensitive data.
Suggested change:

Users frequently input sensitive data on websites, such as names, addresses, passwords, and banking details. As a web developer, it's crucial to protect this information from bad actors who use a wide range of exploits to steal such information and use it for personal gain. The focus of [web security](/en-US/docs/Web/Security) is to help you protect your website against these exploits and secure your users' sensitive data.
updated
@gene1wood thanks for the review! I've fixed most things, but left a few comments asking for clarification on some points.
| [TLS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#tls_configuration) | Medium | Medium | Yes | Use the most secure [Transport Layer Security](/en-US/docs/Glossary/TLS) (TLS) configuration available for your user base. |
| TLS: [Resource loading](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#resource_loading) | Maximum | Low | Yes | Load both passive and active resources via HTTPS. |
| TLS: [HTTP redirection](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_redirection) | Maximum | Low | Yes | Websites must redirect to HTTPS; API endpoints should disable HTTP entirely. |
| TLS: [HSTS implementation](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_strict_transport_security_implementation) | High | Low | Yes | Notify user agents to connect to sites only over HTTPS, even if the original scheme chosen was HTTP, using HTTP Strict Transport Security (HSTS). |
Suggested change:

| TLS: [HSTS implementation](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_strict_transport_security_implementation) | High | Low | Yes | Notify user agents to connect to sites only over HTTPS, even if the original scheme chosen was HTTP, by using HTTP Strict Transport Security (HSTS). |
| [Clickjacking prevention](/en-US/docs/Web/Security/Practical_implementation_guides/Clickjacking) | High | Low | Yes | Control how your site may be framed within an {{htmlelement("iframe")}} to prevent [clickjacking](/en-US/docs/Glossary/Clickjacking). |
| [CSRF prevention](/en-US/docs/Web/Security/Practical_implementation_guides/CSRF_prevention) | High | Unknown | Varies | Protect against [Cross-site request forgery](/en-US/docs/Glossary/CSRF) (CSRF) using `SameSite` cookies and anti-CSRF tokens. |
| [Secure cookie configuration](/en-US/docs/Web/Security/Practical_implementation_guides/Cookies) | High | Medium | Yes | Set all cookies as restrictively as possible. |
| [CORP implementation](/en-US/docs/Web/Security/Practical_implementation_guides/CORP) | High | Medium | Yes | Protect against speculative side-channel attacks using Cross-Origin Resource Policy (CORP). |
Suggested change:

| [CORP implementation](/en-US/docs/Web/Security/Practical_implementation_guides/CORP) | High | Medium | Yes | Protect against speculative side-channel attacks by using Cross-Origin Resource Policy (CORP). |
| [CORP implementation](/en-US/docs/Web/Security/Practical_implementation_guides/CORP) | High | Medium | Yes | Protect against speculative side-channel attacks using Cross-Origin Resource Policy (CORP). |
| [MIME type verification](/en-US/docs/Web/Security/Practical_implementation_guides/MIME_types) | Low | Low | No | Verify that all your websites are setting the proper [MIME types](/en-US/docs/Glossary/MIME_type) for all resources. |
| [CSP implementation](/en-US/docs/Web/Security/Practical_implementation_guides/CSP) | High | High | Yes | Provide fine-grained control over where site resources can be loaded from with [Content Security Policy](/en-US/docs/Glossary/CSP) (CSP). |
| [CORS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/CORS) | High | Low | Yes | Define which non-same origins are allowed to access the content of pages and have resources loaded from them with [Cross-Origin Resource Sharing](/en-US/docs/Glossary/CORS) (CORS). |
Suggested change:

| [CORS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/CORS) | High | Low | Yes | Define the non-same origins that are allowed to access the content of pages and have resources loaded from them by using [Cross-Origin Resource Sharing](/en-US/docs/Glossary/CORS) (CORS). |
| [CORS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/CORS) | High | Low | Yes | Define which non-same origins are allowed to access the content of pages and have resources loaded from them with [Cross-Origin Resource Sharing](/en-US/docs/Glossary/CORS) (CORS). |
| [Referrer policy configuration](/en-US/docs/Web/Security/Practical_implementation_guides/Referrer_policy) | Low | Low | Yes | Improve privacy for users and prevent leaking of internal URLs via the {{httpheader("Referer")}} header. |
| [robots.txt configuration](/en-US/docs/Web/Security/Practical_implementation_guides/Robots_txt) | Low | Low | No | Tell robots (such as search engine indexers) how to behave by instructing them not to crawl certain paths on the website. |
| [SRI implementation](/en-US/docs/Web/Security/Practical_implementation_guides/SRI) | Low | Low | No | Verify that fetched resources (for example, from a CDN) are delivered without unexpected manipulation using [Subresource Integrity](/en-US/docs/Glossary/SRI) (SRI). |
Suggested change:

| [SRI implementation](/en-US/docs/Web/Security/Practical_implementation_guides/SRI) | Low | Low | No | Verify that fetched resources (for example, from a CDN) are delivered without unexpected manipulation by using [Subresource Integrity](/en-US/docs/Glossary/SRI) (SRI). |
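As a side note on the SRI row above: an integrity value is just a base64-encoded cryptographic digest of the resource bytes, prefixed with the hash algorithm's name. A minimal sketch in Python (the file contents here are a placeholder):

```python
import base64
import hashlib

def sri_hash(data: bytes, algorithm: str = "sha384") -> str:
    """Compute a Subresource Integrity value for the given resource bytes."""
    digest = hashlib.new(algorithm, data).digest()
    return f"{algorithm}-{base64.b64encode(digest).decode()}"

# Hash the contents of a (hypothetical) library file
print(sri_hash(b"console.log('hello');"))
```

The resulting string would go into a `<script>` element's `integrity` attribute, typically alongside `crossorigin="anonymous"` for CDN-hosted resources.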
This page lists guides that detail the best practices for implementing security features on websites. While these guides do not cover all possible security scenarios and cannot guarantee complete security of your website, following the information and best practices in these guides will make your sites significantly more secure.

## Content security fundamentals
Sorry, coming back to these headings. Currently, the H2s seem a bit non-parallel:

```
# Practical security implementation guides
## Content security fundamentals
## User information security
```

WDYT about these H2s:

```
# Practical security implementation guides
## Fundamentals of securing website content
## Securing user information
```

or

```
# Practical security implementation guides
## Securing website content
## Securing user information
```
## Solution

Use `Cross-Origin-Resource-Policy` to block [`no-cors`](/en-US/docs/Web/API/fetch#mode) cross-origin and/or cross-site requests to the given resource. Use the most restrictive value possible for your site: `same-site` or `same-origin` are recommended.
Suggested change:

Use `Cross-Origin-Resource-Policy` to block [`no-cors`](/en-US/docs/Web/API/fetch#mode) cross-origin and/or cross-site requests to the given resource. Use the most restrictive value possible for your site; `same-origin` or `same-site` is recommended.
As this policy is expressed via a response header, the actual request is not prevented — rather, the browser prevents the result from being leaked by stripping the response body.
Suggested change:
- As this policy is expressed via a response header, the actual request is not prevented — rather, the browser prevents the result from being leaked by stripping the response body.
+ As this policy is expressed via a response header, the actual request is not prevented. Instead, the browser prevents the result from being leaked by stripping the response body.
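As an illustrative, non-normative sketch of the decision the browser makes when enforcing this header, here is a simplified Python model (it deliberately ignores request mode, navigation, and embedder policy, which real browsers also consult):

```python
def corp_allows_response(policy, same_origin, same_site):
    """Simplified sketch of the browser-side Cross-Origin-Resource-Policy check.

    Returns True if the response body may be exposed to the requesting
    context, False if the browser would strip it. Illustration only.
    """
    if policy is None or policy.lower() == "cross-origin":
        return True          # no restriction, or explicitly allowed
    if policy.lower() == "same-origin":
        return same_origin   # only same-origin requests may read the body
    if policy.lower() == "same-site":
        return same_site     # same-site (which includes same-origin) allowed
    return True              # unrecognized values are ignored


# A cross-origin but same-site request: blocked by same-origin, allowed by same-site.
print(corp_allows_response("same-origin", same_origin=False, same_site=True))  # False
print(corp_allows_response("same-site", same_origin=False, same_site=True))    # True
```

Note that the request itself still reaches the server in this model, matching the header's semantics: only the caller's visibility into the response is restricted.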
## Examples
Instruct browsers to disallow cross-origin no-cors requests: |
or this?
Suggested change:
- Instruct browsers to disallow cross-origin no-cors requests:
+ Instruct browsers to disallow cross-origin requests with `no-cors` mode:
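For instance, the recommendation discussed above would be expressed as the following response header (a minimal illustration; `same-site` is the other commonly recommended value):

```http
Cross-Origin-Resource-Policy: same-origin
```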
## See also
- [`Access-Control-Allow-Origin`](/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Origin)
- [Cross-Origin Resource Sharing (CORS)](/en-US/docs/Web/HTTP/CORS)
CORS can be moved to the end of the list
Using `robots.txt` is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent pages from appearing in search engine results, it does not secure websites against attackers. In fact, it can help them: `robots.txt` is publicly accessible, and by adding your sensitive page paths to it, you are showing attackers exactly where they are.
Also be aware that some robots will ignore your `robots.txt` file, for example, malware robots and email address harvesters. |
Suggested change:
- Also be aware that some robots will ignore your `robots.txt` file, for example, malware robots and email address harvesters.
+ Also be aware that some robots, such as malware robots and email address harvesters, will ignore your `robots.txt` file.
Use `robots.txt` to reduce website load and stop unsuitable content appearing in search results.
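A minimal `robots.txt` sketch of this use (all paths and bot names here are hypothetical, for illustration only):

```
# Keep all crawlers out of resource-heavy search result pages
User-agent: *
Disallow: /search/

# Exclude one (hypothetical) crawler entirely
User-agent: ExampleBot
Disallow: /
```

As the guide notes, this only steers well-behaved crawlers; it provides no access control whatsoever.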
Suggested change:
- Using `robots.txt` is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent pages from appearing in search engine results, it does not secure websites against attackers. In fact, it can help them: `robots.txt` is publicly accessible, and by adding your sensitive page paths to it, you are showing attackers exactly where they are.
+ Using `robots.txt` is optional and should only be used for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent pages from appearing in search engine results, it does not secure websites against attackers. On the contrary, it can unintentionally help them: `robots.txt` is publicly accessible, and by adding your sensitive page paths to it, you are showing their locations to potential attackers.
### Description

Mozilla Observatory is being moved to MDN. This PR adds the accompanying cheat sheet docs to MDN, and restructures the existing security docs to make space for them.

### Motivation

### Additional details

See mdn/mdn#548 for the MDN project tracking item.

### Related issues and pull requests