
Add Observatory docs to MDN #33793

Open · wants to merge 59 commits into main

Conversation

@chrisdavidmills (Contributor) commented May 28, 2024

Description

Mozilla Observatory is being moved to MDN. This PR adds the accompanying cheat sheet docs to MDN, and restructures the existing security docs to make space for them.


Additional details

See mdn/mdn#548 for the MDN project tracking item.


@chrisdavidmills requested a review from a team as a code owner May 28, 2024 13:14
@chrisdavidmills requested review from Elchi3 and removed request for a team May 28, 2024 13:14
@chrisdavidmills marked this pull request as draft May 28, 2024 13:15
@github-actions bot added the Content:Security (Security docs) and size/m (51-500 LoC changed) labels May 28, 2024
github-actions bot commented May 28, 2024

Preview URLs (20 pages)
Flaws (5)

Note! 15 documents have no flaws and are not listed. 🎉

URL: /en-US/docs/Web/HTTP/Headers/X-Content-Type-Options
Title: X-Content-Type-Options
Flaw count: 1

  • broken_links:
    • Can't resolve /en-US/observatory/

URL: /en-US/docs/Web/Security
Title: Security on the web
Flaw count: 1

  • broken_links:
    • Can't resolve /en-US/observatory/

URL: /en-US/docs/Web/Security/Practical_implementation_guides
Title: Practical security implementation guides
Flaw count: 1

  • broken_links:
    • Can't resolve /en-US/observatory/

URL: /en-US/docs/Web/Security/Transport_Layer_Security
Title: Transport Layer Security
Flaw count: 1

  • broken_links:
    • Can't resolve /en-US/observatory/

URL: /en-US/docs/Learn/Server-side/Apache_Configuration_htaccess
Title: Apache Configuration: .htaccess
Flaw count: 1

  • broken_links:
    • Can't resolve /en-US/observatory/
External URLs (27)

URL: /en-US/docs/Web/Security
Title: Security on the web


URL: /en-US/docs/Web/Security/Practical_implementation_guides
Title: Practical security implementation guides


URL: /en-US/docs/Web/Security/Practical_implementation_guides/CORP
Title: Cross-Origin Resource Policy (CORP) implementation


URL: /en-US/docs/Web/Security/Practical_implementation_guides/Turning_off_form_autocompletion
Title: How to turn off form autocompletion


URL: /en-US/docs/Web/Security/Practical_implementation_guides/CSP
Title: Content Security Policy (CSP) implementation


URL: /en-US/docs/Web/Security/Practical_implementation_guides/SRI
Title: Subresource integrity (SRI) implementation


URL: /en-US/docs/Web/Security/Practical_implementation_guides/Clickjacking
Title: Clickjacking prevention


URL: /en-US/docs/Web/Security/Practical_implementation_guides/CORS
Title: Cross-Origin Resource Sharing (CORS) configuration


URL: /en-US/docs/Web/Security/Practical_implementation_guides/Cookies
Title: Secure cookie configuration


URL: /en-US/docs/Web/Security/Practical_implementation_guides/Robots_txt
Title: robots.txt configuration


URL: /en-US/docs/Web/Security/Practical_implementation_guides/TLS
Title: Transport Layer Security (TLS) configuration


URL: /en-US/docs/Web/Security/Practical_implementation_guides/CSRF_prevention
Title: Cross-site request forgery (CSRF) prevention


URL: /en-US/docs/Web/Security/Transport_Layer_Security
Title: Transport Layer Security

(comment last updated: 2024-06-19 16:29:01)

@github-actions bot added the size/l (501-1000 LoC changed) label and removed the size/m (51-500 LoC changed) label May 28, 2024
files/en-us/web/security/index.md (outdated review threads, resolved)
chrisdavidmills and others added 3 commits May 28, 2024 14:50
@github-actions bot added the size/xl (>1000 LoC changed) label and removed the size/l (501-1000 LoC changed) label May 30, 2024
chrisdavidmills and others added 10 commits May 30, 2024 17:43
This pull request has merge conflicts that must be resolved before it can be merged.

@chrisdavidmills (Author) commented:

> My preference is to use only the acronym in the sidebar (short-title) and the expanded title on the page, so:

@dipikabh the problem with this is that short titles don't work with this type of sidebar (I think they only work in API sidebars). I tried this out the first time around when I noticed the titles were a bit long in the sidebar.

chrisdavidmills and others added 4 commits June 14, 2024 15:37
| [TLS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#tls_configuration) | Medium | Medium | Yes | Use the most secure [Transport Layer Security](/en-US/docs/Glossary/TLS) (TLS) configuration available for your user base. |
| [Resource loading](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#resource_loading) | Maximum | Low | Yes | Load both passive and active resources via HTTPS. |
| [HTTP redirection](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_redirections) | Maximum | Low | Yes | Websites must redirect to HTTPS; API endpoints should disable HTTP entirely. |
| [HSTS implementation](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_strict_transport_security_implementation) | High | Low | Yes | Notify user agents to connect to sites only over HTTPS, even if the original scheme chosen was HTTP, using HTTP Strict transport security (HSTS). |
Ah, I didn't notice the close proximity of the "to"s, but it doesn't seem too bad to me readability-wise.

@dipikabh left a comment:


Preview pages for the last few files did not update, I'm guessing because of an incomplete check (check-redirects).
I'm going by your updates/responses, and the overall content is looking in good shape. Leaving my +1 here. Let me know if you need me to check anything again. Thanks @chrisdavidmills!

@dipikabh left a comment:


Some links might need to be updated

files/en-us/web/html/attributes/autocomplete/index.md (outdated review threads, resolved)
files/en-us/web/html/element/form/index.md (outdated review thread, resolved)
Comment on lines 17 to 19
Use `robots.txt` to reduce website load and stop unsuitable content appearing in search results.

Using `robots.txt` is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent these sites from appearing in search engine results, it does not secure websites against attackers who can still determine such details because `robots.txt` is publicly accessible.
@dipikabh commented Jun 18, 2024:

Nit: WDYT about moving the sentences around a bit? The part about it not securing websites should stand out on its own; it could even be highlighted as a note.

Suggested change:
- Use `robots.txt` to reduce website load and stop unsuitable content appearing in search results.
- Using `robots.txt` is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent these sites from appearing in search engine results, it does not secure websites against attackers who can still determine such details because `robots.txt` is publicly accessible.
+ Use `robots.txt` to reduce website load and stop unsuitable content appearing in search results. Using this file is optional and sites should use it only for these purposes. Don't use `robots.txt` as a way to prevent the disclosure of private information or to hide portions of a website.
+ While using this file can prevent these sites from appearing in search engine results, it does not secure websites against attackers who can still determine such details because `robots.txt` is publicly accessible.

- `Domain`
- : Cookies should only have a `Domain` set if they need to be accessible on other domains; this should be set to the most restrictive domain possible.
- `Path`
- : Cookies should be set to the most restrictive `Path` possible; for most applications, this will be set to the root directory.


Suggested change:
- - : Cookies should be set to the most restrictive `Path` possible; for most applications, this will be set to the root directory.
+ - : Cookies should be set to the most restrictive `Path` possible.

This line is a bit confusing: it starts by saying to set `Path` to the most restrictive value, but then follows by saying that most of the time it's set to the root directory, which is the least restrictive value.

@chrisdavidmills (Author) replied:

Agreed that this is confusingly phrased. I have made this change in my next commit.
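For reference, a minimal `Set-Cookie` sketch reflecting this advice; the cookie name, value, and path are hypothetical, `Domain` is omitted to keep the cookie host-only, and `Path` is as narrow as the (assumed) application layout allows:

```http
Set-Cookie: sessionId=a3fWa; Path=/account; Secure; HttpOnly
```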


## Problem

Cookies often contain session identifiers or other sensitive information. Unwanted access to cookies, therefore, can cause a host of problems, including [privacy](/en-US/docs/Web/Privacy) issues, {{Glossary("Cross-site_scripting", "Cross-site scripting (XSS)")}} attacks, Cross-site request forgery ([CSRF](/en-US/docs/Glossary/CSRF)) attacks, and more.


This phrase:

> Unwanted access to cookies, therefore, can cause a host of problems

is confusing to me. Maybe change "Unwanted" to "Unauthorized"?

@chrisdavidmills (Author) replied:

This is much clearer; updated. Thanks!

- `Path`
- : Cookies should be set to the most restrictive `Path` possible; for most applications, this will be set to the root directory.
- `SameSite`
- : Forbid sending the cookie via cross-origin requests (for example from {{htmlelement("img")}} element), as a strong [anti-CSRF](/en-US/docs/Web/Security/Practical_implementation_guides/CSRF_prevention) measure. `SameSite` is also useful in protecting against [Clickjacking](/en-US/docs/Glossary/Clickjacking) attacks, in cases that rely on the user being authenticated. You should use one of the following two values:


Suggested change:
- - : Forbid sending the cookie via cross-origin requests (for example from {{htmlelement("img")}} element), as a strong [anti-CSRF](/en-US/docs/Web/Security/Practical_implementation_guides/CSRF_prevention) measure. `SameSite` is also useful in protecting against [Clickjacking](/en-US/docs/Glossary/Clickjacking) attacks, in cases that rely on the user being authenticated. You should use one of the following two values:
+ - : Forbid sending the cookie via cross-origin requests (for example from an {{htmlelement("img")}} element), as a strong [anti-CSRF](/en-US/docs/Web/Security/Practical_implementation_guides/CSRF_prevention) measure by setting `SameSite` to `Strict`. `SameSite` is also useful in protecting against [Clickjacking](/en-US/docs/Glossary/Clickjacking) attacks, in cases that rely on the user being authenticated. You should use one of the following two values:

@chrisdavidmills (Author) replied:

I've made a couple of updates here, but I've not added "by setting SameSite to Strict" — the end of the paragraph and the following sub-bullets detail which values to use.

Are you saying you think it should always be set to Strict? I thought Lax was OK if Strict caused problems?
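To make the two options concrete, here is a hedged sketch (cookie names and values invented): `Strict` for a session cookie, and `Lax` as the fallback for cases where `Strict` breaks legitimate top-level navigations into the site:

```http
Set-Cookie: sessionId=a3fWa; SameSite=Strict; Secure; HttpOnly
Set-Cookie: displayPrefs=compact; SameSite=Lax; Secure
```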

```
User-agent: *
Disallow: /
```

Hide certain directories (this is not recommended):
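A hypothetical sketch of such a configuration (directory names invented for illustration), which the thread below explains is risky because the file itself advertises these paths:

```
User-agent: *
Disallow: /admin/
Disallow: /internal-reports/
```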


Can we add details as to why it's bad to hide certain directories from a crawler but not the entire site? I don't know the "why" behind this.

@chrisdavidmills (Author) replied:

I've looked into this a bit more, and improved the content of the "Solution" section to make this clearer:

> Using robots.txt is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent pages from appearing in search engine results, it does not secure websites against attackers. In fact, it can help them: robots.txt is publicly accessible, and by adding your sensitive page paths to it, you are showing attackers exactly where they are.
>
> Also be aware that some robots will ignore your robots.txt file, for example, malware robots and email address harvesters.


## Problem

Attackers can modify the contents of JavaScript libraries hosted on content delivery networks (CDNs), creating vulnerabilities in all websites that use these libraries.


Suggested change:
- Attackers can modify the contents of JavaScript libraries hosted on content delivery networks (CDNs), creating vulnerabilities in all websites that use these libraries.
+ If an attacker exploited a content delivery network (CDN) and modified the contents of JavaScript libraries hosted on that CDN, it would create vulnerabilities in all websites that use those libraries.

@chrisdavidmills (Author) replied:

updated
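For context, the doc's recommended mitigation here is [Subresource Integrity](/en-US/docs/Glossary/SRI): the browser rejects a fetched file whose hash doesn't match the `integrity` attribute. A hedged sketch follows; the CDN URL is hypothetical and the hash is a placeholder you would generate from the exact file you ship, for example with `openssl dgst -sha384 -binary library.js | openssl base64 -A`:

```html
<!-- Placeholder hash: generate the real sha384 value from the file itself -->
<script
  src="https://cdn.example.com/library.js"
  integrity="sha384-REPLACE_WITH_BASE64_SHA384_HASH"
  crossorigin="anonymous"></script>
```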


### Solution

HTTP [`Strict-Transport-Security`](/en-US/docs/Web/HTTP/Headers/Strict-Transport-Security) (HSTS) is an HTTP header that notifies browsers to connect to a given site only over HTTPS, even if the originally specified scheme was HTTP. Browsers with HSTS set for a given site will automatically upgrade all requests to HTTPS. HSTS also tells browsers to treat TLS and certificate-related errors more strictly by disabling the ability to bypass the error page.


Suggested change:
- HTTP [`Strict-Transport-Security`](/en-US/docs/Web/HTTP/Headers/Strict-Transport-Security) (HSTS) is an HTTP header that notifies browsers to connect to a given site only over HTTPS, even if the originally specified scheme was HTTP. Browsers with HSTS set for a given site will automatically upgrade all requests to HTTPS. HSTS also tells browsers to treat TLS and certificate-related errors more strictly by disabling the ability to bypass the error page.
+ HTTP [`Strict-Transport-Security`](/en-US/docs/Web/HTTP/Headers/Strict-Transport-Security) (HSTS) is an HTTP header that notifies browsers to connect to a given site only over HTTPS, even if the originally specified scheme was HTTP. Browsers with HSTS set for a given site will automatically upgrade all requests to HTTPS for that site. HSTS also tells browsers to treat TLS and certificate-related errors more strictly by disabling the ability to bypass the certificate error page.

@chrisdavidmills (Author) replied:

updated

- `max-age`
- : Sets the duration, in seconds, for which browsers will redirect to HTTPS.
- `includeSubDomains` {{optional_inline}}
- : Specifies whether browsers should upgrade requests on all subdomains to HTTPS. For example, setting `includeSubDomains` on `domain.example.com` will ensure that requests to `host1.domain.example.com` and `host2.domain.example.com` are upgraded.


Suggested change:
- - : Specifies whether browsers should upgrade requests on all subdomains to HTTPS. For example, setting `includeSubDomains` on `domain.example.com` will ensure that requests to `host1.domain.example.com` and `host2.domain.example.com` are upgraded.
+ - : Specifies whether browsers should upgrade requests on all subdomains to HTTPS. For example, setting `includeSubDomains` on `domain.example.com` will ensure that requests to `host1.domain.example.com` and `host2.domain.example.com` are also upgraded in addition to `domain.example.com`.

@chrisdavidmills (Author) replied:

updated

- `includeSubDomains` {{optional_inline}}
- : Specifies whether browsers should upgrade requests on all subdomains to HTTPS. For example, setting `includeSubDomains` on `domain.example.com` will ensure that requests to `host1.domain.example.com` and `host2.domain.example.com` are upgraded.
- `preload` {{optional_inline}}
- : Specifies whether the site should be preloaded. Including this directive means your site will be included in the [HSTS preload list](https://hstspreload.org/).


Suggested change:
- - : Specifies whether the site should be preloaded. Including this directive means your site will be included in the [HSTS preload list](https://hstspreload.org/).
+ - : Specifies whether the site should be preloaded. Including this directive means your site can be included in the [HSTS preload list](https://hstspreload.org/).

@chrisdavidmills (Author) replied:

updated


1. Set a `max-age` value of at least six months (`15768000`). Longer periods, such as two years (`63072000`), are recommended. Once this value is set, the site must continue to support HTTPS until the expiry time is reached.
2. If possible, set `includeSubDomains` to improve security on all subdomains. Careful testing is needed when setting this directive because it could disable sites on subdomains that don't yet have HTTPS enabled.
3. If possible, set `preload` to include your website in the [HSTS preload list](https://hstspreload.org/). Web browsers will perform HTTPS upgrades to preloaded sites before receiving the initial `Strict-Transport-Security` header. This prevents [downgrade attacks](https://en.wikipedia.org/wiki/Downgrade_attack) upon first use and is recommended for all high-risk websites. Note that being included in the HSTS preload list also requires `includeSubDomains` to be set and `max-age` to be set to a minimum of 1 year (`31536000`).


Suggested change:
- 3. If possible, set `preload` to include your website in the [HSTS preload list](https://hstspreload.org/). Web browsers will perform HTTPS upgrades to preloaded sites before receiving the initial `Strict-Transport-Security` header. This prevents [downgrade attacks](https://en.wikipedia.org/wiki/Downgrade_attack) upon first use and is recommended for all high-risk websites. Note that being included in the HSTS preload list also requires `includeSubDomains` to be set and `max-age` to be set to a minimum of 1 year (`31536000`).
+ 3. If possible, set `preload` to make it possible to include your website in the [HSTS preload list](https://hstspreload.org/). Web browsers will perform HTTPS upgrades to preloaded sites before receiving the initial `Strict-Transport-Security` header. This prevents [downgrade attacks](https://en.wikipedia.org/wiki/Downgrade_attack) upon first use and is recommended for all high-risk websites. Note that being included in the HSTS preload list also requires `includeSubDomains` to be set and `max-age` to be set to a minimum of 1 year (`31536000`).

Just adding preload to your HSTS header doesn't add you to the preload list; you still have to submit your site via the form.

@chrisdavidmills (Author) replied:

Thanks for clarifying that; I wanted to make that clear, but the site wasn't exactly clear about what was required. I have implemented your change and updated the bullet further, to:

> If possible, set preload to make it possible to include your website in the HSTS preload list. To add it to the list, visit https://hstspreload.org/ and enter your site URL into the form at the top of the page, fixing any issues that it mentions. Web browsers will perform HTTPS upgrades to preloaded sites before receiving the initial Strict-Transport-Security header. This prevents downgrade attacks upon first use and is recommended for all high-risk websites. Note that being included in the HSTS preload list also requires includeSubDomains to be set and max-age to be set to a minimum of 1 year (31536000).

Does this sound OK?
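Putting the whole thread together, a `Strict-Transport-Security` header that follows all three recommendations (a two-year `max-age`, subdomain coverage, and preload eligibility) would look like this sketch:

```http
Strict-Transport-Security: max-age=63072000; includeSubDomains; preload
```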


{{QuickLinksWithSubpages("/en-US/docs/Web/Security")}}

Users frequently input sensitive data on websites, such as names, addresses, and banking details. As a web developer, it's crucial to protect this information from bad actors who use a wide range of exploits to steal such information and use it for personal gain. The focus of [web security](/en-US/docs/Web/Security) is to help you protect your website against these exploits and secure your users' sensitive data.


Suggested change:
- Users frequently input sensitive data on websites, such as names, addresses, and banking details. As a web developer, it's crucial to protect this information from bad actors who use a wide range of exploits to steal such information and use it for personal gain. The focus of [web security](/en-US/docs/Web/Security) is to help you protect your website against these exploits and secure your users' sensitive data.
+ Users frequently input sensitive data on websites, such as names, addresses, passwords and banking details. As a web developer, it's crucial to protect this information from bad actors who use a wide range of exploits to steal such information and use it for personal gain. The focus of [web security](/en-US/docs/Web/Security) is to help you protect your website against these exploits and secure your users' sensitive data.

@chrisdavidmills (Author) replied:

updated

@chrisdavidmills (Author) commented:

@gene1wood thanks for the review! I've fixed most things, but I have a few follow-up comments asking for clarification on some points.

| [TLS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#tls_configuration) | Medium | Medium | Yes | Use the most secure [Transport Layer Security](/en-US/docs/Glossary/TLS) (TLS) configuration available for your user base. |
| TLS: [Resource loading](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#resource_loading) | Maximum | Low | Yes | Load both passive and active resources via HTTPS. |
| TLS: [HTTP redirection](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_redirection) | Maximum | Low | Yes | Websites must redirect to HTTPS; API endpoints should disable HTTP entirely. |
| TLS: [HSTS implementation](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_strict_transport_security_implementation) | High | Low | Yes | Notify user agents to connect to sites only over HTTPS, even if the original scheme chosen was HTTP, using HTTP Strict transport security (HSTS). |

Suggested change:
- | TLS: [HSTS implementation](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_strict_transport_security_implementation) | High | Low | Yes | Notify user agents to connect to sites only over HTTPS, even if the original scheme chosen was HTTP, using HTTP Strict transport security (HSTS). |
+ | TLS: [HSTS implementation](/en-US/docs/Web/Security/Practical_implementation_guides/TLS#http_strict_transport_security_implementation) | High | Low | Yes | Notify user agents to connect to sites only over HTTPS, even if the original scheme chosen was HTTP, by using HTTP Strict transport security (HSTS). |

| [Clickjacking prevention](/en-US/docs/Web/Security/Practical_implementation_guides/Clickjacking) | High | Low | Yes | Control how your site may be framed within an {{htmlelement("iframe")}} to prevent [clickjacking](/en-US/docs/Glossary/Clickjacking). |
| [CSRF prevention](/en-US/docs/Web/Security/Practical_implementation_guides/CSRF_prevention) | High | Unknown | Varies | Protect against [Cross-site request forgery](/en-US/docs/Glossary/CSRF) (CSRF) using `SameSite` cookies and anti-CSRF tokens. |
| [Secure cookie configuration](/en-US/docs/Web/Security/Practical_implementation_guides/Cookies) | High | Medium | Yes | Set all cookies as restrictively as possible. |
| [CORP implementation](/en-US/docs/Web/Security/Practical_implementation_guides/CORP) | High | Medium | Yes | Protect against speculative side-channel attacks using Cross-Origin Resource Policy (CORP). |

Suggested change:
- | [CORP implementation](/en-US/docs/Web/Security/Practical_implementation_guides/CORP) | High | Medium | Yes | Protect against speculative side-channel attacks using Cross-Origin Resource Policy (CORP). |
+ | [CORP implementation](/en-US/docs/Web/Security/Practical_implementation_guides/CORP) | High | Medium | Yes | Protect against speculative side-channel attacks by using Cross-Origin Resource Policy (CORP). |

| [CORP implementation](/en-US/docs/Web/Security/Practical_implementation_guides/CORP) | High | Medium | Yes | Protect against speculative side-channel attacks using Cross-Origin Resource Policy (CORP). |
| [MIME type verification](/en-US/docs/Web/Security/Practical_implementation_guides/MIME_types) | Low | Low | No | Verify that all your websites are setting the proper [MIME types](/en-US/docs/Glossary/MIME_type) for all resources. |
| [CSP implementation](/en-US/docs/Web/Security/Practical_implementation_guides/CSP) | High | High | Yes | Provide fine-grained control over where site resources can be loaded from with [Content Security Policy](/en-US/docs/Glossary/CSP) (CSP). |
| [CORS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/CORS) | High | Low | Yes | Define which non-same origins are allowed to access the content of pages and have resources loaded from them with [Cross-Origin Resource Sharing](/en-US/docs/Glossary/CORS) (CORS). |

Suggested change:
- | [CORS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/CORS) | High | Low | Yes | Define which non-same origins are allowed to access the content of pages and have resources loaded from them with [Cross-Origin Resource Sharing](/en-US/docs/Glossary/CORS) (CORS). |
+ | [CORS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/CORS) | High | Low | Yes | Define the non-same origins that are allowed to access the content of pages and have resources loaded from them by using [Cross-Origin Resource Sharing](/en-US/docs/Glossary/CORS) (CORS). |

| [CORS configuration](/en-US/docs/Web/Security/Practical_implementation_guides/CORS) | High | Low | Yes | Define which non-same origins are allowed to access the content of pages and have resources loaded from them with [Cross-Origin Resource Sharing](/en-US/docs/Glossary/CORS) (CORS). |
| [Referrer policy configuration](/en-US/docs/Web/Security/Practical_implementation_guides/Referrer_policy) | Low | Low | Yes | Improve privacy for users and prevent leaking of internal URLs via the {{httpheader("Referer")}} header. |
| [robots.txt configuration](/en-US/docs/Web/Security/Practical_implementation_guides/Robots_txt) | Low | Low | No | Tell robots (such as search engine indexers) how to behave by instructing them not to crawl certain paths on the website. |
| [SRI implementation](/en-US/docs/Web/Security/Practical_implementation_guides/SRI) | Low | Low | No | Verify that fetched resources (for example, from a CDN) are delivered without unexpected manipulation using [Subresource Integrity](/en-US/docs/Glossary/SRI) (SRI). |

Suggested change:
- | [SRI implementation](/en-US/docs/Web/Security/Practical_implementation_guides/SRI) | Low | Low | No | Verify that fetched resources (for example, from a CDN) are delivered without unexpected manipulation using [Subresource Integrity](/en-US/docs/Glossary/SRI) (SRI). |
+ | [SRI implementation](/en-US/docs/Web/Security/Practical_implementation_guides/SRI) | Low | Low | No | Verify that fetched resources (for example, from a CDN) are delivered without unexpected manipulation by using [Subresource Integrity](/en-US/docs/Glossary/SRI) (SRI). |


This page lists guides that detail the best practices for implementing security features on websites. While these guides do not cover all possible security scenarios and cannot guarantee complete security of your website, following the information and best practices in these guides will make your sites significantly more secure.

## Content security fundamentals

Sorry, coming back to these headings. Currently, the H2s seem a bit non-parallel:

# Practical security implementation guides
## Content security fundamentals
## User information security

WDYT about these H2s:

# Practical security implementation guides
## Fundamentals of securing website content
## Securing user information

or

# Practical security implementation guides
## Securing website content
## Securing user information


## Solution

Use `Cross-Origin-Resource-Policy` to block [`no-cors`](/en-US/docs/Web/API/fetch#mode) cross-origin and/or cross-site requests to the given resource. Use the most restrictive value possible for your site: `same-site` or `same-origin` are recommended.

Suggested change:
- Use `Cross-Origin-Resource-Policy` to block [`no-cors`](/en-US/docs/Web/API/fetch#mode) cross-origin and/or cross-site requests to the given resource. Use the most restrictive value possible for your site: `same-site` or `same-origin` are recommended.
+ Use `Cross-Origin-Resource-Policy` to block [`no-cors`](/en-US/docs/Web/API/fetch#mode) cross-origin and/or cross-site requests to the given resource. Use the most restrictive value possible for your site; `same-origin` or `same-site` is recommended.


Use `Cross-Origin-Resource-Policy` to block [`no-cors`](/en-US/docs/Web/API/fetch#mode) cross-origin and/or cross-site requests to the given resource. Use the most restrictive value possible for your site: `same-site` or `same-origin` are recommended.

As this policy is expressed via a response header, the actual request is not prevented — rather, the browser prevents the result from being leaked by stripping the response body.

Suggested change:
- As this policy is expressed via a response header, the actual request is not prevented — rather, the browser prevents the result from being leaked by stripping the response body.
+ As this policy is expressed via a response header, the actual request is not prevented. Instead, the browser prevents the result from being leaked by stripping the response body.


## Examples

Instruct browsers to disallow cross-origin no-cors requests:

Or this?

Suggested change:
- Instruct browsers to disallow cross-origin no-cors requests:
+ Instruct browsers to disallow cross-origin requests with `no-cors` mode:
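To illustrate the recommendation, a response serving a sensitive resource might carry the most restrictive value. This is a sketch; `same-site` is the alternative when other subdomains of your site legitimately need the resource:

```http
Cross-Origin-Resource-Policy: same-origin
```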

## See also

- [`Access-Control-Allow-Origin`](/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Origin)
- [Cross-Origin Resource Sharing (CORS)](/en-US/docs/Web/HTTP/CORS)

CORS can be moved to the end of the list


Using `robots.txt` is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent pages from appearing in search engine results, it does not secure websites against attackers. In fact, it can help them: `robots.txt` is publicly accessible, and by adding your sensitive page paths to it, you are showing attackers exactly where they are.

Also be aware that some robots will ignore your `robots.txt` file, for example, malware robots and email address harvesters.

Suggested change:
- Also be aware that some robots will ignore your `robots.txt` file, for example, malware robots and email address harvesters.
+ Also be aware that some robots, such as malware robots and email address harvesters, will ignore your `robots.txt` file.


Use `robots.txt` to reduce website load and stop unsuitable content appearing in search results.

Using `robots.txt` is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent pages from appearing in search engine results, it does not secure websites against attackers. In fact, it can help them: `robots.txt` is publicly accessible, and by adding your sensitive page paths to it, you are showing attackers exactly where they are.

Suggested change:
- Using `robots.txt` is optional and sites should use it only for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent pages from appearing in search engine results, it does not secure websites against attackers. In fact, it can help them: `robots.txt` is publicly accessible, and by adding your sensitive page paths to it, you are showing attackers exactly where they are.
+ Using `robots.txt` is optional and should only be used for these purposes. It should not be used as a way to prevent the disclosure of private information or to hide portions of a website. While using this file can prevent pages from appearing in search engine results, it does not secure websites against attackers. On the contrary, it can unintentionally help them: `robots.txt` is publicly accessible, and by adding your sensitive page paths to it, you are showing their locations to potential attackers.

Labels: Content:HTML (Hypertext Markup Language docs), Content:HTTP (HTTP docs), Content:Learn:Django (Learning area Django docs), Content:Learn (Learning area docs), Content:Security (Security docs), size/xl (>1000 LoC changed)

5 participants