chore(docs): Change "whitelist" to "allow list" (#30756)
LekoArts committed Apr 8, 2021
1 parent 81ec270 commit e0df4cc
Showing 3 changed files with 7 additions and 7 deletions.
10 changes: 5 additions & 5 deletions docs/docs/resource-handling-and-service-workers.md
@@ -27,7 +27,7 @@ It's important that this array is accurate - if a page entered the array when it
When all resources for a page have been successfully prefetched, we do _one_ of the following:

- Add the page's path to a temporary array of prefetched paths, if the service worker has not yet installed
-- Send a message to the service worker to let it know to whitelist the page's path, if it is installed
+- Send a message to the service worker to let it know to allow the page's path, if it is installed
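
The two cases above can be sketched as follows. All names here (`prefetchedPathsQueue`, the message shape) are illustrative, not Gatsby's actual internals:

```javascript
// Hypothetical sketch of the two cases above; names and the message
// shape are illustrative, not Gatsby's actual internals.
const prefetchedPathsQueue = []

function onPagePrefetched(path, serviceWorkerInstalled, postMessageToSW) {
  if (!serviceWorkerInstalled) {
    // No service worker yet: remember the path so it can be allowed later.
    prefetchedPathsQueue.push(path)
  } else {
    // Service worker is installed: tell it to allow this path.
    postMessageToSW({ type: "ALLOW_PATH", path })
  }
}
```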

Upon initial install, we do the following:

@@ -36,7 +36,7 @@ Upon initial install, we do the following:

Note that in both of the above cases, all these files should have already been downloaded once by the browser, so with [proper HTTP caching setup](/docs/caching/) we don't have to download any of the files again. However, one exception to this is `<style>` elements with a `data-href` attribute (indicating that the embedded stylesheet is the same as the stylesheet at the location specified) - we currently fetch the specified file rather than caching the contents of the element.
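
The `data-href` exception can be sketched as a small helper. This is a hypothetical function, not Gatsby's actual code, that yields the URLs to fetch instead of caching the elements' inline contents:

```javascript
// Hypothetical helper, not Gatsby's actual implementation: collect the
// URLs that must be refetched for inline <style data-href="..."> elements,
// whose embedded contents mirror an external stylesheet.
function stylesheetUrlsToRefetch(styleElements) {
  return styleElements
    .filter(el => el.dataHref) // only styles mirroring an external file
    .map(el => el.dataHref)
}
```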

-Another current problem is that we may start fetching the resources for a page before the service worker has finished installing, but finish fetching them all after it has installed - this could cause a page's path to be whitelisted even if some of its resources haven't been cached (since Gatsby assumes the service worker was installed at the start of fetching resources, if it was installed at the end).
+Another current problem is that we may start fetching the resources for a page before the service worker has finished installing, but finish fetching them all after it has installed - this could cause a page's path to be allowed even if some of its resources haven't been cached (since Gatsby assumes the service worker was installed at the start of fetching resources, if it was installed at the end).
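
One conceivable guard against this race - a sketch under assumed names, not something Gatsby currently does - is to check the installation state both when fetching starts and when it finishes:

```javascript
// Sketch only: a path is safe to allow if the service worker was already
// installed when prefetching began (so every fetch went through its cache),
// not merely when prefetching finished.
function safeToAllowPath(installedAtFetchStart, installedAtFetchEnd) {
  return installedAtFetchStart && installedAtFetchEnd
}
```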

## Gatsby Core

@@ -72,8 +72,8 @@ The following are some invalid reasons why we might not have resources, i.e. thi

### Service worker update handling

-The service worker updates automatically when the browser detects that the contents of the `sw.js` file have changed from the currently installed version. Upon an update, we clear all whitelisted paths to prevent old pages from loading after the update.
+The service worker updates automatically when the browser detects that the contents of the `sw.js` file have changed from the currently installed version. Upon an update, we clear all allowed paths to prevent old pages from loading after the update.
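
The clearing step amounts to simple bookkeeping, sketched here with hypothetical names rather than Gatsby's actual `sw.js` internals:

```javascript
// Hypothetical bookkeeping, not Gatsby's actual sw.js: the set of allowed
// paths is cleared whenever a new service worker version takes over.
const allowedPaths = new Set()

function allowPath(path) {
  allowedPaths.add(path)
}

function onServiceWorkerUpdated() {
  // Resources cached for the old build may not match the new one,
  // so forget every previously allowed path.
  allowedPaths.clear()
}
```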

-Blank pages can theoretically occur if we serve the app shell when resources are unavailable - however, this _should_ never occur since we only serve the app shell with whitelisted paths (i.e. ones whose resources have been cached entirely). There may be some edge cases when this can occur, e.g. when the webpack runtime from the old site attempts to load a chunk which is unavailable on the updated site - we are currently investigating ways to prevent this, and make using service workers with Gatsby even more robust.
+Blank pages can theoretically occur if we serve the app shell when resources are unavailable - however, this _should_ never occur since we only serve the app shell with allowed paths (i.e. ones whose resources have been cached entirely). There may be some edge cases when this can occur, e.g. when the webpack runtime from the old site attempts to load a chunk which is unavailable on the updated site - we are currently investigating ways to prevent this, and make using service workers with Gatsby even more robust.

-We should also never get incorrect 404 pages following a site update, since we never whitelist 404 pages to serve using the offline shell, meaning that a page which was previously a 404 should always load from the server. If it's no longer a 404, then it will be cached as usual.
+We should also never get incorrect 404 pages following a site update, since we never allow 404 pages to serve using the offline shell, meaning that a page which was previously a 404 should always load from the server. If it's no longer a 404, then it will be cached as usual.
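
Taken together, the navigation decision can be sketched as follows (hypothetical names, not the actual `sw.js` logic):

```javascript
// Hypothetical navigation handler, not the actual sw.js logic: the app
// shell is only served for fully cached (allowed) paths; everything else,
// including former 404s, goes to the server.
function respondToNavigation(path, allowedPaths) {
  return allowedPaths.has(path) ? "serve-app-shell" : "fetch-from-server"
}
```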
2 changes: 1 addition & 1 deletion examples/simple-auth/README.md
@@ -5,7 +5,7 @@ This is a simplified demo to show how an authentication workflow is implemented
The short version is:

- Gatsby statically renders all unauthenticated routes as usual
-- Authenticated routes are whitelisted as client-only
+- Authenticated routes are allowed as client-only
- Logged out users are redirected to the login page if they attempt to visit private routes
- Logged in users will see their private content

Expand Up @@ -441,7 +441,7 @@ ${getLowerRequestConcurrencyOptionMessage()}`,
id: CODES.RequestDenied,
context: {
sourceMessage: formatLogMessage(
-`${e.message}\n\nThe GraphQL request was forbidden.\nIf you are using a security plugin like WordFence or a server firewall you may need to whitelist your IP address or adjust your firewall settings for your GraphQL endpoint.\n\n${errorContext}`
+`${e.message}\n\nThe GraphQL request was forbidden.\nIf you are using a security plugin like WordFence or a server firewall you may need to add your IP address to the allow list or adjust your firewall settings for your GraphQL endpoint.\n\n${errorContext}`
),
},
})
