82

On Friday I had a working dev environment. On Monday I had a broken one. I encountered this error message in the Chrome dev-tools console for all my assets:

Access to CSS stylesheet at 'http://localhost:8080/build/app.css' from origin 'http://example.com' has been blocked by CORS policy: The request client is not a secure context and the resource is in more-private address space local.

Is there any quick fix for this? I tried setting Access-Control-Allow-Origin in my webpack devServer.headers config, to no avail:

config.devServer.headers = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Headers': 'Origin, X-Requested-With, Content-Type, Accept'
}

9 Answers

110

Original Answer

I finally found the answer in this RFC about CORS-RFC1918 from a Chrome team member. To sum it up, Chrome has implemented CORS-RFC1918, which prevents public-network resources from requesting private-network resources - unless the public-network resource is secure (HTTPS) and the private-network resource provides appropriate (yet-undefined) CORS headers.

There's also a Chrome flag you can change to disable the new behavior for now: chrome://flags/#block-insecure-private-network-requests

Disabling that flag does mean you're re-opening the security hole that Chrome's new behavior is meant to close.


Update 2021: A few months after I posted this question, the flag I referenced in my original answer was removed, and instead of disabling a security feature I was forced to solve the problem more satisfactorily by serving assets over HTTPS.

Update 2022: Chrome 98 is out, and it introduces support for preflight requests. According to the announcement, failed requests are supposed to produce a warning and have no other effect, but in my case they are full errors that break my development sites. So I had to add middleware to teach webpack-dev-server how to serve preflight requests.

Private Network Access (formerly CORS-RFC1918) is a specification that forbids requests from less-private network resources to more-private network resources - like HTTP to HTTPS, or a remote host to localhost.
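To make the mechanics concrete - this is a sketch only, based on the Chrome announcement and using the example.dev host from the config below - the preflight Chrome sends to the more-private resource and the response it expects look roughly like this:

OPTIONS /build/app.css HTTP/1.1
Host: localhost:8080
Origin: https://example.dev
Access-Control-Request-Private-Network: true

HTTP/1.1 200 OK
Access-Control-Allow-Origin: https://example.dev
Access-Control-Allow-Private-Network: true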

The ultimate solution was to add a self-signed certificate and middleware which enabled requests from my remote dev server to my localhost webpack-dev-server for assets.

Generate certificates

cd path/to/.ssl
npx mkcert create-ca
npx mkcert create-cert

Configure webpack-dev-server to use certificates and serve preflight requests

const { readFileSync } = require("fs")

module.exports = {
  //...
  devServer: {
    https: {
      key: readFileSync("./.ssl/cert.key"),
      cert: readFileSync("./.ssl/cert.crt"),
      cacert: readFileSync("./.ssl/ca.crt"),
    },

    allowedHosts: [".example.dev"], // should match the host in the origin check below
    setupMiddlewares(middlewares, devServer) {
      // Serve OPTIONS (preflight) requests
      devServer.app.options('*', (req, res) => {
        // Only serve if the request has the expected origin header
        if (/^https:\/\/example\.dev$/.test(req.headers.origin)) {
          res.set({
            "Access-Control-Allow-Credentials": "true",
            "Access-Control-Allow-Private-Network": "true",
            // Using * results in an error if the request includes credentials
            "Access-Control-Allow-Origin": req.headers.origin,
          })

          res.sendStatus(200)
        }
      })

      return middlewares
    }
  }
}

Trust certificates

  1. Right click ca.crt in Windows Explorer and select Install Certificate
  2. Select Current User.
  3. Choose Place all certificates in the following store, then Browse..., and select Trusted Root Certification Authorities.
  4. Finish.

Firefox-specific instructions

Firefox doesn't respect your authoritah by default! Configure it to do so with these steps:

  1. Type about:config into the address bar
  2. Search for security.enterprise_roots.enabled
  3. Toggle the setting to true
12
  • 1
I would love to see the exact rules for this. I got hit by this too, but the "private" server was the web server including the resource (it was on a publicly-allocated IP block but not externally routable), and the resource was a bootstrap.js hosted on Cloudflare. My understanding is that it should block resources loaded from "more private" endpoints, and I hardly see how Cloudflare could be considered more private in this regard. Could it be considering the proxy address rather than the DNS resolution for the target? Commented Sep 22, 2021 at 7:57
  • 1
    Is your private server http and cloudflare https?
    – tvanc
    Commented Sep 22, 2021 at 15:25
  • Yes indeed, and neither is under my control... The error message lacks clarity imho, so apparently they consider an HTTPS connection more private than an HTTP connection. Is that consideration taking priority over private vs. public IP addresses though? Remember my "private" host is still using a public IP block, just not routable externally. Those are two valid yet different definitions of "private". Commented Sep 25, 2021 at 17:41
  • 1
A self-signed cert is no solution, the browser does not accept those out of the box. A dummy extranet-domain cert (via some domain on the Internet re-used for the extranet server) is no solution, the extranet server has a (very fixed, very hardcoded) IP (only accessible via VPN). Creating my own CA in the browser is even more insecure, as I then have to protect this CA key, which is complete overkill for such a simple problem. Hence the extranet resource must stay http://IP. But it still must be able to access some well-defined http://127.* (which are secured against abuse). Checkmate?
    – Tino
    Commented Oct 4, 2021 at 13:25
  • @tino regarding self-signed certs, in Windows you can right click a .crt file, select Install Certificate, and install it to Trusted Root Certification Authorities. In Firefox, you then have to go to about:config and set security.enterprise_roots.enabled to true. All other browsers accept your trusted authorities automatically in my experience. You may have to restart.
    – tvanc
    Commented Oct 4, 2021 at 17:13
67

Just a Chrome client-side way to ignore this warning and make assets accessible:

1: Go to chrome://flags/#block-insecure-private-network-requests

2: Set Block insecure private network requests to Disabled

Note: this is only fine when you're on your own computer or in your dev environment


7
  • Does this expose any real security risk?
    – Arnold Roa
    Commented Mar 20, 2022 at 11:26
  • 4
2022: doesn't work anymore :/
    – strix25
    Commented Apr 13, 2022 at 15:15
  • 3
    28-Sep-2022 still it is working Commented Sep 28, 2022 at 9:27
  • 4
    2-Dec-2022 -> Not Working -> Chrome Version: 108.0.5359.72 -> Temporary work around stackoverflow.com/a/74651311/1019435 Commented Dec 2, 2022 at 5:36
  • 6
    jan-2023 and still working Commented Jan 11, 2023 at 1:35
9

I was able to allow requests from localhost to localhost by setting one new server header on both preflight and regular requests:

Access-Control-Allow-Private-Network: true

Source:
https://web.dev/cors-rfc1918-feedback/#step-2:-sending-preflight-requests-with-a-special-header
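Where you set the header depends on your server. As a minimal sketch only - assuming an Express-style app (for instance the one webpack-dev-server exposes), which is an assumption on my part rather than something this answer specifies - it could look like this:

// Sketch: add the header to every response and answer preflights directly.
// "app" is an assumed Express application; adapt this to your own server.
app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Private-Network', 'true')
  if (req.method === 'OPTIONS') {
    return res.sendStatus(204) // answer the preflight with no body
  }
  next()
})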

2
  • 31
    Showing how or where you set this header would make this answer more useful.
    – lowcrawler
    Commented Apr 28, 2022 at 4:51
I have already added this, but it does not work. Probably because the external site (which I do not control) is HTTP only, so Chrome does not even do a preflight in my case. (The problem is not the script but the site script injected by some extension. So I have to route the request via background scripts. Sigh.)
    – Tino
    Commented Dec 4, 2023 at 12:03
2

While it is a good thing that Chrome now protects users from cross-site request forgery (CSRF) attacks targeting routers and other devices on private networks, it also means that legitimate applications, namely business applications, that rely on cross-site requests to resources on private networks are negatively affected and need to be changed. In my company, we maintain a web application that is exposed publicly through HTTPS and calls a web service on label printers on the client's private network. We ended up developing a proxy that accepts web service requests on a public and secure endpoint and forwards them to the web service on the private network. We are now making this proxy available for others to use: https://p2prox.io/
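As a rough sketch of the general idea only (not of how p2prox itself works - the packages, addresses, and certificate paths below are assumptions): a Node/Express proxy that terminates TLS on a public endpoint and forwards to a device on the private network could look like this, provided the proxy host can actually reach that device:

const { readFileSync } = require('fs')
const https = require('https')
const express = require('express')
const { createProxyMiddleware } = require('http-proxy-middleware')

const app = express()

// Forward /printer/* to the label printer's web service on the private network.
// A real proxy would also need to send the CORS headers the web app's origin requires.
app.use('/printer', createProxyMiddleware({
  target: 'http://192.168.1.50', // example printer address, an assumption
  changeOrigin: true,
}))

// Terminate TLS so browsers treat the endpoint as a secure context.
https.createServer({
  key: readFileSync('./cert.key'),
  cert: readFileSync('./cert.crt'),
}, app).listen(443)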

1
  • As it’s currently written, your answer is unclear. Please edit to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers in the help center.
    – Community Bot
    Commented May 24, 2022 at 1:32
2

December 2022 update

I think they renamed the flag ...

  • Try chrome://flags/#allow-insecure-localhost
  • Set to Enabled
  • Restart Chrome
1

Just came across this subject, since I had the same problem with a web server instance in our local network. This is not necessarily a complex problem. It may happen, e.g., if you include JavaScript libraries from public resources, such as vue.js or node.js. To avoid this in a local network, store a copy of the library on your local server and reference it in your web pages. E.g. instead of using:

<script src="https://cdnjs.cloudflare.com/ajax/libs/vue/3.2.31/vue.global.min.js"></script>

use

<script src="./lib/vue.global.min.js"></script>
1

Temporary workaround.

In a Windows command prompt, run the command below and restart Chrome.

reg add HKEY_CURRENT_USER\SOFTWARE\Policies\Google\Chrome /t REG_DWORD /v InsecurePrivateNetworkRequestsAllowed /d 1 /f

The above command creates an InsecurePrivateNetworkRequestsAllowed policy entry under HKEY_CURRENT_USER\SOFTWARE\Policies\Google\Chrome in the Windows registry.

Reference:

https://developer.chrome.com/blog/private-network-access-update/

https://chromeenterprise.google/policies/#InsecurePrivateNetworkRequestsAllowed

0

Access to CSS stylesheet at 'http://sub.domain.com/font/Sahel.css' from origin 'http://sub.domain.com' has been blocked by CORS policy: The request client is not a secure context and the resource is in more-private address space private.

I'm developing a web-based system for the company I work for, and we have set up the DNS and domain to access the system locally while we are inside the company and through the internet while we are not there. No HTTPS certificate was ever installed.

The quote above shows up from time to time and treats the same domain as both the more-private and the less-private one! And it gets fixed by Ctrl + F5. So ridiculous!

In my case, adding a dynamic version using ?v=time() at the end of ALL OF MY LOCAL LINKS fixed my problem, but it costs downloading all scripts, CSS, and fonts every time the user loads the page!

0

As of April 2024

A new W3C draft proposes a new setting in the fetch options named targetAddressSpace.

It can be used to allow mixed-content requests from public to private address spaces, like this:

fetch("http://router.local/ping", {
    targetAddressSpace: "private",
});

Currently it is only supported in Chrome, starting with version 124.
