Auto-create missing CDN resources on first load #131
Wouldn't they need to expire somehow to avoid loading vulnerable libraries at a later time? |
@L-a-n-g-o-l-i-e-r-s isn't the URL unique to a particular version of the script? |
@gjedeer no idea. |
@gjedeer If the dev did his job properly yes :) |
I mean, at all CDNs I know URLs are like https://somethingsomething.com/jquery-3.2.1.js or /jquery/3.3.3/jquery.js - the URL content is immutable. If you want to use later jQuery you change the link in your index.html, the CDN doesn't update the JS file. That's true as long as we're talking about public CDNs hosting well known libraries (like ajax.googleapis.com), not private CDNs controlled by the website owners (like s3.amazonaws.com). Any counterexamples? |
@gjedeer jsDelivr is a counterexample due to their version aliasing feature. More information about that is available here: https://github.com/jsdelivr/jsdelivr#version-aliasing |
Damn.
|
Decentraleyes currently has to be manually updated to handle any cases like @austinhartzheim found. If the feature request by @mwarrenus is implemented, a "reload custom resources" item could be added to Decentraleyes to update all resources... even those handled natively by Decentraleyes. |
In the original request, @mwarrenus suggested that the add-on would request the resource from the CDN the first time and then save it locally.
So the add-on would basically first check whether the requested resource has a unique (known) name; if it does not, it would check the hash instead. If the hash matches, the resource would be loaded locally. |
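A minimal sketch of that flow, under the assumptions above; every `declare`d helper (resourceNameFromUrl, hasLocalResource, loadLocal, storeLocal, getKnownHash, toHex) is hypothetical glue for illustration, not an existing Decentraleyes API:

```typescript
// Hypothetical sketch of the proposed lookup flow. All "declare"d helpers are
// assumptions for illustration, not part of Decentraleyes.
declare function resourceNameFromUrl(url: string): string;                   // assumed: URL -> canonical name
declare function hasLocalResource(name: string): Promise<boolean>;           // assumed: bundled/cached check
declare function loadLocal(name: string): Promise<Response>;                 // assumed
declare function storeLocal(name: string, body: ArrayBuffer): Promise<void>; // assumed
declare function getKnownHash(name: string): Promise<string>;                // assumed: trusted hex SHA-256
declare function toHex(buffer: ArrayBuffer): string;                         // assumed

async function resolveResource(url: string): Promise<Response> {
  const name = resourceNameFromUrl(url);

  // 1. Already shipped with the add-on or cached under a unique name? Serve it locally.
  if (await hasLocalResource(name)) {
    return loadLocal(name);
  }

  // 2. Otherwise fetch it from the CDN this one time.
  const response = await fetch(url);
  const body = await response.arrayBuffer();

  // 3. Keep a local copy only if its hash matches the known-good checksum.
  const digest = await crypto.subtle.digest("SHA-256", body);
  if (toHex(digest) === (await getKnownHash(name))) {
    await storeLocal(name, body);
  }
  return new Response(body);
}
```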
I think some special security considerations would have to be made here: if this is implemented and a man-in-the-middle or a hacked site returns malicious code the first time a resource is accessed, permanently caching that bad code would make the problem very difficult to troubleshoot for the casual user. For this reason, I think it's much safer for all installations to have the same libraries, updated centrally using a well-vetted process, as is currently the case. If this proposal is implemented anyway, at least consider doing the following:
|
@RoxKilly MitM is a concern, but I don't think a hacked site can do anything malicious in the context of Decentraleyes. If it links to a script from a well-known CDN, we can reasonably trust that the script is safe. If it links to a malicious script, it's most likely not hosted at an open source CDN. Unless you're talking about a scenario where one of the CDNs is hacked, in which case, yes, the proposed approach would be problematic. |
I agree with you. Yes, I was talking about a compromised CDN serving a bad file. Given what you brought up though, maybe there is no reason to require that the fetching page (Website.com in my example) be an HTTPS page, as I originally suggested. |
Checksums for the top 99% of libraries can be stored on a (theoretically) secure server. |
This secure server would be very vulnerable to any hacker attack because it would allow anyone to compromise thousands of users. |
@heubergen Not quite. That would potentially make it a target, but would not necessarily make it vulnerable. All servers are potentially targets. But in this case, it would require breaching two separate servers simultaneously: the theoretical server mentioned above, plus the server containing the actual resource. It's probably more likely that the server storing Decentraleyes itself would be breached. |
@Gitoffthelawn |
@heubergen The rest of your post consisted of well-formed English, so I was unaware that it was not your most proficient language. :) English is a funny language because words that look like they might have similar meanings often do not. Anyway, your point is well taken, and the system would definitely have to accommodate that case. BTW, you are also welcome to add "Please excuse my imperfect use of the English language." at the top of a post. :) |
@Gitoffthelawn could that secure server be github.com? That way anyone could see what checksums are pulled. Additionally, the list could be updated, reviewed, and curated by smart contributors. |
@jawz101 Yes, to my knowledge that would work well. I can't absolutely guarantee it will work, but it is what I recommend trying. |
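To make the checksum-server idea above concrete, here is a rough sketch; the host checksums.example.org and the hashes.json layout are invented for illustration, and nothing here reflects how Decentraleyes actually works:

```typescript
// Illustrative only: the checksum list and the resource come from two
// different servers, so an attacker would have to compromise both for a
// tampered file to be accepted. "checksums.example.org" is a made-up host.

async function sha256Hex(data: ArrayBuffer): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", data);
  return Array.from(new Uint8Array(digest))
    .map((byte) => byte.toString(16).padStart(2, "0"))
    .join("");
}

async function fetchIfChecksumMatches(cdnUrl: string, name: string): Promise<ArrayBuffer | null> {
  // Known-good hashes, e.g. pulled from a curated list hosted on GitHub.
  const expected: Record<string, string> =
    await (await fetch("https://checksums.example.org/hashes.json")).json();

  const body = await (await fetch(cdnUrl)).arrayBuffer();
  return (await sha256Hex(body)) === expected[name] ? body : null;
}
```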
I think the risk of a compromised library being cached is a rather rare scenario. I think a compromise could be allowing Decentraleyes maintainers to invalidate cached files, so if a library was found to have been tampered with after some caches were made, the maintainers could tell the clients to destroy the copy and re-pull it next time. This mechanism isn't really open to abuse either, since it only allows invalidation. |
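A sketch of what such an invalidation channel could look like; the endpoint URL and the cache helpers are assumptions, and the key property is that the list can only cause cached copies to be dropped, never push new code to clients:

```typescript
// Hypothetical invalidation check. The URL and the cache helpers are made up;
// the mechanism can only remove cached entries, so the next request simply
// falls back to the CDN and re-caches a fresh copy.

interface InvalidationList {
  revokedNames: string[]; // e.g. ["jquery/3.2.1/jquery.min.js"]
}

declare function listCachedNames(): Promise<string[]>;     // assumed cache helper
declare function deleteLocal(name: string): Promise<void>; // assumed cache helper

async function applyInvalidations(): Promise<void> {
  const response = await fetch("https://example.org/decentraleyes/invalidations.json");
  const list: InvalidationList = await response.json();

  for (const name of await listCachedNames()) {
    if (list.revokedNames.includes(name)) {
      await deleteLocal(name);
    }
  }
}
```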
On the same note, I wonder if there would be any need to review what the libraries actually do. Like instead of blindly hosting their code offer a drop-in replacement with tracking code stripped out (if we came across something like that.) |
Sure, but it adds a whole new dimension of things to care about and that could go wrong. Make generic CDN replacement stable first. There is already enough that could go wrong with different versions of a lib; adding a bunch of different modifications on top of each version makes it even harder to find the right one. |
Could resources be created automatically by requesting the missing library from the CDN the first time? It seems no worse from a privacy perspective the first time; subsequent requests become private and speedier.
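As a rough sketch of that request, using the standard Cache API available to extension and page scripts; the cache name is arbitrary and this is not how Decentraleyes is implemented:

```typescript
// Sketch only: the first load goes to the CDN once, every later request is
// served from the local cache. "cdn-cache" is an arbitrary name.

async function fetchWithFirstLoadCache(url: string): Promise<Response> {
  const cache = await caches.open("cdn-cache");

  // Subsequent requests: answered locally, no contact with the CDN.
  const cached = await cache.match(url);
  if (cached) {
    return cached;
  }

  // First load: fetch from the CDN, then keep a local copy.
  const response = await fetch(url);
  if (response.ok) {
    await cache.put(url, response.clone());
  }
  return response;
}
```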