The approach exploits how HTTPS responses are delivered over the Transmission Control Protocol (TCP).
Read the whole story
ylk1 wrote:
Oh. No. We are f*cked.

Now, the only other, worse thing I'm expecting to hear sometime later is that Google Chrome's password auto-fill has been hacked.
Samurai Niigel wrote:
"The exploit is notable because it doesn't require a man-in-the-middle position. Instead, an end user need only encounter an innocuous-looking JavaScript file hidden in a Web advertisement or hosted directly on a Web page."

NoScript.
Aelinsaar wrote (replying to Samurai Niigel's "NoScript."):
I always saw that as a way to enforce my ad-blocking, but now I feel downright clever for being a little on the paranoid side. NoScript, Ghostery, and uBlock Origin make for a very smooth experience, if you don't mind the extremely minimal "hassle" of granting permissions in NoScript, which is pretty easy.
AVeryConcernedCitizen wrote:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs? Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?

rabish12 wrote:
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.
Galeran wrote:
So, as a web developer, if I were to include some <!-- random-length random-data comment -->, dynamically generated, in all my pages, would that help? At least in the pages that are dynamically generated; those that aren't won't contain PII anyway.
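Galeran's idea can be sketched in a few lines (Python used purely for illustration; `padding_comment` and its length bounds are invented for this sketch). One caveat worth flagging: random padding shifts the compressed size by a random amount per load, so it adds noise an attacker must average away rather than closing the channel outright.

```python
import secrets
import string

def padding_comment(min_len: int = 32, max_len: int = 256) -> str:
    """Return an HTML comment of random length filled with random data."""
    length = min_len + secrets.randbelow(max_len - min_len + 1)
    alphabet = string.ascii_letters + string.digits
    filler = "".join(secrets.choice(alphabet) for _ in range(length))
    return f"<!-- {filler} -->"

# A dynamically generated page would embed a fresh comment on every render,
# e.g. html = head + padding_comment() + body
comment = padding_comment()
```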
rabish12 wrote (replying to AVeryConcernedCitizen):
As I understand it, the APIs are tools to fetch information more easily (important functionality for developers) and to measure the size and timing of resource retrieval (also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least as I understand it) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that's returned, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.
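The size oracle rabish12 describes can be reproduced offline with nothing but zlib. The page body below is made up for illustration, but it shows the one bit of information a HEIST-style attack extracts per guess: a reflected guess that matches a secret elsewhere in the body compresses to fewer bytes.

```python
import zlib

# Hypothetical response body: a secret token plus a reflected query parameter.
SECRET = b"csrftoken=ag8vnq2u"

def page_with(reflected: bytes) -> bytes:
    return b"<html><body>" + SECRET + b" q=" + reflected + b"</body></html>"

def wire_size(body: bytes) -> int:
    # Compressed length: roughly what an observer of the response could measure.
    return len(zlib.compress(body, 9))

# The guess sharing a longer prefix with the secret compresses better,
# because DEFLATE back-references the repeated substring.
right = wire_size(page_with(b"csrftoken=ag8vn"))  # matches the secret's prefix
wrong = wire_size(page_with(b"csrftoken=zq9wk"))  # diverges after "csrftoken="
```

Here `right` comes out smaller than `wrong`; the real attack measures the equivalent quantity from inside the browser via the Fetch and Resource Timing APIs instead of on the wire.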
DaveSimmons wrote (quoting rabish12's explanation above):
Only developers doing page optimization or site health checks need this functionality, so these functions could be disabled unless a developer command-line flag or menu item was toggled on.

rabish12 wrote:
For the Resource Timing API, that's true. The point I was trying to make was more that the API has exposed an underlying problem rather than being a problem in and of itself, so there's not really anything to "fix" in the API - it's doing what it's meant to do, and it's a flaw on the HTTP/HTTPS side of things that really makes the exploit possible. Basically, that flaw should be fixed properly rather than permanently disabling the API functionality as a stopgap measure around it.
AVeryConcernedCitizen wrote (quoting rabish12's explanation above):
I certainly agree that the root cause here is HTTP/TCP, yet the path of least resistance wouldn't be to change HTTP or TCP (e.g. the BREACH exploit has been around for years with no fix in sight).

rabish12 wrote:
It is, yes. My problem is mostly that it's a bit of a gaffer-tape fix that covers the underlying problem up rather than fixing it. Blocking the APIs temporarily might make sense, but the issue that makes them a problem in the first place really needs fixing.

AVeryConcernedCitizen wrote:
I don't agree that APIs don't have gaps or exploits. APIs can and should enforce policies and restrictions that provide a certain level of security and privacy. There may or may not be a way to build limitations into these APIs to prevent this from happening, but it would be pretty negligent not to explore this possibility. An outright block of the APIs may be unnecessary at this time, but it shouldn't be off the table, especially if this exploit gains steam.

rabish12 wrote:
I didn't say that APIs don't have gaps or exploits. I said that this particular attack isn't relying on gaps or exploits in the APIs. It's relying on the fact that the APIs do literally the exact thing they're meant to do, and do it correctly.
NorthGuy wrote (quoting rabish12's explanation above):
Are you insane? If the APIs allow functionality that can be exploited in such a catastrophic way, then they should be removed. Do you remember ActiveX at all?

rabish12 wrote:
You either didn't read my post, didn't understand it, or have absolutely no idea why ActiveX was so awful and exploitable.
DaveSimmons wrote (quoting rabish12's explanation above):
Only developers doing page optimization or site health checks need this functionality, so these functions could be disabled unless a developer command-line flag or menu item was toggled on.

AVeryConcernedCitizen wrote:
Interesting. Is there functionality today that requires a toggle to be used? If so, I'm assuming these toggles aren't easily turned on (to prevent a third party from turning on "developer" mode and then using the APIs)?

rabish12 wrote:
There are a lot of things in most browsers that need to be toggled in order to be used, and they generally have to be toggled through the browser menus or interface so that attackers have no way of accessing them. It's actually the standard way that new standards features are introduced now: a flag somewhere in the browser enables them until the standard has been accepted and the feature is ready for full release.
AVeryConcernedCitizen wrote:
You're probably spot on with this then, assuming there's really no major use case outside of optimization and/or health checks. Make these APIs inoperable by default.

rabish12 wrote:
I could actually imagine a few uses that would require having the API working in a user's browser (mainly to do with broader gathering of performance metrics), but they aren't exactly pressing. My issue is still just with the idea of disabling an API as the ultimate means of dealing with a problem that it exposes but isn't responsible for; but, as you already mentioned, progress on changes to HTTP/HTTPS is really slow, so I suppose that's not the most practical way to look at it.
Elgonn wrote (replying to Galeran's random-comment suggestion):
The issue with this is that it doesn't also modify traffic generated by the client. You'd need to have random information included in the GET/POST data as well.
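Elgonn's point - that server-side comment padding leaves client requests unpadded - could be addressed the same way on the request path. A sketch (the `_pad` field name is invented for this example; a cooperating server would simply ignore it):

```python
import secrets

def pad_form_data(fields: dict) -> dict:
    """Copy the form fields and add a dummy field of random length,
    so otherwise identical submissions vary in size on the wire."""
    pad_len = 16 + secrets.randbelow(113)  # 16..128 characters
    padded = dict(fields)
    padded["_pad"] = secrets.token_hex(64)[:pad_len]
    return padded

data = pad_form_data({"user": "alice", "token": "ag8vnq2u"})
```

As with response padding, this only adds noise: an attacker who can trigger many submissions can still average it out.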
tgbyhn wrote (replying to Aelinsaar's NoScript recommendation):
How does NoScript compare to uMatrix? Do they largely do the same thing?
isparavanje wrote (replying to ylk1):
While it is a huge vulnerability that is likely to affect many people, I doubt many of us geeks would be affected much, since it relies on third-party scripts and I think most computer-literate folks use some kind of ad blocker.
aaronb1138 wrote (replying to Elgonn):
Some random, high-entropy pad information would actually be really easy to add via client and server software updates.

I do find it very funny that I'm hearing about a side-channel attack with an extreme amount of overlap with how advertising content operates, on a website which has begged its readers for advertising whitelisting and then dumped stuff like the LG TV ad on them a couple of years later.

You're literally telling me that I should continue NOT to unblock anyone's ad content if I want to be responsible and safe in my browsing habits. Thanks for the confirmation. I didn't think it was an accident that I managed nearly 20 years of Internet use without getting malware or viruses, with no AV and a tendency not to update browsers, Java, or Flash continuously, all while downloading plenty of content of questionable provenance. What has protected me? Good firewall policies and judicious ad-blocking (possibly also ALWAYS showing file extensions). Heck, it's the main reason I used Opera all the way back in the '90s (plus dial-up - I had to cut some bandwidth-sucking crap out).
Lavonheim wrote (replying to tgbyhn):
No - uMatrix is like RequestPolicy and NoScript on steroids. NoScript only lets you enable/disable JavaScript on specific (sub)domains; RequestPolicy only lets you enable/disable cross-domain resources based on source and destination. uMatrix lets you enable/disable per resource category (JavaScript/CSS/images/AJAX/iframes/etc.) per source (sub)domain per destination (sub)domain.

There's also uBlock Origin, which is another ad blocker.

That said, I haven't tried converting Firefox on my main computer to use uMatrix; I only have it on my personal and work laptops.
Rene Gollent wrote:
Out of curiosity, since I didn't see it mentioned in the article: is there any particular reason this can't just be mitigated the same way CRIME was, i.e. by simply disabling compression?

rabish12 wrote:
I can't see any reason why it couldn't be. Expecting web hosts to actually do it isn't a very safe bet, though.
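The CRIME-style mitigation Rene Gollent asks about is easy to sanity-check offline (the page body below is invented for illustration): with compression off, two equal-length guesses produce identical response sizes and the oracle vanishes; with DEFLATE on, the matching guess measurably shrinks the response.

```python
import zlib

SECRET = b"csrftoken=ag8vnq2u"

def page_with(reflected: bytes) -> bytes:
    # Hypothetical page reflecting attacker-controlled input next to a secret.
    return b"<html><body>" + SECRET + b" q=" + reflected + b"</body></html>"

right = page_with(b"csrftoken=ag8vn")  # guess matches the secret's prefix
wrong = page_with(b"csrftoken=zq9wk")  # guess diverges after "csrftoken="

# Compression disabled: equal-length guesses give equal-length responses.
plain_leak = len(right) - len(wrong)

# Compression enabled: the matching guess back-references the secret and shrinks.
comp_leak = len(zlib.compress(wrong, 9)) - len(zlib.compress(right, 9))
```

The trade-off, as rabish12 implies, is that hosts have to give up the bandwidth savings of compression, which is why uptake of this fix has been poor.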