New attack steals SSNs, e-mail addresses, and more from HTTPS pages


isparavanje

Ars Tribunus Angusticlavius
9,188
ylk1 said:
Oh. No. We are f*cked.
The only other, even worse thing I'm expecting to hear at some point is that Google Chrome's password autofill has been hacked.

While it is a huge vulnerability that is likely to affect many people, I doubt many of us geeks would be affected too much, since it relies on third-party scripts and I think most computer-literate folks use some kind of ad blocker.
 
Upvote
28 (31 / -3)

Aelinsaar

Ars Tribunus Militum
1,522
Samurai Niigel said:
"The exploit is notable because it doesn't require a man-in-the-middle position. Instead, an end user need only encounter an innocuous-looking JavaScript file hidden in a Web advertisement or hosted directly on a Web page."

NoScript.

I always saw that as a way to enforce my ad-blocking, but now I feel downright clever for being a little on the paranoid side. NoScript, Ghostery, and uBlock Origin make for a very smooth experience, if you don't mind the minimal "hassle" of managing permissions in NoScript, which is pretty easy.
 
Upvote
31 (31 / 0)

fic

Ars Praefectus
3,544
Aelinsaar said:
Samurai Niigel said:
"The exploit is notable because it doesn't require a man-in-the-middle position. Instead, an end user need only encounter an innocuous-looking JavaScript file hidden in a Web advertisement or hosted directly on a Web page."

NoScript.

I always saw that as a way to enforce my ad-blocking, but now I feel downright clever for being a little on the paranoid side. NoScript, Ghostery, and uBlock Origin make for a very smooth experience, if you don't mind the minimal "hassle" of managing permissions in NoScript, which is pretty easy.

There are definitely some websites where I have to find the right combination of permissions just to use them, or sometimes even to view them. Although most of the time, if it takes more than one or two tries of selectively enabling scripts, I just give up and figure the site isn't worth the hassle.
 
Upvote
31 (31 / 0)

rabish12

Ars Legatus Legionis
16,983
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.

As I understand it, the APIs are tools to fetch information more easily (which is important functionality for developers) and to determine the size and timing of resource retrieval (which is also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least from my understanding) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that comes back, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.
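
A rough sketch of that compression oracle, using Node's zlib purely for illustration - the page contents, the secret token, and the guesses below are all made-up values, and a real attack measures sizes on the wire rather than in-process:

```typescript
import { gzipSync } from "zlib";

// Hypothetical page that reflects attacker-controlled input (e.g. a search
// term) into the same compressed response that also carries a secret.
function buildResponse(reflectedInput: string): string {
  const secret = "csrf_token=9f8a2c71d0b4e6";
  return `<html><body>You searched for: ${reflectedInput}. ${secret}</body></html>`;
}

// Compressed size of the response when the attacker injects a guess.
function compressedSize(guess: string): number {
  return gzipSync(buildResponse(`csrf_token=${guess}`)).length;
}

// A guess that matches more of the secret compresses better, because DEFLATE
// replaces the repeated substring with a short back-reference. A larger output
// therefore implies a worse guess - which is the oracle described above.
console.log("wrong   :", compressedSize("000000000000000"));
console.log("partial :", compressedSize("9f8a2c000000000"));
console.log("correct :", compressedSize("9f8a2c71d0b4e6"));
```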
 
Upvote
4 (9 / -5)
ylk1 said:
Oh. No. We are f*cked.
The only other, even worse thing I'm expecting to hear at some point is that Google Chrome's password autofill has been hacked.

I have seen autofill save private information like date of birth because of poor coding on the developer's part.
 
Upvote
-11 (1 / -12)

Elgonn

Wise, Aged Ars Veteran
122
Galeran said:
So, as a web developer, if I were to include some <!-- random-length random-data comment --> dynamically generated in all my pages, would that help? At least, those pages that are dynamically generated; those that aren't, aren't going to have PII anyway.

The issue with this is that it doesn't also modify traffic generated by the client. You'd need to have random information included in the GET/POST data as well.
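
For what it's worth, a sketch of what "random information in the GET/POST data" could look like on the client side. The "_pad" parameter name and the 1-64 character range are assumptions for illustration, and as noted further down the thread, purely random padding only adds noise an attacker can average away, so treat this as an illustration rather than a vetted defense:

```typescript
// Wrap fetch() so every request carries a dummy parameter of random length,
// making request sizes vary independently of their real contents.
function randomPad(maxLen: number = 64): string {
  const chars = "abcdefghijklmnopqrstuvwxyz0123456789";
  const len = 1 + Math.floor(Math.random() * maxLen);
  let out = "";
  for (let i = 0; i < len; i++) {
    out += chars[Math.floor(Math.random() * chars.length)];
  }
  return out;
}

async function paddedFetch(url: string, init?: RequestInit): Promise<Response> {
  const u = new URL(url, location.href);
  u.searchParams.set("_pad", randomPad()); // server should simply ignore this
  return fetch(u.toString(), init);
}

// Usage: await paddedFetch("/account"); the request length now varies per call.
```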
 
Upvote
11 (11 / 0)

DaveSimmons

Ars Tribunus Angusticlavius
9,496
rabish12 said:
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.

As I understand it, the APIs are tools to fetch information more easily (which is important functionality for developers) and to determine the size and timing of resource retrieval (which is also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least from my understanding) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that comes back, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.

Only developers doing page optimization or site health checks need this functionality, so these functions could be disabled from working unless a developer command line or menu item was toggled on.
 
Upvote
29 (30 / -1)

NorthGuy

Ars Praetorian
453
Subscriptor
rabish12 said:
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.

As I understand it, the APIs are tools to fetch information more easily (which is important functionality for developers) and to determine the size and timing of resource retrieval (which is also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least from my understanding) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that comes back, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.

Are you insane? If the APIs allow functionality that can be exploited in such a catastrophic way then they should be removed. Do you remember ActiveX at all?
 
Upvote
15 (20 / -5)
rabish12 said:
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.

As I understand it, the APIs are tools to fetch information more easily (which is important functionality for developers) and to determine the size and timing of resource retrieval (which is also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least from my understanding) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that comes back, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.

I certainly agree that the root cause here is HTTP/TCP, but the path of least resistance isn't going to be changing HTTP or TCP (e.g. the BREACH exploit has been around for years with no fix in sight).

I don't agree that APIs don't have gaps or exploits. APIs can and should enforce policies and restrictions that provide a certain level of security and privacy. There may or may not be a way to build limitations into these APIs to prevent this from happening, but it would be pretty negligent not to explore the possibility. An outright block on the APIs may be unnecessary at this time, but it shouldn't be off the table, especially if this exploit gains steam.
 
Upvote
23 (23 / 0)

rabish12

Ars Legatus Legionis
16,983
DaveSimmons said:
rabish12 said:
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.

As I understand it, the APIs are tools to fetch information more easily (which is important functionality for developers) and to determine the size and timing of resource retrieval (which is also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least from my understanding) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that comes back, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.

Only developers doing page optimization or site health checks need this functionality, so these functions could be disabled from working unless a developer command line or menu item was toggled on.
For the Resource Timing API, that's true. The point I was trying to make is more that it has exposed an underlying problem rather than being a problem in and of itself, so there's not really anything to "fix" in the API - it's doing what it's meant to do, and it's a flaw on the HTTP/HTTPS side of things that actually makes the exploit possible. Basically, the underlying problem should be fixed properly rather than permanently disabling the API functionality as a stopgap measure around it.
 
Upvote
-1 (6 / -7)

rabish12

Ars Legatus Legionis
16,983
AVeryConcernedCitizen said:
rabish12 said:
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.

As I understand it, the APIs are tools to fetch information more easily (which is important functionality for developers) and to determine the size and timing of resource retrieval (which is also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least from my understanding) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that comes back, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.

I certainly agree that the root cause here is HTTP/TCP, but the path of least resistance isn't going to be changing HTTP or TCP (e.g. the BREACH exploit has been around for years with no fix in sight).
It is, yes. My problem is mostly that it's a bit of a gaffer tape fix that covers the underlying problem up rather than fixing it. Blocking the APIs temporarily might make sense, but the issue that makes them a problem in the first place really needs fixing.

I don't agree that APIs don't have gaps or exploits. APIs can and should enforce policies and restrictions that provide a certain level of security and privacy. There may or may not be a way to build limitations into these APIs to prevent this from happening, but it would be pretty negligent not to explore the possibility. An outright block on the APIs may be unnecessary at this time, but it shouldn't be off the table, especially if this exploit gains steam.
I didn't say that APIs don't have gaps or exploits. I said that this particular attack isn't relying on gaps or exploits in the APIs. It's relying on the fact that the APIs do literally the exact thing that they're meant to do and do it correctly.
 
Upvote
10 (11 / -1)

rabish12

Ars Legatus Legionis
16,983
NorthGuy said:
rabish12 said:
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.

As I understand it, the APIs are tools to fetch information more easily (which is important functionality for developers) and to determine the size and timing of resource retrieval (which is also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least from my understanding) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that comes back, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.

Are you insane? If the APIs allow functionality that can be exploited in such a catastrophic way then they should be removed. Do you remember ActiveX at all?
You either didn't read my post, didn't understand it, or have absolutely no idea why ActiveX was so awful and exploitable.
 
Upvote
0 (7 / -7)
DaveSimmons said:
rabish12 said:
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.

As I understand it, the APIs are tools to fetch information more easily (which is important functionality for developers) and to determine the size and timing of resource retrieval (which is also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least from my understanding) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that comes back, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.

Only developers doing page optimization or site health checks need this functionality, so these functions could be disabled from working unless a developer command line or menu item was toggled on.


Interesting. Is there functionality today that requires a toggle to be used? If so, I'm assuming these toggles aren't easily turned on (to prevent a third party from turning on "developer" mode and then using the APIs)?
 
Upvote
0 (0 / 0)

rabish12

Ars Legatus Legionis
16,983
AVeryConcernedCitizen said:
DaveSimmons said:
rabish12 said:
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Wouldn't the quickest fix be to have browsers reject the Fetch and Resource Timing APIs until those APIs can be updated to prevent this attack?
I don't think the APIs can be updated to prevent this attack, since the core problem is with HTTPS itself.

As I understand it, the APIs are tools to fetch information more easily (which is important functionality for developers) and to determine the size and timing of resource retrieval (which is also important for developers). They don't have gaps or exploits that enable this attack - they're functioning exactly as intended. The issue (at least from my understanding) is that commonly used HTTP compression lets an attacker determine whether a guess at a value is accurate by examining the size of the resource that comes back, since a larger response implies an incorrect guess. Browsers could disable the APIs, but then they're mostly just removing a legitimately useful tool - something that lets developers gauge resource size and transfer times - to monkey-patch a deeper issue.

Only developers doing page optimization or site health checks need this functionality, so these functions could be disabled from working unless a developer command line or menu item was toggled on.


Interesting. Is there functionality today that requires a toggle to be used? If so, I'm assuming these toggles aren't easily turned on (to prevent a third party from turning on "developer" mode and then using the APIs)?
There are a lot of things in most browsers that need to be toggled in order to be used, and they generally have to be toggled through the browser menus or interface in some way so that attackers have no way of accessing them. It's actually the standard way that new standards features are introduced now, with a flag somewhere in the browser enabling them until the standard has been accepted and the feature is ready for full release.
 
Upvote
6 (7 / -1)

deus01

Ars Praefectus
3,184
Subscriptor++
Samurai Niigel said:
"The exploit is notable because it doesn't require a man-in-the-middle position. Instead, an end user need only encounter an innocuous-looking JavaScript file hidden in a Web advertisement or hosted directly on a Web page."

NoScript.

I always saw that as a way to enforce my ad-blocking, but now I feel downright clever for being a little on the paranoid side. NoScript, Ghostery, and uBlock Origin make for a very smooth experience, if you don't mind the minimal "hassle" of managing permissions in NoScript, which is pretty easy.

Ghostery makes money by selling information on the kinds of things users block (though this can be disabled, if you trust their closed-source code). You should consider using Disconnect instead.
 
Upvote
6 (9 / -3)
rabish12 said:
AVeryConcernedCitizen said:
So isn't this more of a JavaScript issue than an HTTPS issue? Specifically, isn't the true culprit here the newly approved JavaScript APIs?

Interesting. Is there functionality today that requires a toggle to be used? If so, I'm assuming these toggles aren't easily turned on (to prevent a third party from turning on "developer" mode and then using the APIs)?
There are a lot of things in most browsers that need to be toggled in order to be used, and they generally have to be toggled through the browser menus or interface in some way so that attackers have no way of accessing them. It's actually the standard way that new standards features are introduced now, with a flag somewhere in the browser enabling them until the standard has been accepted and the feature is ready for full release.

You're probably spot on with this then, assuming there's really no major use case outside of optimization and/or health checks. Make these APIs inoperable by default.
 
Upvote
7 (7 / 0)

rabish12

Ars Legatus Legionis
16,983
AVeryConcernedCitizen said:
You're probably spot on with this then, assuming there's really no major use case outside of optimization and/or health checks. Make these APIs inoperable by default.
I could actually imagine a few uses that would require having the API working in a user's browser (mainly to do with broader gathering of performance metrics), but they aren't exactly pressing. My issue is still just with the idea of disabling an API as the ultimate means of dealing with a problem that it exposes but isn't responsible for, but as you already mentioned, progress on changes to HTTP/HTTPS is really slow, so I suppose that's not the most practical way to look at it.

Oh, and it's not both APIs that are a problem. Fetch on its own is perfectly fine, and has a lot of legitimate uses. It's basically just the Resource Timing API that causes the problem from what I can tell.
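
For anyone who hasn't used it, this is roughly the data the Resource Timing API hands to any script on the page (for cross-origin resources the size fields are zeroed unless the server opts in with a Timing-Allow-Origin header) - the legitimate performance-measurement use being described here:

```typescript
// Read the timing/size entries the browser records for resources this page
// loaded. transferSize and encodedBodySize are the fields that make response
// sizes observable to scripts.
const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

for (const entry of entries) {
  console.log(
    entry.name,                                         // resource URL
    `${entry.transferSize} bytes on the wire`,          // headers + compressed body
    `${entry.encodedBodySize} bytes of compressed body`,
    `${(entry.responseEnd - entry.startTime).toFixed(1)} ms total`
  );
}
```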
 
Upvote
4 (5 / -1)
Elgonn said:
Galeran said:
So, as a web developer, if I were to include some <!-- random-length random-data comment --> dynamically generated in all my pages, would that help? At least, those pages that are dynamically generated; those that aren't, aren't going to have PII anyway.

The issue with this is that it doesn't also modify traffic generated by the client. You'd need to have random information included in the GET/POST data as well.

Some random, high-entropy padding would actually be really easy to add via client and server software updates.

I do find it very funny that I'm hearing about a side-channel attack with an extreme amount of overlap with how advertising content operates, on a website that has begged its readers to whitelist its ads and then dumped stuff like the LG TV ad on them a couple of years later.

You're literally telling me that I should continue NOT to unblock anyone's ad content if I want to be responsible and safe in my browsing habits. Thanks for the confirmation. I didn't think it was an accident that I've managed nearly 20 years of Internet use without malware or viruses, with no AV and a tendency not to continuously update browsers, Java, or Flash, all while downloading plenty of content of questionable provenance. What has protected me? Good firewall policies and judicious ad-blocking (plus always showing file extensions). Heck, it's the main reason I used Opera all the way back in the '90s (that, and dial-up - I had to cut some bandwidth-sucking crap out).
 
Upvote
5 (9 / -4)

tgbyhn

Seniorius Lurkius
19
Aelinsaar said:
Samurai Niigel said:
"The exploit is notable because it doesn't require a man-in-the-middle position. Instead, an end user need only encounter an innocuous-looking JavaScript file hidden in a Web advertisement or hosted directly on a Web page."

NoScript.

I always saw that as a way to enforce my ad-blocking, but now I feel downright clever for being a little on the paranoid side. NoScript, Ghostery, and uBlock Origin make for a very smooth experience, if you don't mind the minimal "hassle" of managing permissions in NoScript, which is pretty easy.
How does NoScript compare to uMatrix? Do they largely do the same thing?
 
Upvote
0 (0 / 0)

bwcbwc

Ars Centurion
276
Subscriptor
isparavanje said:
ylk1 said:
Oh. No. We are f*cked.
The only other, even worse thing I'm expecting to hear at some point is that Google Chrome's password autofill has been hacked.

While it is a huge vulnerability that is likely to affect many people, I doubt many of us geeks would be affected too much, since it relies on third-party scripts and I think most computer-literate folks use some kind of ad blocker.

Workarounds for site developers include padding with a random-length string, padding responses up to the full MTU size of the network connection, and disabling compression during authentication. It's a pretty big hole, but not impossible to fix.
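
A rough sketch of the "disable compression during authentication" and block-padding ideas in a toy Node handler - the 1400-byte block size, the cookie check, and the page content are illustrative assumptions, not production values:

```typescript
import { createServer } from "http";
import { gzipSync } from "zlib";

const BLOCK = 1400; // pad plaintext responses up to a multiple of ~one packet

createServer((req, res) => {
  const body = "<html><body>account page, csrf_token=9f8a2c71d0b4e6</body></html>";
  const authenticated = (req.headers.cookie ?? "").includes("session=");

  if (authenticated) {
    // Secret-bearing responses: no compression, so there is nothing for a
    // BREACH/HEIST-style size probe to squeeze; pad to a block boundary so
    // the length reveals less about the content.
    const padLen = (BLOCK - (body.length % BLOCK)) % BLOCK;
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(body + " ".repeat(padLen));
  } else {
    // Public content keeps compression for performance.
    res.writeHead(200, { "Content-Type": "text/html", "Content-Encoding": "gzip" });
    res.end(gzipSync(body));
  }
}).listen(8080);
```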
 
Upvote
3 (4 / -1)

J.King

Ars Praefectus
3,824
Subscriptor
aaronb1138 said:
Elgonn said:
Galeran said:
So, as a web developer, if I were to include some <!-- random-length random-data comment --> dynamically generated in all my pages, would that help? At least, those pages that are dynamically generated; those that aren't, aren't going to have PII anyway.

The issue with this is that it doesn't also modify traffic generated by the client. You'd need to have random information included in the GET/POST data as well.

Some random, high-entropy padding would actually be really easy to add via client and server software updates.

I do find it very funny that I'm hearing about a side-channel attack with an extreme amount of overlap with how advertising content operates, on a website that has begged its readers to whitelist its ads and then dumped stuff like the LG TV ad on them a couple of years later.

You're literally telling me that I should continue NOT to unblock anyone's ad content if I want to be responsible and safe in my browsing habits. Thanks for the confirmation. I didn't think it was an accident that I've managed nearly 20 years of Internet use without malware or viruses, with no AV and a tendency not to continuously update browsers, Java, or Flash, all while downloading plenty of content of questionable provenance. What has protected me? Good firewall policies and judicious ad-blocking (plus always showing file extensions). Heck, it's the main reason I used Opera all the way back in the '90s (that, and dial-up - I had to cut some bandwidth-sucking crap out).

This is just evidence that Ars can report the news without systemic bias. I consider the discontinuity a good thing.

Edit: quote fail
 
Upvote
16 (16 / 0)

Lavonheim

Wise, Aged Ars Veteran
185
Subscriptor++
tgbyhn said:
Aelinsaar said:
Samurai Niigel said:
"The exploit is notable because it doesn't require a man-in-the-middle position. Instead, an end user need only encounter an innocuous-looking JavaScript file hidden in a Web advertisement or hosted directly on a Web page."

NoScript.

I always saw that as a way to enforce my ad-blocking, but now I feel downright clever for being a little on the paranoid side. NoScript, Ghostery, and uBlock Origin make for a very smooth experience, if you don't mind the minimal "hassle" of managing permissions in NoScript, which is pretty easy.
How does NoScript compare to uMatrix? Do they largely do the same thing?

No, uMatrix is like RequestPolicy and NoScript on steroids; NoScript only lets you enable/disable JavaScript on specific (sub)domains, RequestPolicy only lets you enable/disable cross-domain resources based on source and destination. uMatrix lets you enable/disable per resource category (JavaScript/CSS/images/AJAX/iframes/etc.) per source (sub)domain per destination (sub)domain.

There's also uBlock Origin, which is another ad blocker.

That said, I haven't tried converting Firefox on my main computer to use uMatrix; I only have it on my personal and work laptops.
 
Upvote
0 (1 / -1)
I thought it's been known for a while that compression before encryption weakens the encryption? (In way oversimplified terms, compression tries to find patterns, while encryption tries to mask them.)

This is a new way to break encryption that was weakened by prior compression, but isn't the real problem doing the compression on something that's about to be encrypted?
 
Upvote
3 (4 / -1)

bmm6o

Seniorius Lurkius
40
Galeran said:
So, as a web developer, if I were to include some <!-- random-length random-data comment --> dynamically generated in all my pages, would that help? At least, those pages that are dynamically generated; those that aren't, aren't going to have PII anyway.

Random padding is almost never the right answer. You'll add noise to the signal, but the signal is still there and can be extracted with more trials. What you can do is add unpredictable but deterministic padding, based e.g. on a MAC of the response.
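
A sketch of what that could look like server-side - the key handling and the 0-255 byte pad range are assumptions for illustration, not a spec:

```typescript
import { createHmac } from "crypto";

// Pad length is derived from an HMAC of the response body: identical responses
// always get identical padding (so repeated trials reveal nothing new), but an
// attacker without the key can't predict or cancel the pad.
const PAD_KEY = "server-side secret"; // would come from configuration in practice

function padDeterministically(body: string): string {
  const mac = createHmac("sha256", PAD_KEY).update(body).digest();
  const padLen = mac[0]; // 0-255 bytes, fixed for a given body
  return body + "<!--" + "x".repeat(padLen) + "-->";
}

// Usage: send padDeterministically(html) instead of html.
```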
 
Upvote
5 (5 / 0)

chsnyder

Seniorius Lurkius
23
The obvious workaround is to disable compression on encrypted connections.

It sounds radical and break-the-web and what-about-mobile, but it's just text. It's just text!

This entire article, comments included, is 52KB. The first screenshot alone is 116KB.

I'd rather have a few milliseconds more lag than leaked session tokens, thanks very much.
 
Upvote
13 (13 / 0)

Elgonn

Wise, Aged Ars Veteran
122
Lavonheim said:
tgbyhn said:
How does NoScript compare to uMatrix? Do they largely do the same thing?

No, uMatrix is like RequestPolicy and NoScript on steroids; NoScript only lets you enable/disable JavaScript on specific (sub)domains, RequestPolicy only lets you enable/disable cross-domain resources based on source and destination. uMatrix lets you enable/disable per resource category (JavaScript/CSS/images/AJAX/iframes/etc.) per source (sub)domain per destination (sub)domain.

There's also uBlock Origin, which is another ad blocker.

That said, I haven't tried converting Firefox on my main computer to use uMatrix; I only have it on my personal and work laptops.

They aren't really comparable. uMatrix blocks content; NoScript tries to manage JavaScript and make it 'safer'. Sometimes NoScript blocks content, and uMatrix is better at blocking content in general, but that isn't the point of NoScript.

Just stopping scripts from loading isn't the goal. NoScript also protects against attacks that can occur even when JavaScript is allowed, like XSS and clickjacking (via its ABE and ClearClick features), and lets you replace questionable scripts with friendlier surrogate scripts that maintain compatibility but remove unwanted behavior.

You shouldn't treat either extension as a replacement for the other. They're mostly orthogonal.
 
Upvote
3 (3 / 0)

rabish12

Ars Legatus Legionis
16,983
Rene Gollent said:
Out of curiosity since I didn't see it mentioned in the article, is there any particular reason this can't just be mitigated the same way CRIME was, ergo by simply disabling compression?
I can't see any reason why it couldn't be. Expecting web hosts to actually do it isn't a very safe bet, though.
 
Upvote
-1 (0 / -1)