I think some people on the SVG design committee were aiming to replace Flash for things like browser games, and wanted animations and javascript and so on to support that role.
That led to the weird situation where browsers have two ways of embedding an SVG into a web page - embed it in an <img> tag and the javascript won't run, but embed it in an <iframe> and it will (but of course iframe height can't auto-size...)
The javascript also means pretty much no user-generated-content sites allow the upload of SVGs. Wikipedia is the only place I can think of - and even they serve the SVG as a PNG almost everywhere.
You can also embed in <object>.
You can also just throw an SVG element straight into your html
xmlns namespaces for the win!
Xmlns is not used in HTML5. You use xmlns to embed html in svg, not the other way around.
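To illustrate (a minimal sketch, not from the thread): inline SVG in an HTML5 document needs no namespace declaration because the HTML parser switches into SVG mode on its own, while a standalone .svg embedding HTML via <foreignObject> is where xmlns comes in.

    <!-- Inline SVG inside HTML5: no xmlns required, the parser handles it -->
    <svg width="100" height="100">
      <circle cx="50" cy="50" r="40" fill="teal"/>
    </svg>

    <!-- Standalone .svg embedding HTML: here the xhtml namespace is needed -->
    <svg xmlns="http://www.w3.org/2000/svg" width="200" height="50">
      <foreignObject width="200" height="50">
        <p xmlns="http://www.w3.org/1999/xhtml">Hello from HTML inside SVG</p>
      </foreignObject>
    </svg>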
I miss XML. It made so much sense. XSLT was awesome.
It’s still out there and supported. Load this page and then view the source: https://emacsformacos.com/atom/release
I want to make something clear: <img src=file.svg> will not execute any scripts nor load foreignObject.
For this to work, the SVG has to be loaded in an iframe or in a new tab. Or it could be inlined in the HTML.
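For example (a minimal sketch of the mechanism, not the actual payload; evil.svg is a made-up name): an SVG like this stays inert behind an <img> tag but runs its script when opened directly, iframed, or inlined.

    <!-- evil.svg -->
    <svg xmlns="http://www.w3.org/2000/svg" width="300" height="150">
      <rect width="300" height="150" fill="#eee"/>
      <text x="20" y="80">Looks like an innocent picture</text>
      <script>
        // Executes only in a document context (direct open, iframe, inline SVG)
        alert('script inside the SVG ran');
      </script>
    </svg>

    <!-- Renders the image, script does not run -->
    <img src="evil.svg">

    <!-- Script runs -->
    <iframe src="evil.svg"></iframe>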
Nothing special about SVG really, as long as you (Facebook) treat SVG files as images and don't inline them.
The SVG part only really comes in as a way to hide script tags from anyone looking at the network requests, but even then it seems that the specific SVG was crafted to download further scripts.
So what's the issue here exactly? It seems that Facebook is still somehow affected by XSS? Neat.
The Malwarebytes article[1] explains that the users downloaded (possibly automatically) the SVG files and then opened them in the default viewer, which is MS Edge.
[1] https://www.malwarebytes.com/blog/news/2025/08/adult-sites-t...
Thanks! Yes that would work, just like an .html file would work. The advantages of SVG in this case are
- the OS previews it as an image, but on click it opens a website (which to be fair, once you click on a downloaded file, you're already done)
- SVGs are allowed in any "image/*" form, bypassing certain filters
That doesn't explain anything. Then what?
All the more reason to block all JS by default with add-ons like NoScript or uBO and manage a whitelist.
It's a bit annoying the first few days, but then the usual sites you frequent will all be whitelisted and all that's left are random sites you come across infrequently.
This used to be the case many years ago. But these days practically every site pulls in content from several other sites, sometimes dozens. Fine-tuning NoScript to get such a site to work without obscure breakage will take a long time of trial & error, reloading again & again. Now consider that you have to do this for every one of your regular sites.
NoScript is just too painful for people who want to just browse the web. It's the Gentoo of browser extensions. People with massive time & patience can do it, yes, but the rest of us are best served by uBlock & standard browser protections.
> But these days practically every site pulls in content from several other sites, sometimes dozens.
They do, but as a long-time NoScript user I can tell you from personal experience that this content rarely does anything important, and leaving it out often improves your UX. Problems like you describe pop up... from time to time, for individual sites, maybe a few times a year, and definitely not on "regular sites".
I have an entirely separate browser profile for when I absolutely need to use a site and just don't have the time to tinker around with script permissions.
Depending on how you've set things up, you can also just use private browsing / incognito to drop extensions.
> But these days practically every site pulls in content from several other sites, sometimes dozens.
And excluding that content almost invariably improves the page.
> Fine-tuning NoScript to get such a site to work without obscure breakage will take a long time of trial & error, reloading again & again. Now consider that you have to do this for every one of your regular sites.
No, not really. Usually just the top-level domain is enough. Very occasionally a site will have some other domain they serve from, and it's usually obvious which one to allowlist. It takes like, ten seconds, and you only need to do it once per domain if you make the allowlisting permanent. If you get really impatient, you can just allow all scripts for that tab and you're done.
It is some extra work, and I won't disagree if you think it's too much, but you're really overselling how much extra work it is.
> the gentoo of browser extensions
I was already convinced, you don't need to keep selling it ;)
Breaking such sprawling, messy web sites to the degree that you give up and close the tab is a feature, not a bug.
I've used script-blocker extensions for many years and consistently only whitelist a few widely used package repos. This means I notice when a web site I frequent adds a new script source.
This ought to be the default in every common web browser, just as you should have to look at the data sharing "partners" and decide whether they're benign enough for your taste.
>It's a bit annoying the first few days, but then the usual sites you frequent will all be whitelisted and all that's left are random sites you come across infrequently.
How does this work in reality? Do you just whitelist every site you come across if it's broken? What's the security advantage here? Or do you bail if it requires javascript? What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
> Do you just whitelist every site you come across if it's broken?
I look at the breakage, consider how the site was promoted to me, and make a decision.
> What's the security advantage here?
Most of the bad stuff comes from third parties and doesn't provide essential functionality. A whitelist means you're unblocking one domain at a time, starting with the first party. If there's still an issue, it's usually clear what needs unblocking (e.g. a popular CDN, or one with a name that matches the primary domain) and what's a junk ad server or third-party tracking etc. You can even selectively enable various Google domains for example so that GMail still works but various third-party Google annoyances are suppressed.
> What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
Depends on trust levels of course, but there's at least some investigation that can be done to see that it actually is coming from Anubis or Cloudflare.
First, I use uBO + NoScript + ClearURLs (removes tracking parameters from URLs) + FastForward (circumvents sites like adfly) + a pop-up blocker of your choice (stronger blocking than the default, and whitelist-only in my case). They're all popular add-ons on Firefox and should also be available on Chrome, or variants of them. You don't need them all; uBO is more than fine for most use cases, I've just gotten used to this setup over the years.
>Do you just whitelist every site you come across if it's broken?
Mostly, yes, often temporarily for that session, unless I do not trust a website, then I leave. How I deem what is trustworthy or not is just based on my own browsing experience I guess.
>What's the security advantage here?
You can block scripts, frames, media, webgl... Meaning no ads, no JS... Which helps minimize the more common ways to spread malware, or certain dark patterns, as well as just making browsing certain sites more pleasant without all the annoying stuff around.
>Or do you bail if it requires javascript?
If I don't trust a website, yes.
>What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
Not all sites require JS to work, or when they do, they do not require every single JS domain on the website to work. An example would be many of the popular news sites, which sometimes try to load JS from 10 different domains or more and only really require one or none to be usable. Take CNN: I do not need to whitelist its main domain via NoScript to read articles or navigate, but the moment I whitelist CNN.com, I see a flood of other domains to whitelist which are definitely not needed, like CNN.io, cookielaw.org, optimizely.com, etc...
Take Hacker News. It's viable without JS, I can read, navigate and comment, but if I want to use the search function, I need to whitelist algolia.com (which powers the search) or else I just see "This page will only work with JavaScript enabled". The search function not working is the most common issue you'll find if you block all JS by default.
>You can block scripts, frames, media, webgl... Meaning no ads, no JS... Which helps minimize the more common ways to spread malware, or certain dark patterns, as well as just making browsing certain sites more pleasant without all the annoying stuff around.
>Not all sites require JS to work, or when they do, they do not require every single JS domain on the website to work. An example would be many of the popular news sites, which sometimes try to load JS from 10 different domains or more and only really require one or none to be usable. Take CNN: I do not need to whitelist its main domain via NoScript to read articles or navigate, but the moment I whitelist CNN.com, I see a flood of other domains to whitelist which are definitely not needed, like CNN.io, cookielaw.org, optimizely.com, etc...
Don't the default uBlock filter lists, plus maybe an extension for auto-closing cookie banners, get most of those?
uBO has a different purpose. It's essentially a blacklist with sophisticated measures in place to fix the breakage it causes. In many cases it selectively filters content that is otherwise allowed through. IIRC YouTube is an example of an extensive cat-and-mouse game of that sort.
A whitelist approach is less nuanced but far more extensive. It defaults to defending you against unknown vulnerabilities.
uBO can also block JS, yes, and I use both add-ons, but I find NoScript's UI to be more intuitive to use to manage JS, and I've been using it for years now.
> Or do you bail if it requires javascript?
It depends, but frequently, yes. e.g. If I were about to read a tech blog, and see it's from someone that can't make a couple paragraphs work without scripting, then that raises the chance that whatever they had to say was not going to be valuable since they evidently don't know the basics.
It's the frontend version of people writing about distributed clusters to handle a load that a single minipc could comfortably handle.
>It depends, but frequently, yes. e.g. If I were about to read a tech blog, and see it's from someone that can't make a couple paragraphs work without scripting, then that raises the chance that whatever they had to say was not going to be valuable since they evidently don't know the basics.
Seems only narrowly applicable. I can see how you can use this logic to discount articles like "how to make a good blog" or whatever, but that's presumably only a tiny minority of the articles you'd read. If the topic is literally anything else it doesn't really hold. It doesn't seem fair to discount whatever an AI engineer or DBA has to say because they don't share the same fanaticism for lightweight sites as you. On the flip side, I see plenty of AI-generated slop that works fine with javascript disabled, because it's built on some sort of SaaS (think Medium) or static site generator.
Generally speaking, getting good performance out of a database mostly comes down to understanding how the thing works and then not making it do a stupid amount of unnecessary work. I expect someone who understands that would also not e.g. fetch a script that then fetches the actual text instead of just sending the text. For example Markus Winand's sites work just fine with javascript off.
For ML stuff I'd let e.g. mathjax fly, but I expect the surrounding prose to show up first to get me interested enough to enable scripts.
It's not an exact filter, but it gives some signal to feed into the "is this worth my time" model.
It's also odd to characterize it as fanaticism: scriptless sites are the default. If you just type words, it will work. You have to go out of your way to make a Rube Goldberg machine. I'm not interested in Rube Goldberg machines or the insights of the people that enjoy making them. Like if you own a restaurant and make your menu so that it's only available on a phone, I'll just leave. I don't appreciate the gimmick. Likewise for things that want me to install an app or use a cloud. Not happening.
Rather than fanaticism, I'm going to second that I find it to be a useful signal. The people that I find worthwhile to read generally have an interest in tinkering and exhibit curiosity about what's under the hood. Same reason HN lends itself to better discussions than other venues.
Very approximately: there's a group that took the time to understand and attempt to build something robust, a group that has no interest in the web except as a means to an end and so threw it at a well-reviewed static site generator, and a group that spent time futzing around with a Rube Goldberg machine yet didn't bother to seek deeper understanding.
> because they don't share the same fanaticism for lightweight sites as you.
For me, it's not about sites being lightweight, it's about sites not being trustworthy enough to allow them to run code on my machine.
A bit of fanaticism might be exactly what is needed to push back against the web becoming completely unusable.
> Or do you bail if it requires javascript?
I do. If a site just doesn't work without JS, it's not likely to be a site that is valuable to me so nothing is lost.
> Do you just whitelist every site you come across if it's broken? What's the security advantage here?
Most websites load their required scripts from their own domain. So you allowlist the domain you are visiting, and things just work. However, many websites also load JS from like 20 other domains for crap like tracking, ads, 3rd party logins, showing cookie popups, autoplaying videos, blah blah blah. Those stay blocked.
Try it out: Visit your local news website, open your uBlock Origin panel, and take a look at all the domains in the left half. There will probably be dozens of domains it's loading JS from. 90% of the time, the only one you actually need is the top one. The rest is 3rd party crap you can leave disabled.
And yeah, if a website doesn't work after allowlisting two or three domains, I usually just give up and leave. Tons of 3rd party JS is a strong indicator that the website is trying to show you ads or exploit you, so it's a good signal that it's not worth your time.
For me, if the site is broken and I'm interested in the content, I sometimes enable JavaScript temporarily without adding it to my whitelist. Deciding what to do when I encounter a broken site is the easy part.
The challenge is sites like StackOverflow which don't completely break, but have annoying formatting issues. Fortunately, uBlock lets you block specific elements easily with a few clicks, and I think you can even sync it to your phone.
>For me, if the site is broken and I'm interested in the content, I sometimes enable JavaScript temporarily without adding it to my whitelist. Deciding what to do when I encounter a broken site is the easy part.
But that basically negates all security benefits, because all it takes to get a 0day payload to run is to make the content sufficiently enticing and make javascript "required" for viewing the site. You might save some battery/data usage, but if you value your time at all I suspect any benefit is going to be eaten by having to constantly whitelist sites.
I don't block javascript for security reasons, I block it for performance, privacy, and UX reasons. If there's a 0day that can be exploited by random javascript in the browser, UBlock won't save us.
> if you value your time at all I suspect any benefit is going to be eaten by having to constantly whitelist sites.
I don't constantly whitelist sites, only the ones I use regularly (like my email provider). Temporarily enabling JS on a broken site doesn't add it to my whitelist and only takes three clicks (which is muscle memory at this point):
1. Click to open UBlock window
2. Click to allow javascript temporarily
3. Click to refresh the page
Personally, if some random link I click doesn't work without scripts at all, chances are that it's not worth the effort and potential security/privacy compromise anyway. But in many cases, the content is readable, with perhaps some layout breakage. Might even get around paywalls by blocking JS.
Even if other users do indeed whitelist everything needed in order to make sites work, they will still end up with many/most of the third-party scripts blocked.
Investing in NoScript can actually make pages faster, even if you end up enabling a bunch of javascript for functionality. For example, I remember having to whitelist only about half the resources used by homedepot.com. The rest was just shameless surveillance bloat, continually backhauling gobs of data as you were merely viewing the page. The site loaded and scrolled much quicker without it.
FYI, uBlock Origin has javascript blocking features just like NoScript, so if you're already using it as your ad blocker, you don't need a separate extension to block javascript too
This is absolutely true. But personally I find NoScript's UI more intuitive to use for JS domain blocking (mostly because I've been using it for years now). I also used to use uMatrix by the same author as uBO before it was deprecated on Chromium browsers.
Disabling third-party cookies also gets rid of lots of potential abuse vectors with negligible impact on useful functionality.
> Facebook regularly shuts down accounts that engage in these sorts of abuse.
But does not fix the CSRF vulnerability, apparently.
Probably because they need it themselves for data collection.
It's not a CSRF vulnerability if it's the intended behaviour.
In a world where same-site cookies are the default you have to actively opt-in to this sort of thing.
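In that world it looks roughly like this (sketching the general mechanism, not Facebook's actual configuration): a cookie left at the modern Lax default is withheld from cross-site iframe and subresource requests, so an embedded like widget can't identify the visitor; only an explicit SameSite=None; Secure cookie travels cross-site, which is what social widgets need and what like-jacking rides on.

    Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=Lax
    Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=None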
If the user did not intend to do the action that the malicious site had their browser perform for their account, it is by definition CSRF. The other site forged a request from the user.
Facebook might not care, but it is obviously a vulnerability. Sites can forge likes from users (which IIRC appear on timelines?).
> Facebook might not care
Facebook does care. Allowing like buttons on third-party sites was (at least historically) a major part of their business. It's not like they are just being apathetic here - they actively want people to be able to like things from outside of Facebook, and they put in effort to make that happen.
They're clearly aware of the vulnerability if they close accounts for exploiting it. Techniques to prevent it are well-known and allegedly they have lots of skilled engineers. But those techniques would increase friction a little bit, so they've evidently decided they don't care about the vulnerability.
Seems less like a vulnerability and more like a violation of ToS. Doesn't each involved party have an account? Is there a way for an otherwise unrelated third party to exploit this?
It's not that techniques are known to prevent this issue. It is prevented by default. Facebook has to take active steps to make this work. If they did nothing there would be no vulnerability.
> The Scalable Vector Graphics format is an open standard for rendering two-dimensional graphics.
It would be nice if we had one of those, but SVG is not it, at least not unless you’re willing to gloss HTML as “an open format for rendering reflowable text”. SVG is a full platform for web applications with fixed-layout graphics and rich animations, essentially Flash with worse development tools.
There have been some attempts to define a subset of SVG that represents a picture, like SVG Tiny, but that feels about as likely to succeed as defining JSON by cutting things out of JavaScript. (I mean, it kind of worked for making EPUB from HTML+CSS... If you disregard all the insane feature variation across e-readers that is.) Meanwhile, other vector graphics formats are either ancient and not very common (CGM, EPS, WMF/EMF) or exotic and very not common (HVIF, IconVG, TinyVG).
(My personal benchmark for an actual vector format would be: does it allow the renderer to avoid knowing the peculiarities of Arabic, Burmese, Devanagari, or Mongolian?)
As I recall, SVG was Adobe attempting to supplant Flash by publishing an open standard. It didn't work at the time, so they just bought Macromedia.
I still miss Flash
Me too. Adobe just murdered it. All they had to do was convert the exporter to html5 but they messed it up.
The best thing about flash was that it let non-coders create interactive things. The editor with a timeline + code you could stick to objects was the best feature.
I had to learn to code properly after flash died, which probably was a good thing, but I still miss that way of making things.
They did that...it's called Adobe Animate. It's the same timeline-based animation tool, but now it exports JavaScript, and it's existed as part of the Adobe Suite this whole time. They just renamed it around 2015 to disassociate it from Flash Player.
It wouldn't surprise me if the "issue" were just that piracy isn't as normalized as it was 20 years ago, and there are very few kids/teenagers (or even adults) that can justify $264/yr for one program unless they're using it for business purposes. So young people looking for creative outlets look elsewhere, and they learn those things instead.
It was too late. The world had moved on by then! The iPhone AppStore launched in 2008 I think and then the momentum was too much.
> The best thing about flash was that it let non-coders create interactive things.
But you are not missing Flash, you are missing Macromedia Director and co. These were wonderful tools, intuitive and easy to use. Flash was an abomination of a format, and it was dragging the web down, also security-wise; it's good that Apple and Google killed it.
>All they had to do was convert the exporter to html5 but they messed it up.
They did do that. Or am I missing something?
Much like Newgrounds it is still around.
https://haxe.org/
SVG really is just an awful format. What the market wanted was a clean, easily parseable specification for vector image data based on a solid rendering specification. What it got was an extensible HTML-like scripting language where all the vector stuff was ad hoc and poorly implemented, and where (this is the bit that absolutely drives me up the wall) the actual image data is not stored in the metadata format they chose. You have to parse this entirely different string format if you want to extract the points on your curve or whatever!
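For a concrete illustration (a generic example, nothing from the article): the XML layer only tells you there's a <path> with a string attribute called d; the geometry itself is a separate mini-language you have to parse on top of the XML.

    <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 200 160">
      <!-- d is its own grammar: M = moveto, C = cubic Bezier, S = smooth
           continuation; numbers are a comma/whitespace-separated stream
           that an XML parser hands you as one opaque string -->
      <path d="M 10 80 C 40 10, 65 10, 95 80 S 150 150, 180 80"
            fill="none" stroke="crimson" stroke-width="2"/>
    </svg>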
To be fair the format was invented in an era before gzip compression. The very compact format and silly attribute names like d are from optimizing for that.
Not sure I've heard that angle before. I'm tempted to agree, but then the choice of *xml* as the enclosing format argues strongly that efficiency wasn't at the forefront of the design criteria.
Also the dates don't work. HTTP/1.1 with gzip/compress/deflate encodings was live in browsers and servers, with inline compression, well before the standard was published in RFC 2068 in 1997. SVG's spec came four years after that, and IIRC adoption was pretty glacial as far as completeness and compliance go.
Original article: https://www.malwarebytes.com/blog/news/2025/08/adult-sites-t...
The linked article just regurgitates the source.
The Ars article links to Malwarebytes, but the Ars article is better. The headline is better, and the most interesting part is that they run code from SVG. Ars also adds context on how the same hole was used before to hijack Microsoft accounts, and also by the Russians. Whereas Malwarebytes is mostly about porn-site clickjacking to like Facebook posts (and complains about age verification). However, it has a bit more technical detail too. Read both, I guess?
What's the hole? Neither appear to say.
I guess that obfuscated JS in SVG runs? Then it downloads the script that does shady stuff
That does not explain exactly what is wrong. The site could already run JS. It did not need SVG to do it.
Yeah, I guess the original article is not clear on that. Other cases usually involved email, but this one does not.
Finally, a reason why porn in incognito mode is actually a safety mechanism.
Running facebook in incognito mode, or at least in a separate container, is also an essential safety mechanism.
Incognito shares the session, so if you have Facebook and shady sites open in incognito at the same time, you've reintroduced this attack vector.
If you want real isolation, use browser profiles.
Yes, loading more than one shady site in private mode allows them to share data. Only do one shady site at a time, then close it down.
But it's certainly good advice to check no other windows have been opened.
> Running facebook in a separate container is also an essential safety mechanism.
Yes! And that container is in a Firefox instance, accessed as a remote app (here now, but in a different container).
... or just not running Faecesbook at all.
> ... or just not running Faecesbook at all.
At one time I agreed and had even deleted my genuine FB acct. But had to create another one briefly in 2021 to find a rental - where I live now.
I still have my ancient fake FB acct for Marketplace, etc but it's walled off.
This makes no sense. How does SVG click Facebook like button? Is there a vulnerability? The post doesn't say anything like that.
Why are they clicking like buttons instead of stealing money from bank accounts then?
Yeah, at first I thought this was about a browser 0day.. but no, so where is the vulnerability? Is Facebook vulnerable?
> Security firm Malwarebytes on Friday said it recently discovered that porn sites have been seeding boobytrapped .svg files to select visitors. When one of these people clicks on the image, it causes browsers to surreptitiously register a like for Facebook posts promoting the site.
I think I'm missing something; if you can embed arbitrary JavaScript in the SVG, why is a click necessary to make that JavaScript run? And if JavaScript on your page can exploit CSRF on Facebook, why is embedding it in an SVG necessary?
> I think I'm missing something; if you can embed arbitrary JavaScript in the SVG, why is a click necessary to make that JavaScript run? And if JavaScript on your page can exploit CSRF on Facebook, why is embedding it in an SVG necessary?
A human clicking something on the site tends to get around bot detection and similar systems put in place to prevent automation. This is a basic “get the user to take an action they don’t know the outcome of” attack.
Yeah, browsers often block clicks on things unless they originate from a true user gesture.
E.g. you can't enable sound on a webpage without a real click.
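A small sketch of that behaviour (generic example; clip.mp4 and the ids are made up): calling play() on an unmuted video programmatically is typically rejected until the page has seen a genuine user gesture.

    <video id="clip" src="clip.mp4"></video>
    <button id="go">Play with sound</button>
    <script>
      const clip = document.getElementById('clip');

      // Usually rejected: no user activation yet, so unmuted autoplay is
      // blocked (the returned promise rejects, typically NotAllowedError).
      clip.play().catch(err => console.log('blocked:', err.name));

      // Allowed: runs inside a real click handler, which grants activation.
      document.getElementById('go').addEventListener('click', () => clip.play());
    </script>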
I'm shocked this attack works. I thought the last 15 years of browser dev were largely isolating domains from each other to prevent cross-site attacks, and introducing consent flows for little used and/or dangerous platform features.
Running JS inside an image format sounds like a thing they could add permissions for (or a click-to-play overlay), especially if it can talk to other sites.
> it causes browsers to surreptitiously register a like for Facebook posts promoting the site.
Wouldn't that be discovered pretty quickly, when Bob's family and friends see porn promoted to them because John apparently liked it on Facebook. Eventually, one of them would mention it to him.
"Y'know, Bob, you probably don't want to be liking that on your main Facebook account... Are you feeling OK?"
I'm curious how you can click the like button using JavaScript…
The user has to click on the image, so I think the SVG is embedding the FB like button onto the page and drawing another element on top of it to hide it.
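Something along those lines, presumably - the classic like-jacking layout looks roughly like this (a sketch only; the URL is a placeholder, not the real widget endpoint):

    <div style="position:relative; width:200px; height:60px;">
      <!-- Decoy the victim believes they are clicking -->
      <img src="thumbnail.svg" style="width:200px; height:60px;">
      <!-- Real like button, stacked on top but fully transparent, so the
           genuine click lands on it instead of the decoy -->
      <iframe src="https://social.example/like-button?post=TARGET_POST"
              style="position:absolute; top:0; left:0; width:200px; height:60px;
                     opacity:0; border:0;">
      </iframe>
    </div>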
That's why you run compartmentalised sessions so your real life wouldn't bleed into the porn habits you have.
"[...] porn sites have been seeding boobytrapped .svg files [...]"
I see what you did there ;)
People still have Facebook accounts? I genuinely don't know why anyone does at this point.
If you are a woman, did you know Facebook has been stealing menstruation data from apps and using it to target ads to you?
If you take photos with your smartphone, did you know Meta has been using them to train their AI? Even if you haven't published them on Facebook?
To say nothing of Facebook's complicity in dividing cultures and fomenting violence and hate...
> People still have Facebook accounts?
Facebook Marketplace supplanted a number of earlier sites - like Hotpads for rentals and Craigslist for cars.
In mid 2021 there were hundreds of applicants per rental listing - even on Marketplace. No one had a luxury of listing preference.
A ton of friends still use Facebook messenger as their primary way to reach them
Where is the SVG only?
“The user will have to be logged in on Facebook for this to work, but we know many people keep Facebook open for easy access.”
Well there's your problem right there.
Bog-standard CSRF is what that is. It’s essentially the second thing you guard against, right after sanitizing inputs to prevent XSS and SQL injection.
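For what it's worth, the textbook guard looks like this (a generic sketch, not anything Facebook-specific): tie each state-changing request to a per-session token a cross-site page can't know.

    <!-- Server renders a random, session-bound token into the form... -->
    <form action="/like" method="POST">
      <input type="hidden" name="csrf_token" value="f3a9c2...">
      <button>Like</button>
    </form>
    <!-- ...and rejects any POST whose csrf_token doesn't match the value
         stored in the session. A third-party page can forge the POST but
         cannot read or guess the token, so the forged request is refused. -->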