Polyfill.js is a popular open-source library for supporting older browsers. Over 100K sites embed it via the cdn.polyfill.io domain; notable users include JSTOR, Intuit, and the World Economic Forum. However, in February this year, a Chinese company bought the domain and the GitHub account. Since then, the domain has been caught injecting malware onto mobile devices via any site that embeds cdn.polyfill.io. Complaints were quickly removed (archive here) from the GitHub repository.
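For context, the vulnerable pattern is embedding the script straight from the third-party domain, something like the snippet below (the exact URL shape and feature list varied per site; this one is illustrative):

```html
<!-- Runs whatever code cdn.polyfill.io chooses to serve, on every page load -->
<script src="https://cdn.polyfill.io/v3/polyfill.min.js?features=default"></script>
```

Whoever controls that domain controls what executes on every page that embeds it.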
nah. over 100k sites ignored dependency risks, even after the original owners warned them this exact thing would happen.
the real story is 100k sites not being run appropriately.
The hackers just engaged in a little bit of technical debt collecting ;)
I’m stealing this phrase
Please do! I’ve been trying to make it stick for almost a decade!
That’s not how systemic problems work.
This is probably one of the most security ignorant takes on here.
People will ALWAYS fuck up. The world we craft for ourselves must take the “human factor” into account, otherwise we amplify the consequences of what are predictable outcomes. And ignoring predictable outcomes to take some high ground doesn’t carry far.
The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.
Damn near everything you interact with on a regular basis has been designed at some point in time with human psychology in mind. Built on the shoulders of decades of research and study results, that have matured to the point of becoming “standard practices”.
Ok, people will always fuck up, so what do you do?
The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.
So all the organizations (including public ones) getting hit by ransomware and having data stolen, is that because the consequences are not that bad? Is it not gross negligence?
I’m not sure if this is just a rhetorical question or a real one?
Because I didn’t claim it isn’t negligence. It is negligent; however, it’s not a problem solvable by just pointing fingers. It’s a problem that’s solvable through stricter regulation and compliance.
Cyber security is almost exactly the same as safety in other industries. It takes the same mindset, it manifests in the same ways under the same conditions, and it tends to only be resolved and enforced through regulation, etc.
And we all know that safety is not something solvable by pointing fingers and saying, “Well, Joe Schmo shouldn’t have had his hand in there then.” You develop processes to avoid predictable outcomes.
That’s the key phrase here: predictable outcomes. These are predictable situations with predictable consequences.
The comment above mine is effectively victim blaming; it’s just dismissing the problem entirely instead of looking at solutions. Just like an industry worker being harmed on the job because of the negligence of their job site, an incredibly large number of websites have been compromised due to the negligence of our industry.
Just like the job site worker who doesn’t understand the complex mechanics of the machine they are using to perform their work, the website owner or maintainer does not understand the complex mechanics of the dependency chains their services or sites rely on.
Just like a job site worker may not have a good understanding of risk and risk mitigation, a software engineer does not have a good understanding of cybersecurity risk and risk mitigation.
In a job site this is up to a regulatory body to define, utilizing the expertise of many, and to enforce this in job sites. On job sites workers will go through regular training and exercises that educate them about safety on their site. For software engineers there is no regulatory body that performs enforcement. And for the most part software engineers do not go through regular training that informs them of cybersecurity safety.
I’m not blaming the single person who made a mistake; I’m blaming the negligence of the companies that cut corners for profit, so most of them.
Your first comment read as if the organizations where this happens don’t face bad consequences. Your new comment explains what you meant better, and I agree.
People will ALWAYS fuck up. The world we craft for ourselves must take the “human factor” into account, otherwise we amplify the consequences of what are predictable outcomes.
So what does it say about us diverting from purely server-side scripted message boards with pure HTML and tables, and not a line of JS? Yes, let’s get back there please. And no phone numbers.
The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.
Boeing - we know where you’re goeing.
Damn near everything you interact with on a regular basis has been designed at some point in time with human psychology in mind. Built on the shoulders of decades of research and study results, that have matured to the point of becoming “standard practices”.
There’s one industry which kinda started like this, with proper HIG and standard key combinations and proven usability with screenreaders or by people with color blindness, autism, ADHD, whatever.
Then came people talking in a tone similar to, sorry, yours in the “People will ALWAYS fuck up” part, saying that people want nice, dynamic, usable websites with lots of cool new features; that people are social, they want girls with real photos, names, and phone numbers on their forums, which BTW should be called social nets.
By the way, we already had that with Flash and Java applets; some of the things I remember were still cooler than modern websites of the “web application” paradigm are now. And we had personal webpages with real names and contacts and photos. And there were tools allowing you to make them easily.
These people just hated the existing culture with its individualism and depth; the web applications should be able to own you and not be just another kind of embedded content; the personal webpages should all be the same; and of course normies wouldn’t want to come as guests into the nerdspace - no, they had those new social nets as their space, looking down on the nerds and freaks of my kind.
Now - well, try using today’s web as a person impaired in any way.
And those normies can’t really use it either, and they too feel impaired; they just won’t admit it.
So what does it say about us diverting from purely server-side scripted message boards with pure HTML and tables, and not a line of JS? Yes, let’s get back there please.
Ironically, proper SSR that has the server render the page as pure HTML & CSS is becoming more and more popular lately thanks to full-stack meta frameworks that make it super easy. Of course, wanting to go back to having no JS is crazy — websites would lose almost all ability to make pages interactive, and that would be a huge step backwards, no matter how much nostalgia you feel for a time before widespread JS. Also tables for layout fucking sucked in every possible way; for the dev, for the user, and for accessibility.
people want nice, dynamic, usable websites with lots of cool new features, people are social
That’s right, they do and they are.
By the way, we already had that with Flash and Java applets, some things of what I remember were still cooler than modern websites of the “web application” paradigm are now.
Flash and Java Applets were a disaster and a horrible attempt at interactivity, and everything we have today is miles ahead of them. I don’t even want to get into making arguments as to why because it’s so widely documented.
And we had personal webpages with real names and contacts and photos. And there were tools allowing to make them easily.
There are vastly more usable and simple tools for making your own personal websites today!
Ironically, proper SSR that has the server render the page as pure HTML & CSS is becoming more and more popular lately thanks to full-stack meta frameworks that make it super easy.
I know. Just the “full-stack meta frameworks” part alone makes any ADHD person feel nausea.
Of course, wanting to go back to having no JS is crazy — websites would lose almost all ability to make pages interactive, and that would be a huge step backwards, no matter how much nostalgia you feel for a time before widespread JS.
I disagree. Geminispace is very usable without scripts.
That’s right, they do and they are.
Well, then it appears they don’t care for what I need, so I don’t care for what they need. If only one paradigm must remain, then naturally I pick mine. If not, then there’s no problem and I still shouldn’t care.
And those industry rules I was answering about are about making a thing work for both, even if it ends up less functional.
Flash and Java Applets were a disaster and a horrible attempt at interactivity, and everything we have today is miles ahead of them. I don’t even want to get into making arguments as to why because it’s so widely documented.
Sorry, but either you still make an argument or this isn’t worth much.
For me it’s obvious that embeddable cross-platform applications as content inside hypertext are much better than turning a hypertext system into some overengineered crappy mess of a cross-platform application system.
The security issues with Flash and Java applets weren’t much different from those in the other parts of a web browser back then.
There are vastly more usable and simple tools for making your own personal websites today!
I ask you for links and how many clicks and fucks it would take to make one with these, as opposed to back then. These are measurable, scientific things. Ergonomics is not a religion.
I know. Just the “full-stack meta frameworks” part alone makes any ADHD person feel nausea.
But why? What’s bad about this?
I disagree. Geminispace is very usable without scripts
That’s great, I’m not saying that it’s impossible to make usable apps without JS. I’m saying that the capabilities of websites would be greatly reduced without JS being a thing. Sure, a forum can be served as fully static pages. But the web can support many more advanced use-cases than that.
If only one paradigm must remain, then naturally I pick mine. If not, then there’s no problem and I still shouldn’t care.
So you can see that other people have different needs to yours, but you think those shouldn’t be considered? We’re arguing about the internet. It’s a pretty diverse space.
For me it’s obvious that embeddable cross-platform applications as content inside hypertext are much better than turning a hypertext system into some overengineered crappy mess of a cross-platform application system.
Look, I’m not saying that the web is the most coherent platform to develop for or use, but it’s just where we’re at after decades of evolving needs needing to be met.
That said, embedded interactive content is absolutely not better than what we have now. For one, both Flash and Java Applets were mostly proprietary technologies, placing far too much trust in the corpos developing them. There were massive cross-platform compatibility problems, and neither were in any way designed for or even ready for a responsive web that displays well on different screen sizes. Accessibility was a big problem as well, given an entirely different accessibility paradigm was necessary within vs. the HTML+CSS shell around the embedded content.
Today, the web can do everything Flash + Java Applets could do and more, except in a way that’s not proprietary but based on shared standards, one that’s backwards-compatible, builds on top of foundational technologies like HTML rather than around, and can actually keep up with the plethora of different client devices we have today. And speaking of security — sure, maybe web browsers were pretty insecure back then generally, but I don’t see how you can argue that a system requiring third-party browser plug-ins that have to be updated separately from the browser can ever be a better basis for security than just relying entirely on the (open-source!) JS engine of the browser for all interactivity.
I ask you for links and how many clicks and fucks it would take to make one with these, as opposed to back then. These are measurable, scientific things. Ergonomics is not a religion.
The idea that any old website builder back in the day was more “ergonomic” while even approaching the result quality and capabilities of any no-code homepage builder solution you can use today is just laughable. Sorry, but I don’t really feel the burden of proof here. And I’m not even a fan of site builders, I would almost prefer building my own site, but I recognize that they’re the only (viable) solution for the majority of people just looking for a casual website.
Besides — there’s nothing really preventing those old-school solutions from working today. If they’re so much better than modern offerings, why didn’t they survive?
But why? What’s bad about this?
What I said, literally.
But the web can support many more advanced use-cases than that.
Which can be done with something embeddable, and not by breaking a hypertext system.
So you can see that other people have different needs to yours, but you think those shouldn’t be considered? We’re arguing about the internet. It’s a pretty diverse space.
If those people don’t consider mine, then I don’t consider theirs. If I must consider theirs, they must consider mine.
Look, I’m not saying that the web is the most coherent platform to develop for or use, but it’s just where we’re at after decades of evolving needs needing to be met.
That says nothing. It’s a market/evolution argument. Something changes tomorrow and that will be the result of evolution. Somebody uses a different system and that’s it for them.
That said, embedded interactive content is absolutely not better than what we have now. For one, both Flash and Java Applets were mostly proprietary technologies, placing far too much trust in the corpos developing them.
And today’s web browsers are as open as Microsoft’s OOXML. De facto proprietary.
There were massive cross-platform compatibility problems,
For Flash? Are you sure? I don’t remember such.
and neither were in any way designed for or even ready for a responsive web that displays well on different screen sizes.
Nothing was. Doesn’t tell us anything.
Accessibility was a big problem as well, given an entirely different accessibility paradigm was necessary within vs. the HTML+CSS shell around the embedded content.
Yes, but an applet’s problems there wouldn’t spread to the HTML page embedding it. Unlike now.
Today, the web can do everything Flash + Java Applets could do and more, except in a way that’s not proprietary but based on shared standards, one that’s backwards-compatible, builds on top of foundational technologies like HTML rather than around, and can actually keep up with the plethora of different client devices we have today.
I’ve already said how it’s similar to OOXML. Only MS documented their (then-proprietary) standard for their proprietary program and made it open, while Chromium is itself open, but somehow that doesn’t make things better.
And speaking of security — sure, maybe web browsers were pretty insecure back then generally, but I don’t see how you can argue that a system requiring third-party browser plug-ins that have to be updated separately from the browser can ever be a better basis for security than just relying entirely on the (open-source!) JS engine of the browser for all interactivity.
That’s similar to the Apple walled garden arguments. It’s valuable in areas other than security because of separating power between some browser developer and some plugin’s developer. And fighting monoculture is also good for security.
Also people still use plugins, still separately updated, which still get compromised.
Also plugins can be properly sandboxed.
The idea that any old website builder back in the day was more “ergonomic” while even approaching the result quality and capabilities of any no-code homepage builder solution you can use today is just laughable. Sorry, but I don’t really feel the burden of proof here. And I’m not even a fan of site builders, I would almost prefer building my own site, but I recognize that they’re the only (viable) solution for the majority of people just looking for a casual website.
Sorry, I still do feel that burden of proof. Because for a static site like in 2002 I’d just export a page from OpenOffice and edit some links, and then upload it.
I know. Just the “full-stack meta frameworks” part alone makes any ADHD person feel nausea.
??? Please don’t make weird blanket statements like this.
OK, makes me feel nausea. Even better.
One place I worked at recently was still using Node version 8. Running
npm install
would give me a mini heart attack… Like 400+ critical vulnerabilities; it was several thousand vulnerabilities all around.
Running
npm install
would give me a mini heart attack
It should; but more because it installs things right off the net with no validation. Consistency of code product is not the only thing you’re tossing.
How else would you get LPAD? Expect me to write 2 lines of code when I could just import a 100 MB library to do it for me?
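(For anyone who missed the left-pad saga, the 2 lines in question look roughly like this:)

```js
// A minimal left-pad: pad `str` on the left with `ch` until it is `len` chars long
const leftPad = (str, len, ch = " ") => String(str).padStart(len, ch);
console.log(leftPad(42, 5, "0")); // "00042"
```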
You need to get up to date from three years ago. NodeJS 16.20, or thereabouts, enabled dependency auditing by default.
I’m still fighting my engineers to get current enough to use this (but we do have a proxy artifact server that also attempts to keep downloads clean, and a dependency scanner)
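In the meantime, the audit can also be run by hand on any reasonably recent npm; a couple of standard invocations (these flags are long-standing npm features):

```sh
# Report known vulnerabilities across the installed dependency tree
npm audit

# For CI: exit non-zero only on high-severity (or worse) findings
npm audit --audit-level=high
```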
After the first 100, the other 300 kinda don’t matter.
If you’re on RHEL 8+, you can install the latest version of node with dnf.
dnf install nodejs
will likely install node 8 :(. Use
dnf module install nodejs:20
to install the latest version.
the real story is 100k sites not being run appropriately.
Same as it ever was. Same as it ever was. Same as it ever was.
Yeah, this is just capitalistic business in general. Don’t do anything proactive if it might reduce the bottom line in the short term. Blame others and beg for help when you weren’t proactive. Succeed singularly, fail collectively.
You just described my coworker…
Described every corporation ever incorporated.
This isn’t holding up, time isn’t after us.
100k sites ignored dependency risks
JS: typing systems are boring, warnings are boring, security is boring.
Sure, the package managers of other languages are super safe
You’re confused. It’s unrelated to package managers; it’s about basic security principles like this: https://developer.mozilla.org/en-US/docs/Web/Security/Subresource_Integrity. But JS devs don’t care.
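The MDN page boils down to one attribute: pin the hash of the exact file you audited. A minimal sketch (the domain and hash below are placeholders, not real values):

```html
<script
  src="https://third-party-cdn.example/lib.min.js"
  integrity="sha384-PLACEHOLDERPLACEHOLDERPLACEHOLDERPLACEHOLDER"
  crossorigin="anonymous"></script>
```

If the CDN ever serves different bytes, the browser refuses to execute them.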
Finding new ways webshits fuck up the most basic development principles boggles my mind. It’s like they intentionally stay ignorant.
We know, but we don’t have time to change. We have another site waiting to get slammed out as soon as the one we’re working on (which was underfunded, with a ridiculous timeline) goes live.
There’s still a fair bit of “my nephew makes websites, it can’t be that [hard, expensive, time consuming], oh and by the way, we need a way to edit every word and image on the site that both our intern and barely literate CEO can understand, even though we’re literally never going to edit anything ever.”
They’re widely variable. PyPI gets into about as much trouble as npm, but I haven’t heard of a successful attack on CPAN in years (although that may be because no one cares about Perl anymore).
But now where am I going to get my “china malware” headline!?!
I don’t think we have to choose. “Maintain your websites so you don’t get taken advantage of” and “Here’s an example of a major-world-power-affiliated group exploiting that thing you didn’t do” are both pretty important stories.
I mean, both are true? It’s not a manipulative headline in my opinion.
The malware thing still deserves a headline. They just argue it’s stupid that so many sites even have to use the library to begin with.
Have to use? No one has to use any library. It’s convenience, and in this case it’s literally so they don’t have to write code for older browser versions.
The issue here isn’t that anyone has to use it, it’s the way it was used that is the problem. Directly linking to the current version of the code hosted by a third party instead of hosting a copy yourself.
Probably at your local asian gay bar.
This is why ublock origin is an essential security tool.
What rules can we add that solve this problem? (I’ve tried DDG but didn’t find any results)
This one is already in the default “uBlock filters – Badware risks” list.
I also strongly suggest adding https://big.oisd.nl/ as a filter list. It’s a large and well maintained domain blocklist (sourced from combining lots of other blocklists) that usually adds lots of these sorts of domains quickly and has very few false positives.
If you want to take it even further, check out the Pro list and Threat Intelligence Feeds list here: https://github.com/hagezi/dns-blocklists
These can all be added to a pihole too if you use one.
Looks like I’ll need to set up pihole then. Thanks for the info!
cdn.polyfill.io^ ? By now it was probably already added to the default lists tho…
(I don’t really understand these things tho, so correct me if I’m wrong)
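You’re close. In uBlock Origin’s static filter syntax, blocking a domain and all its subdomains looks like this; it can go under “My filters” if the default lists ever lag:

```
! Block polyfill.io and every subdomain, including cdn.polyfill.io
||polyfill.io^
```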
My favourite part is that the developers that currently own it said:
Someone has maliciously defamed us. We have no supply chain risks because all content is statically cached
https://github.com/polyfillpolyfill/polyfill-service/issues/2890#issuecomment-2191461961
Completely missing the point that they are the supply chain risk, and the fact that malicious code was already detected in their system (to the point where Google started blocking ads for sites that loaded polyfill .io scripts).
We don’t even know who they are - the repo is owned by an anonymous account called “polyfillpolyfill”, and that comment comes from another anonymous account “polyfillcust”.
Pollyfilladmin
Pollyfillpassword
Pollyfillnoteventryn
I’m not into JS stuff, but when I read that google is blocking ads, shit must be flowing in gargantuan amounts…
Reposting my comment from GitHub:
A good reminder to be extremely careful loading scripts from a third-party CDN unless you trust the owner 100% (and even then, ownership can change over time, as shown here). You’re essentially giving the maintainer of that CDN full control of your site. Ideally, never do it, as it’s just begging for a supply chain attack. If you need polyfills for older browsers, host the JS yourself. :)
If you really must load scripts from a third-party, use subresource integrity so that the browser refuses to load it if the hash changes. A broken site is better than a hacked one.
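Generating the integrity value for a locally audited copy is a one-liner; this mirrors the command MDN documents (prepend sha384- to the output when writing the integrity attribute):

```sh
# Compute the base64-encoded sha384 digest for an SRI integrity attribute
openssl dgst -sha384 -binary polyfill.min.js | openssl base64 -A
```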
And on the value of dynamic polyfills (which is what this service provides):
Often it’s sufficient to just have two variants of your JS bundles, for example “very old browsers” (all the polyfills required by the oldest browser versions your product supports) and “somewhat new browsers” (just polyfills required for browsers released in the last year or so), which you can do with browserslist and caniuse-lite data.
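A sketch of what that split can look like in a .browserslistrc, using browserslist’s named environments (the environment names and queries here are just examples, not a recommendation):

```
# .browserslistrc - select an environment at build time with BROWSERSLIST_ENV
[modern]
last 2 versions and not dead

[legacy]
> 0.2%
ie 11
```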
Yeah, I used to be guilty of this. Although in slight defense of myself, I never used random sites like that; I always pulled everything from the Google CDN, since I can’t see that changing hands.
They may very well shut it down without warning, but they’re probably not going to sell it to anyone.
Yeah, it really depends on how much you trust the vendor.
Google? Say what you want about the company, but they’ll never intentionally serve malware.
Random company with no track record where we don’t even know who is maintaining the code? Much less trustworthy. The polyfill . io repo is currently owned by a GitHub user called “polyfillpolyfill” with no identifying information.
Third-party CDNs make less sense these days though. A lot of hosting services have a CDN of some sort. Most sites have some sort of build process, and you usually bundle all your JS and CSS (both your code and third-party code, often as separate bundles) as part of that.
Regular code review for security should be SOP
You’d be surprised how much code people blindly reuse without even looking at it, especially in JavaScript. A bunch of it is from projects owned by random individuals. The JS standard library is ridiculously small, so nearly all JS apps import third-party code of some sort. One JS framework can pull in hundreds of third-party modules.
It’s much less of an issue with languages like C# and even PHP, where the first-party libraries are often sufficient for building a small or mid-sized app.
JS and Python are both extremely bad for this. I’ve been working with data scientists and it’s hell trying to tell them that no, they can’t just install whatever libraries they want
Man, the Chinese are becoming a major new nuisance on the internet.
Is there a way to ground them for a month while they think about what they did?
Take away his honey
No more honey on school nights
Edit: could the US/NATO ground China in some debilitating/deterrent way? Like geopolitically/economically spank them and send them to bed without dinner until they come to Jesus?
All Chinese businesses are owned by the CCP, except the ones that get caught being naughty. Suddenly those are a private business with no ties to the party.
All it would really take is internet providers black-holing the Chinese AS numbers in their BGP configs. Then boom, China basically can’t talk to the rest of the world.
This should be done with the new axis of evil and let them see how much they truly hate and “need” the destruction of the decadent West. It’s insane their shenanigans are still being tolerated at all; cut ’em off and let them build their own self-sustaining economies, and force the West to eliminate its dependence on mercurial and malicious actors on the world stage.
Let’s not make the splinternet a reality, pretty please.
Chinese scaling and manufacturing, Russian IT expertise, Iranian experience of sanctions evasion and North Korean hacking and remote operations mastery are not the combo you want to bet against.
They would absolutely build the self-sustaining economy and rival networks, but in the process it would destroy the Internet as we know it, and break communication channels that are vital for democracy and international peace, while also breaking communications between relatives and friends on the two sides.
China is already trying quite hard with its Great Firewall. We don’t need to make their job easier for them.
Unrelated, but the last time they came to Jesus in China, 20 million people died.
they came to Jesus
I don’t know how viable that would be on a large scale, but they could just ban all China-based companies from operating outside China, like the US did with TikTok.
I think that would deal a decent blow to their economy, but I’m far from an expert in those fields, so someone who knows better will probably come and debunk me.
Like, I just don’t get how anybody benefits from this whole axis-of-evil bullshit. They like Western shit, they want it, and they want to make it all; what the fuck is their problem? The world is all mapped out now, so anybody trying to pull bullshit should be sanctioned to shit and excluded (Olympics and everything else included), and the leaders should all get investigated by the ICC until they fuck off.
It’s the same problem as always: the assholes overstep and expect to be protected but not bound, while everyone else is bound but not protected. It is an exceedingly conservative (cynical) worldview.
Whichever editor let them post “100 thousand” should be spanked one 100 times with the severed hand of whatever asshole wrote it in the first place.
spanked one 100 times
One 100 times or 1 time, but 100 times?
100x one 100 thousand times
Or one,100? 1,hundred? 1’100? Eleventy 100?
Yes
1 hundred times!
1000 hundred?
What’s the malware do?
Frustrating that the article doesn’t specify and simply links to a different Github page which doesn’t clearly specify the problem either.
I have to assume the site’s article was dynamically generated, without any actual tech journalist doing the reporting. The byline is “Sansec Forensics Team”, which doesn’t even link out to the group. Also, the “Chinese company” isn’t named either in the article or the references, which is incredibly shoddy reporting. The archive link is dead.
This whole page is indicative of the failed state of tech journalism. A genuinely explosive story, but it’s so threadbare and vague that it becomes meaningless.
The site is Sansec. They uncovered it. They also specify how the malware redirects users to sports betting sites.
Makes you hungry an hour later.
Ah frick
Selective advertising redirects
Of course it does.
Wumao trolls incoming in 3…2…
They must earn their 50 cents …
Pretty well paid then
that’s not very nice to call javascript malware. i know it’s bad but still.
That GitHub “archive here” link leads to a page where it hasn’t been archived… (or was the archive removed??).
Looks like someone tried to archive an archived page. You can see
https://web.archive.org/...
is listed twice in the URL. I just trimmed off the first one and then it works: https://web.archive.org/web/20240229113710/https://github.com/polyfillpolyfill/polyfill-service/issues/2834
Sorry, it’s corrected now (and thanks @DocMcStuffin)
Again?
This is probably connected to China cloning the entire GitHub website to their own servers.
I’d be surprised if it’s directly linked
Being linked is how the malware works
🥁 ba dum tish 🥁
poly fill indeed
deleted by creator
Noscript would fix this issue… Deny most of that shit and internet still works… Mostly
Not a solution. Much of the modern web is reliant on JavaScript to function.
Noscript made sense when the web was pages with superfluous scripts that enhanced what was already there.
Much of the modern web is web apps that fundamentally break without JS. And picking and choosing unfortunately won’t generally protect from this because it’s common practice to use a bundler such as webpack to keep your page weight down. This will have been pulled in as a dependency in many projects and the site either works or does not based on the presence of the bundle.
Not saying this is a great situation or anything, but suggesting noscript as a solution is increasingly anachronistic.
This will have been pulled in as a dependency in many projects and the site either works or does not based on the presence of the bundle.
This wasn’t bundled. People inserted a script tag pointing to a third-party CDN onto their sites. The output changes depending on the browser (it only loads the polyfills needed for the current browser) so you can’t even use a subresource integrity hash.
Much of the modern web is reliant on JavaScript to function.
“function” is doing a lot of lifting there. Trackers, ads, and assorted other bullshit is not the kind of functioning anyone needs.
It’s true the average user gets flummoxed quickly when the scripts are blocked, but they can either sink (eat ads and trackers) or swim (learn what scripts to allow). (Spoiler: they almost always sink)
Not a solution. Much of the modern web is reliant on JavaScript to function.
And much of it works better and faster without JavaScript. Some sites don’t work in Noscript, but most sites run faster and work well enough.
I only allow JS on a whitelist.
A whitelist wouldn’t mitigate this issue entirely due to bundling
In this case the script wasn’t bundled at all - it was hotlinked from a third party CDN. Adding malicious code instantly affects all the sites that load it.
The output differs depending on browser (it only loads the polyfills your browser needs) so it’s incompatible with subresource integrity.
Imo, computing, like all other things, requires a little trust and risk. The problem is most people are wayyy too trusting in general.
deleted by creator
Flash was magnitudes worse than the risk of JS today, it’s not even close.
Accessibility is orthogonal to JavaScript if the site is being built to modern standards.
Unfortunately, preference is not reality: the modern web uses JavaScript, so Noscript is not an effective enough solution.
deleted by creator
Well, by that measure, you don’t need JavaScript to make inaccessible sites, there are plenty of sites out there that ruin accessibility with just HTML and CSS alone.
It’s always up to the developer to make sure the site is accessible. At least now it seems to be something that increasingly matters to search result rankings
deleted by creator
deleted by creator
Flash ran as a browser plugin (as in, not an extension, but a native binary installed into the OS that runs beside the browser; we basically don’t do this for anything now).
Flash was pretty much on weekly security bulletins in the final years, arbitrary code execution and privilege escalation exploits were common, that’s why Adobe killed it.
Flash was never safe and comparing JavaScript to it as a greater risk shows you’ve not fully understood the threat model of at least one of the two.
deleted by creator
That’s literally the one main somewhat valid use case for plugins, and it’s basically because of DRM. A plugin that allows arbitrary code to run is a security nightmare, that’s why we don’t do it anymore.
A lot of the security features you describe were added by browser vendors late in the game because of how much of a security nightmare flash was. I was building web software back when this was all happening, I know first hand. People actually got pissy when browsers blocked the ability for flash to run without consent and access things like the clipboard. I even seem to remember a hacky way of getting at the filesystem in flash via using the file upload mechanism, but I can’t remember the specifics as this was obviously getting close to two decades ago now.
Your legitimate concerns about JavaScript are blockable by the browser.
Flash was a big component of something called the evercookie—one of the things that led to stuff like GDPR because of how permanently trackable it made people. Modern JavaScript tracking is (quite rightfully) incredibly limited compared to what was possible with flash around. You could track users between browsers FFS.
You’re starting to look like you don’t know what you’re talking about here.
100% agree. A super-fast text only internet layer is approved.
and internet still works… Mostly
That load-bearing “mostly” is doing a lot of work here.
I invite everybody to find out how everything “mostly” works if you disable “most of” javascript – also have fun deciding which parts to enable because you think they’re trustworthy
I actively do this with uMatrix - granted, I only block non-first-party JavaScript. Most sites I visit only require a few domains to be enabled to function. The ones that don’t are mostly ad-riddled news sites.
There are a few exceptions to this - AWS and Atlassian come to mind - but the majority of what I see on the internet does actually work more or less fine when you block non-first-party JavaScript and some even when you do that. uMatrix also has handy bundles built-in for certain things like sites that embed YouTube, for example, that make this much easier.
Blocking non-first-party like I do does actually solve this issue for the most part, since, according to the article, only bundles that come from the cdn.polyfill.io domain itself that were the problem.
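For reference, my understanding is that the uMatrix ruleset for that policy is only a couple of lines; syntax quoted from memory, so double-check it against the uMatrix wiki before relying on it:

```
* * script block
* 1st-party script allow
```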
You’re still trusting that the 1st party javascript won’t be vulnerable to supply chain attacks, though
In my experience, first-party JavaScript is more likely to be updated so rarely that bugs and exploits are more likely than supply chain attacks. If I heard about NPM getting attacked as often as I hear about CDNs getting attacked, I’d be more concerned.
Funny that they want you to allow all JavaScript but then criticise first-party scripts for being unsafe.
I bet [insert random autocrat here] would approve of that message.
deleted by creator
I’ve been using noscript for years.
Yeah, it took me about that long to get my regular websites working right too. And then I had to reinstall for unrelated reasons and all that customisation was gone.
While you can back it up, at least once you’ve suffered the loss multiple times you can get it 90% back on first re-visit after reinstall.
deleted by creator
I invite everybody to find out how everything “mostly” works if you disable “most of” javascript – also have fun deciding which parts to enable because you think they’re trustworthy
Having done this for many many years, I can tell you: if you allow the site scripts (which is an acknowledgement of js at least), and a few “big” ones like ajax.google.com, jquery.com, and ytimg.com, etc., you then find a smaller subset of annoying-but-necessary-for-individual-websites that you can enable as needed or just add them as trusted if you’re into that kind of thing.
After that you have the utter garbage sites with 30 scripts of tracking data-sucking bullshit (CNN, looking at you) and for those sites I have said “Thou shalt bite my shiny metal ass” and i just don’t go there.
It’s a concession to js, yes, but it’s also not free rein to trample all over the surfing experience. Totally worth the time to work out.