i absolutely hate how the modern web just fails to load if one has javascript turned off. i, as a user, should be able to switch off javascript and have the site work exactly as it does with javascript turned on. it’s not a hard concept, people.
but you ask candidates to explain “graceful degradation” and they’ll sit and look at you with a blank stare.
If it’s a standard webpage that only displays some static content, then sure.
But everything that needs to be interactive (and I’m talking about actual interactivity here, not just navigation) requires JavaScript, and it’s really not worth the effort of implementing fallbacks for everything just so you can tell the two users who actually get to appreciate that effort that the site still won’t work, because the actual functionality requires JavaScript.
It all comes down to what the customer is ready to pay for, and usually they’re not ready to pay for anything besides core functionality. Heck, I’m having a hard enough time getting budget for all the legally required accessibility work. And sure, some of that no-script stuff pays into that as well, but by far not everything.
Stuff like file uploads, validated forms and drag and drop are just not worth the effort of providing them without JS.
file uploads and forms are the easiest to do server side
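a no-JS upload is just a plain form post. minimal sketch; the `/upload` endpoint and field name are made up:

```html
<!-- Plain HTML file upload: the browser handles encoding and sending.
     "/upload" and "document" are hypothetical names for illustration. -->
<form action="/upload" method="post" enctype="multipart/form-data">
  <input type="file" name="document" required>
  <button type="submit">Upload</button>
</form>
```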
Not if you want them to be at least halfway user friendly. Form validation is terrible when done completely server side, and several input elements like multiselect dropdowns, comboboxes and search fields won’t work at all unless supported by client-side JavaScript. And have you ever tried to do file previews and upload progress bars purely server-side?
So I guess by file upload you mean “drop file here and wait an uncertain amount of time for the server to handle the file without any feedback whatsoever”, and by forms you mean “enter your data here, then click submit, and if we feel charitable we may reward you with a long list of the errors you made, some of which could have been avoided if you’d known about them while filling in earlier fields”.
It depends on the type of input validation you’re doing, a bunch of it is built into the browser and you don’t need JS for it.
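For example, the browser enforces all of this on its own; a minimal sketch, where the endpoint and field names are just illustrative:

```html
<!-- Built-in HTML validation: the browser blocks submission and shows
     its own error messages, no JS involved. Names are illustrative. -->
<form action="/signup" method="post">
  <input type="email" name="email" required>
  <input type="text" name="username" required minlength="3" maxlength="20">
  <input type="number" name="age" min="18" max="120">
  <input type="text" name="zip" pattern="[0-9]{5}" title="Five digits">
  <button type="submit">Sign up</button>
</form>
```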
So, the situation is understood, but the question arises: what does any of this have in common with a global hypertext system for communication?
Maybe all this functionality should be moved into a kind of plugin, similarly to how it was done with Flash and Java applets and other ancient history. Maybe sandboxed, yes.
Maybe the parts of that kind of plugin relating to the DOM, to execution, to interfaces should be standardized.
Maybe such a page should look more like a LabVIEW control model or like a HyperCard application than what there is now.
One huge benefit would be that Google goes out of business.
It depends on what you are doing
The business customer or the visitor?
The visitor doesn’t exactly have a way to give feedback on whether they’d use a static page.
> Stuff like file uploads, validated forms and drag and drop are just not worth the effort of providing them without JS.
Honestly, many of today’s frameworks allow you to compile the same thing for the Web, for Java on Android, for Java on the main desktop OSes and whatever else.
Maybe if it can’t work like a hypertext page, it shouldn’t be one.
The business customer who actually pays for the development.
Maybe if you can’t use the web without disabling JS, you shouldn’t?
Progressive Web Apps are the best tool for many jobs right now because they run just about everywhere, and as opposed to every single other technology we’ve had up until now, they have the potential to not look like complete shit!
And the whole cross-compilation that a lot of these frameworks promise is a complete pipe dream. It works only for the most basic of use cases. PWAs are the first and so far only technology I’ve used that doesn’t come with a ton of extra effort for each supported platform down the line.
> The business customer who actually pays for the development.
Then it’s my duty as a responsible customer to not make it profitable for them, as much as I can.
> Maybe if you can’t use the web without disabling JS, you shouldn’t?
Suppose I can use the Web with JS disabled. Just that page won’t be part of my Web.
Yes, of course when the optimization work has been done for you, it’s the easiest.
It’s an old discussion about monopolies, monocultures, standards, anti-monopoly regulations, where implicit consent is a thing and where it isn’t, and how to keep a free market stable.
I built an internal tool that works with or without JS turned on, but web devs want something that’s simple for them, with a framework, which is why you have to download 100 MB just for a basic form page.
You’re correct, and I’m going to explain how this happens. I’m not justifying that it happens, just explaining it.
It isn’t that no one knows what graceful degradation is anymore. It’s that they don’t try to serve every browser that’s existed since the beginning of time.
When you develop software, you have to make some choices about what clients you’re going to support, because you then need to test for all those clients to ensure you haven’t broken their experience.
With ever-increasing demands for more and more software delivery to drive ever greater business results, developers want to serve as few clients as possible. And they know exactly what clients their audience use - this is easy to see and log.
This leads to conversations like: “Can we drop browser version X? It represents 0.4% of our audience but takes the same 10% of our testing effort as the top browser.”
And of course the business heads making the demands on their time say yes, because they don’t want to slow down new projects by 10% over 0.4% of TAM. The developers are happy because it’s less work for them and fewer bizarre bugs to deal with from antiquated software.
Not one person in this picture will fight for your right to turn off JavaScript just because you have some philosophy against it. It’s really no longer the “scripting language for animations and interactivity” on top of HTML like it used to be. It’s the entire application now. 🤷‍♂️
If it helps you to blame the greedy corporate masters who want to squeeze more productivity out of their engineering group, then think that. It’s true. But it’s also true that engineers don’t want to work with yesteryear’s tech or obscure client cases, because that experience isn’t valuable for their career.
This has to be fixed though. I don’t know how, but it’s an economic situation causing enormous damage every moment.
And most of the people it affects are, like me, in countries where real political activism is impossible.
This is the next thing that should be somehow resolved like child labor, 8-hour workdays, women’s voting rights and lead paint. Interoperability and non-adversarial standards of the global network.
> enormous
It isn’t though. That’s the exact point. It’s a moderate effort that would prevent infinitesimal damage. That’s just not good math. People have to prioritize their time. If you have a numbers case to make for why the damage is so enormous, make it. That’s what it will take to be convincing: numbers.
It is. It’s like the medieval Sound Toll: you can’t measure it well, because there are no trade routes between the Baltic and the North Sea other than the Sound; the Kiel Canal is not yet a thing.
What should be fixed is people. The logic described above is true, it really does happen, and behind it is the idiot desire to get more money. Not to make a better thing, not to make someone’s life better, not to build something worthwhile; in other words, nothing that could get me out of bed in the morning. When that’s the kind of desire fueling most companies and societies, all things will keep going in all kinds of wrong ways.
That can’t be fixed. We can’t wait for a different kind of human (what if it’ll be an artificial psychopath anyway) to fix our current thing.
So hard-to-disrupt means of organizing (for associations, unions and such, unofficial ones) and of building electoral systems (even for Internet communities, why not) are needed; social media gave people a taste of that to lure them in before subverting it all, but the idea is good.
Some sort of a global system. When it’s in place, improvement around will follow.
It can be fixed: we can choose to produce fewer idiots and more caring people. You are right, of course, that it is not the only thing we should be doing.
> the idiot desire to get more money
Yes, but we don’t have to make a total caricature out of it. We all need to prioritize our time. That isn’t evil, or broken, or wrong. That’s just life.
Expand on this, please. I am sure I did not get you.
Developers having a narrower list of browsers to support is not ONLY about greed. You say it is NOT about making something that works to improve people’s lives. And I disagree with that.
You can’t build a good piece of software and try to support every client under the sun since the beginning of time. There is a reasonable point at which to draw some lines and prioritize.
So while greed is ONE factor, you seem to be saying it’s the only factor, and that people are stupid and broken for doing this. That’s going too far.
It’s unrealistic to expect perfection. Today people want comprehensive client support. Tomorrow they will be outraged at some bug. But few realize: you may have to pick between the two. Because having zero bugs is a lot more achievable if you can focus on a small list of current browser clients. That’s just a fact. The next day they will be upset that there are ads in the site, but it may be ad revenue that pays for developers to fix all the bugs for all browser clients under the sun.
People love to rant online about how NO you should give me EVERYTHING and do it for FREE but this is childish tantruming and has no relationship to reality. Devs are not an endless resource that just gives and gives forever. They are regular people who need to go home at night like anyone else.
I am saying it is about greed because it actually is: I have yet to see a situation where the ultimate filter for supporting or dropping a client is NOT revenue from people using that client. And here I am talking specifically about companies making money on their product, so no open source, subsidised, hobby projects, etc.
> People love to rant online
They do; now try to catch me, specifically, doing it.
> i, as a user, should be able to switch off javascript and have the site work exactly as it does with javascript turned on
Not agreeing or disagreeing, but why?
Because big tech has ruined the internet and uses JavaScript, among other things, to track you. Some people blame plastic pollution on improper recycling, when we know that’s exactly what the evil plastic industry wants us to do: blame the consumer. Similarly, people think it’s their responsibility to turn off JavaScript when they should be blaming big tech. Even if you get rid of JavaScript, they will just find other creative ways to track you (source). We have to fix the structures running the tech industry.
I would word it as: I should not have to allow strangers to execute arbitrary code on my PC just so I can view some text and/or images.
JavaScript is directly related to almost everything that makes browser tabs take up more RAM than a typical PC in 1998. There are ways to use it in targeted ways that improve responsiveness (objectively or subjectively). The web as it stands is so far beyond that justification that it’s almost laughable to even bring it up.
I run a personal blog with zero JavaScript; just HTML, CSS, and some pictures. Firefox’s memory snapshot says it uses <3MB on the homepage. Amazon’s homepage is currently giving me 38MB, and this comment section with the Alexandrite frontend is giving me 30MB. Those two may even be at the low end of what’s out there.
> I run a personal blog with zero JavaScript; just HTML, CSS, and some pictures. Firefox’s memory snapshot says it uses <3MB on the homepage. Amazon’s homepage is currently giving me 38MB, and this comment section with the Alexandrite frontend is giving me 30MB. Those two may even be at the low end of what’s out there.
Then you have Outlook and Google Docs, which use half a gigabyte of memory each.
Microsoft Teams
Oh yeah. There’s no doubt that modern web tech stacks are inefficient slop - patchwork built upon patchwork.
However, JS has been included in every major browser for well over a decade. It’s the industry standard at this point, so I find the position of expecting commercial services to be backwards compatible with a 1998 browser setup a little odd.
What do you think about WebGL apps?
I don’t have a fundamental problem with web apps having access to GPU resources. There are obviously games that can benefit from that. Engines like Godot and Unreal can directly use a web stack as a build target. It makes sense there.
In general, I don’t have a fundamental problem with any of this being there provided the attack surface area can be managed. Which it isn’t, but that’s another discussion.
I have a problem with the tools being applied indiscriminately. I’d almost say that every site should start vanilla, and you’d have to specifically justify any use of JavaScript.
Low-key I’m disagreeing
I’ve spent the last year building a Lemmy and PieFed client that requires JavaScript. This dependency on JavaScript allows me to ship you 100% static files, which, after being fully downloaded, have zero dependency on a web server. Without JavaScript, my cost of running web servers would be higher, and if I stopped paying for those servers, the client would stop working immediately. Instead, I chose to depend heavily on JavaScript, which allows me to ship a client that you can fully download, if you choose, and run on your own computer.
As far as privacy, when you download my Threadiverse client* and inspect network requests, you will see that most of the network requests it makes are to the Lemmy/PieFed server you select. The two exceptions: any images that aren’t proxied via Lemmy/PieFed, and a list of the latest Lemmy servers that I download when you log in. If I relied on a web server for rendering instead of JavaScript, many more requests would be made, with more opportunities to expose your IP address.
I truly don’t understand where all this hate for JavaScript comes from. Late-stage capitalism, AI, and SaaS are ruining the internet, not JavaScript. Channel your hate at big tech.
*I deliver both web and downloadable versions of my client. The benefits I mentioned require the downloaded version. But JavaScript allows me to share almost 100% code between the web and downloaded versions. In the future, better PWA support will allow me to leverage some of these benefits on web.
Problem is so many websites are slow for no good reason.
And JS is being used to steal our info and push aggressive advertising.
Which part is unknown to you?
> Problem is so many websites are slow for no good reason.
Bad coding is a part of it. “It works on my system, where the server is local and I’m opening the page on my overclocked gamer system.” Bad frameworks are also a part of it. React, for example, decided that running code is free, and bloated their otherwise very nice system to hell. It’s mildly infuriating moving from a fast, working solution to something that decided to implement basic language features as a subset of the language itself.
Trackers, ads, dozens (if not hundreds) of external resources are also a big part of it. Running decent request-blocking extensions (stuff like uBlock Origin) adds a lot of work to loading a page, and still makes pages seem more responsive because of the sheer number of blocked resources. It’s night and day.
Problem is so many trains are ugly for no good reason.
And steel is being used to shoot people and stab people aggressively.
I don’t understand why we are blaming the stealing info part on JavaScript and not the tech industry. Here is an article on how you can be tracked (fingerprinted) even with JavaScript disabled. As for slow websites, also blame the tech industry for prioritizing their bottom line over UX and not investing in good engineering.
The problem is not JavaScript per se but the use companies and new developers make of it; if everyone used it like you do, there would probably be no problem. A gazillion dependencies and zero optimization, eating up CPU, spying on us, advertisements…
And if you try and use an alternative browser you know many websites won’t work.
Graceful degradation - pfft.
Progressive enhancement - yeah!
this is the way
Graceful degradation is for people that are angry about the future. Progressive enhancement is for people that respect the past. And it’s stupid to not hire someone only because they don’t know a term that you know.
Are you lost? I didn’t talk about hiring practices.
It’s when you call someone a pathetic bottom bitch while wearing an evening gown.
Fucking Reddit and their shite navigation controller that shits the bed when you zoom in.
They also continually forget that you can’t do frontend-only validation for everything.
Yeah, and it should also work without a browser exactly as it does with a browser
I thought graceful degradation in web design was mostly about using the latest browser features while allowing a fallback to the feature set of, say, one or two previous browser versions. Not about supporting a user completely turning off a feature that has been around for literal decades? I think what you’re promoting is the “opposite” side, progressive enhancement, where the website should mostly work through the most basic, initial features and then have advanced features added later for supported browsers.
Not OP, but welcome to my TED talk.
Supporting disabled JavaScript is a pretty significant accessibility need. None of the text browsers supported JavaScript until 2017, and there’s still a lot of old tech out there that doesn’t deal well with it.
It wasn’t until the rise of React and Angular that this became a big deal. But it’s extremely common now to send most of the website as code. And even scrapers now support JavaScript.
There’s no “minor point” clause on the term graceful degradation. At the same time, there’s no minimum requirement. Would it be good to be thorough and provide a static page? I’d say yes, but it’s not like anyone is going to do that anymore.
The tables have turned: you can no longer live without JavaScript, and now you need browsers that lie about your screen resolution, user agent and plugins, because megacorps can sniff out who you are from the slightest whiff of your configs.
And that’s NOT pretty cool
Thanks for the response, good points all around. The fingerprinting is the most convincing argument to me but I think the accessibility issue you bring up is more important.
i run with scripts disabled by default. it gets annoying at times, but most sites and pages i go to work fine. a few are true ‘apps’ and are whitelisted. random sites that don’t work i just search for an alternative source if i really want to read it. i have separate browser installs with fewer restrictions that i use specifically for certain things (like webmail or the little online shopping i do).
the few web sites that i am responsible for… all work without scripts. many of the visitors i care about have shitty internet, so i don’t want massive js or css bundles in there or tons of unoptimized graphics or media.
I wrote my CV site in React and Next.js configured for SSG (Static Site Generation) which means that the whole site loads perfectly without JavaScript, but if you do have JS enabled you’ll get a theme switching and print button.
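The configuration for that is tiny; roughly this, as a sketch, assuming a recent Next.js (13+) where static export is a config option (older versions used the `next export` command instead):

```js
// next.config.js: minimal sketch of a static-export setup.
// Every page is pre-rendered to plain HTML at build time, so content
// shows up without JS; scripts only add extras like theme switching.
module.exports = {
  output: 'export',
};
```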
That said, requiring JS makes sense on some sites, namely those that act more like web apps that let you do stuff (like WhatsApp or Photopea). Not for articles, blogs etc. though.
> requiring JS makes sense on some sites, namely those that act more like web apps that let you do stuff (like WhatsApp
I mean yes, but WhatsApp is a bad example. It could easily use no JavaScript. In the end it’s the same as Lemmy or any other forum. You could post a message and get a new page with the message. Switching chats is loading a new page. Of course JavaScript enhances the experience, makes it more fluid, etc., but messengers could work perfectly fine without JavaScript.
Maybe I’m out of the loop because I do mostly backend, but how do you update the chat window when new chats come in, without JavaScript?
You don’t; I’m saying it would still mostly work. Getting messages as they arrive is nice but not necessary. For example, I personally have all notifications off, and I only see messages when I specifically look for them; no one can reach me instantly. Everyone seems to be missing that we’re talking about degradation here: it degrades, it gets worse with JS disabled. But it shouldn’t straight up not work.
A good example of something that does not work without JS would have been a drawing application like they said, or games; there are plenty of things that literally do not work without JS, but messaging is not one of them. Instant messaging would be, of course.
I also feel like everyone seems to be missing that we’re talking about degradation, which isn’t usually “no JS at all”; it’s some subset that isn’t supported. People use feature detection to find out if some feature is supported in the browser, and if it’s not, they don’t enable the functionality that depends on it.
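As a minimal sketch (the class names here are just illustrative):

```js
// Feature detection, no library: only wire up the enhancement if the
// browser supports the API it depends on. Without it, nothing breaks;
// the feature that depends on it simply stays off.
if ('IntersectionObserver' in window) {
  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        entry.target.classList.add('visible'); // e.g. triggers a CSS fade-in
      }
    }
  });
  document.querySelectorAll('.fade-in').forEach((el) => observer.observe(el));
}
// no else branch needed: elements are simply shown statically otherwise
```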
For the chat example, you could argue that a chat can degrade into a bulletin board, but I’d argue that people use chat for realtime messaging, so JS is needed for the base use case.
If your webpage primarily just displays static information, then I agree that it should work without js or css. Like Wikipedia, or a blog, or news, or a product marketing page, or a forum/BBS.
But there is a huge part of the web that this simply doesn’t apply to, and it’s not realistic to have them put in huge effort to support what can only be a broken experience for a fraction of a percent of users.
Did you just propose degrading instant messengers back into email? 😂
How exactly do you propose people actually chat with such a system? Continuously hammering F5 while being actively engaged with another person? 😂
How would you solve end-to-end encryption without JavaScript?
How would a page fetch new messages for you without JS?
You don’t. That’s the graceful degradation part. You can still read your chat history and send new messages, but receiving messages as they come in requires a page reload or enabling JS.
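A sketch of what the degraded page could look like (the endpoint and markup are made up):

```html
<!-- Degraded chat, no JS: history is rendered into the page by the
     server, sending is a plain form post, and a meta refresh in the
     <head> re-fetches the page every 30s. "/chat/send" is made up. -->
<meta http-equiv="refresh" content="30">
<ul>
  <li><b>alice:</b> the server renders the history right here</li>
</ul>
<form action="/chat/send" method="post">
  <input type="text" name="message" required autocomplete="off">
  <button type="submit">Send</button>
</form>
```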
my only issue with this ideology (the required page reload) is that this setup would essentially require a whole new processing system. Instead of updates being sent via events, they would need to be rendered and sent server-side. This also forces the server to load everything at once instead of dynamically, like it currently does, which increases strain/load on the server node serving the page. It also removes the potential for service isolation between the parts of the page, meaning if one component goes down (such as chat history), the entire page handler goes down, while also hurting page response and load times. That’s the downside of those old legacy-style pages: they are a pain in the ass to maintain, run slower and don’t have much failover ability.
It’s basically asking the provider to spend more in order to make the service slower, remove features from the site (both information- and functionality-wise) and have a more complex setup when scaling, all to increase compatibility for a minor portion of the current machines and users out there.
this is of course also ignoring the increased request load, as you are now having to resend entire webpages to get data instead of just the messages/updates.
The web interface can already be reloaded at any time and has to handle all of this. You seem to be missing that we’re talking about degradation here; remember the definition of the word: it means it isn’t as good as when JS is enabled. The point is it should still work somehow.
Just to make sure we are on the same page then, cause I don’t see the issue with my post.
I am using the term “Graceful Degradation”, which refers to fault tolerance in tech stacks: allowing a critical component to be removed.
This critical component people are talking about is JavaScript, which is used for all dynamically loaded content and for failover protection, so that one service going down doesn’t take the entire page down (also an example of fault tolerance).
The proposed solution would remove that fault tolerance for the reasons I gave in my original reply, while degrading the user’s experience through increased page load times (users reloading the page inconsistently vs. consistently to get new information) and increasing maintenance costs and overhead for the provider.
Additionally, the processing system you say already exists generally doesn’t, because websites mostly use a dynamic-loading style nowadays, not a static page (one the client doesn’t change), which is what this type of system would require.
note: edits were for phrasing, and a typo
Blame the UI frameworks like React for this. They’ve normalized a large cross-section of devs not learning anything about how a server works. They’ve essentially grown up with a calculator without ever having to learn long division.
progressive enhancement (PE) from server rendering only to a fully interactive SPA in the browser is really not trivial, both for frameworks and app devs
there are a handful of frameworks that support it fairly ergonomically now but it’s a discipline that takes time and effort
also disabling javascript is a tiny minority use case
Not all frameworks are bad
The problem is the devs/owners not understanding basic fundamentals. They could see a major financial benefit if they made the page snappy and light, but apparently no one at these companies realizes that.
> I, as a user, should be able to switch off javascript and have the site work exactly as it does with javascript turned on.
I mean… many websites rely on JavaScript, so it’s kind of obvious that they don’t work without it. If it worked without JS in the first place, the website wouldn’t need to embed any JS code.
> website wouldn’t need to embed any JS code.
other than the 20 trackers and ad scripts.
Most websites out there could work fine without JavaScript. They rely on it because they can’t be bothered to be better.
Have you ever tried building a modern page without JavaScript?
You can do a lot of things with HTML5 and CSS. It’s just very complicated and painful. It isn’t intuitive, and the behavior will vary across browsers. What could be a little JavaScript turns into a ton of write-only CSS.
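To be fair, the built-in bits are the easy part; a collapsible section, for example, needs no JS at all. The pain starts beyond what’s built in (think tabs made of checkbox/label hacks in CSS):

```html
<!-- A no-JS disclosure widget: <details> has been supported in every
     major browser for years. Content here is just illustrative. -->
<details>
  <summary>Show advanced options</summary>
  <p>Content the user can toggle without a single line of JS.</p>
</details>
```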
Yes, that’s my job.
The point isn’t to emulate the JavaScript functionality somehow. The point is to simply fetch the desired information as a new page load when necessary. The page should work in lynx.
What would they do instead?
https://developer.mozilla.org/en-US/docs/Glossary/Graceful_degradation
But honestly, all I ask is that buttons still work and forms get sent if you use a more basic browser.
It is a lot simpler to just require JavaScript. It is widely supported and is default enabled on all platforms and browsers.
Sending forms is a built-in functionality and you say it’s simpler to hack your way around it.
I mean, sure, if your framework does it this way. But it shouldn’t. Note that as a bug.
How about serving proper HTML that contains the data they want to display, instead of an empty page that tries to load the data via JavaScript?
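That is, the difference between these two (an illustrative sketch, with placeholder content):

```html
<!-- What I'm complaining about: an empty shell that stays blank
     until a script fetches and renders the data. -->
<div id="app"></div>
<script src="/bundle.js"></script>

<!-- What I'm asking for: the same data already in the HTML.
     The headlines are obviously just placeholders. -->
<ul>
  <li><a href="/articles/1">First article headline</a></li>
  <li><a href="/articles/2">Second article headline</a></li>
</ul>
```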
I miss when JS was just a silly thing you could use to add trails to the mouse cursor to impress anyone who stumbled onto your GeoCities page
They could just add a text box that says please enable JavaScript.
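Which is literally one tag (a minimal sketch):

```html
<!-- The bare-minimum courtesy: tell no-JS visitors what's going on
     instead of showing them a blank page. -->
<noscript>
  <p>This site needs JavaScript to work. Please enable it.</p>
</noscript>
```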
When I ask a server for a page, it should give me content, not a shitty script and a note that says “here, you do it.”
That isn’t how it works
You are viewing a product
Sorry about your stroke
> many websites rely on JavaScript,
which is the problem: most people don’t understand the concept of graceful degradation
> the website wouldn’t need to embed any JS code.
That’s the point.
There’s a difference between “wouldn’t work” and “wouldn’t work as nicely”. That’s what this post is about :D Most websites would still work in the same basic way without js.
OP really muddied the waters by writing:
> exactly as it does with javascript turned on
That’s obviously impossible and wouldn’t be degraded.
“exactly as it does”, as in: forms submit, logging in works, you can achieve the same thing
form validation is dogshit without js
It’s either exactly the same, or it’s gracefully degraded. You’re asking for two opposite things at once.
For what it’s worth, I support the notion that fundamental functionality should work without JavaScript, with good old form submissions.
But I also recognise that you can’t get the exact same behaviour without JavaScript-initiated background GETs and POSTs. Easy example: a scrollable map that streams in chunks as you move it.
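The usual middle ground is progressive enhancement: ship the working form, then intercept it when JS is available. A minimal sketch (the `#status` element is illustrative):

```js
// Progressive enhancement: the form works as a normal page-reload POST
// without JS; when JS is available, intercept the submit and do it in
// the background instead, updating the page in place.
const form = document.querySelector('form');
form.addEventListener('submit', async (event) => {
  event.preventDefault();
  const response = await fetch(form.action, {
    method: form.method, // fetch normalizes "post" to "POST"
    body: new FormData(form),
  });
  document.querySelector('#status').textContent =
    response.ok ? 'Saved.' : 'Something went wrong.';
});
```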
Why would someone spend tons of time on something that isn’t needed? Only a few people even know how to turn off JavaScript and chances are they will just turn it back on since nothing works.
> so it’s kind of obvious that they don’t work without it.
Uhm, the web is to share content, not to play JS. That’s what graceful degradation is for: the primary use case should still work, even if the secondary or tertiary doesn’t.
> Uhm, the web is to share content, not to play JS
The web doesn’t have a single unified purpose. Even if I hate it as a programming language, JavaScript is the basis almost all client-side browser operations build upon.
Sure, a simple website which just contains information works without it, but if you design a website where the client does anything interactive and not everything is supposed to be processed server-side, it’s not really possible, no matter whether you’re talking about a web game, something like Google Earth or an in-browser editor.
All examples of things that work worse than native software.