• A_norny_mousse@feddit.org · 11 days ago

    Yes.

    Many people won’t even know what we’re talking about; to them it’s like saying “the sheer amount of websites that are unusable without HTML”. But I use uBlock Origin in expert mode and block js by default; this allows me to click on slightly* fishy links without endangering my setup or immediately handing my data over to some 3rd party.

    So I’m happy to see news websites that do not require js at all for a legible experience, and enraged that others even hide the fucking plain text of the article behind a script. Even looking at the source code does not reveal it. And I’m not talking about paywalls.


    * real fishy links go into the Tor browser, if I really want to see what’s behind them.

  • Mwa@thelemmy.club · 11 days ago

    I just use NoScript to do this. It’s annoying to visit websites that need JavaScript, but NoScript is handy because I can enable just the JavaScript a website needs for functionality (this should also speed up load times).
    Sometimes, if I’m using a browser without extension support (like GNOME Web), I just disable JavaScript on websites or frontends that don’t need it, like Invidious (if I’m facing issues).

    • Possibly linux@lemmy.zip · 11 days ago

      There are plenty of modern frameworks, most of which are better. Even the lightweight ones need JavaScript.

    • josefo@leminal.space · 10 days ago

      It’s like JavaScript is used way beyond its reasonable use cases, and you need a thick layer of framework indirection to be able to do anything, and yet it still sucks.

  • prole@lemmy.blahaj.zone · 11 days ago

    I use uBlock medium mode, and if I can’t get a website to work without having to enable JavaScript, then I just leave the website.

    • Hellfire103@lemmy.ca (OP) · 11 days ago

      I generally do the same. In fact, on desktop, uBO is set to hard mode. Unfortunately, I do need to access these sites from time to time.

  • witty_username@feddit.nl · 11 days ago

    If I wanted to write a site with JS-equivalent functionality and UX without using JS, what would my options be?

    • TrickDacy@lemmy.world · 11 days ago

      I mean you could, ironically, build a site in Next.js. It’s counterintuitive because you are literally writing JS, but you can write it to avoid dynamic things, so it effectively becomes a static, server-rendered site that, if JS is enabled, gets things like a loading bar and quick navigation transitions for free. If JS is disabled, it functions just like a standard static site.

    • Hellfire103@lemmy.ca (OP) · 11 days ago

      HTML and CSS can do quite a lot, and you can use PHP or cgi-bin for some scripting.

      Of course, it’s not a perfect alternative. JavaScript is sometimes the only option; but a website like the one I was trying to use could easily have just been a static site.

      • Possibly linux@lemmy.zip · 11 days ago

        The problem is that HTML and CSS are extremely convoluted and unintuitive. They are the reason we don’t have more web engines.

    • dondelelcaro@lemmy.world · 11 days ago

      htmx or equivalent technologies. The idea is to render as much as possible server side, and then use JS for the things that can’t be rendered there or require interactivity. And at the very least, serve the JS from your server, don’t leak requests to random CDNs.
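
      As a rough sketch of that approach (the endpoint and element names here are hypothetical), htmx lets the server return plain HTML fragments while a single small script, served from your own host, handles the swapping:

```html
<!-- htmx loaded once, from our own server rather than a CDN -->
<script src="/js/htmx.min.js"></script>

<div id="comments">
  <!-- server-rendered comments go here -->
</div>

<!-- hx-get fetches an HTML fragment from the server;
     hx-target/hx-swap say where and how to insert it -->
<button hx-get="/comments?page=2"
        hx-target="#comments"
        hx-swap="beforeend">
  Load more
</button>
```

      The server just renders the next page of comments as HTML; there is no JSON API or client-side templating involved.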

      • XM34@feddit.org · 11 days ago

        Htmx requires JS. At that point you already failed in the eyes of the purists. And CDNs exist for a reason. You can’t expect a website to guarantee perfect uptime and response times without the use of CDNs. And don’t get me started on how expensive it would be to host a globally requested website without a CDN. That’s a surefire way to get a million dollar bill from amazon!

  • Victor@lemmy.world · 11 days ago

    People in this thread who aren’t web devs: “web devs are just lazy”

    Web devs: Alright buddy boy, you try making a web site these days with the required complexity with only HTML and CSS. 😆 All you’d get is static content and maybe some forms. Any kind of interactivity goes out the door.

    Non web devs: “nah bruh this site is considered broken for the mere fact that it uses JavaScript at all”

        • _stranger_@lemmy.world · 10 days ago

          I unironically use Lynx on my home lab when I’m SSH’d in, since it’s headless. Sometimes at work I miss the simplicity. I used to use Pine for Gmail as well. 😁

    • puppinstuff@lemmy.ca · 10 days ago

      I can do it, but it’s hard convincing clients to double their budget for customers with accessibility needs they’re not equipped to support in other channels.

      That being said, my personal sites and projects all do it. And I’m thankful for accessibility laws where I’m from that make it mandatory for companies over a certain size to include accessibility supports that need to work when JS is disabled.

      • Victor@lemmy.world · 10 days ago

        What country or area would that be?

        And what do you mean by “do it”? What is it exactly that you do or make without JavaScript?

        • puppinstuff@lemmy.ca · 10 days ago

          Some provinces in Canada have rules that businesses’ websites must meet or exceed the WCAG 2.0 accessibility guidelines once they exceed a certain employee headcount. That includes screen reader support, which ensures all content must be available to a browser that doesn’t have JavaScript enabled.

          • Victor@lemmy.world · 10 days ago

            That’s excellent.

            And what do you make that doesn’t include JavaScript? Like what kind of software/website/content? If you don’t mind sharing, of course.

            • neclimdul@lemmy.world · 10 days ago

              It doesn’t have to exclude JavaScript; that would be quite difficult and unreasonable. Accessible sites are not about limiting functionality but about providing the same functionality.

              I haven’t gone fully down the rabbit hole on this, but my understanding is that even something like Nuxt, if you follow best practices, will deliver HTML that can be interacted with and will serve individual pages.

              That said, screen readers and other assistive tools shouldn’t require running without any JavaScript. Having used them to test sites, that might be the cautious approach, but they actually have a lot of tools for announcing dynamic page changes, built into ARIA properties at the HTML level, so they’re very flexible. There are of course also JavaScript APIs for announcing changes.

              They just require additional effort and forethought to implement and can be buggy if you do really weird things.
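
              For instance, a minimal sketch of an ARIA live region (element names are made up): assistive tech announces text inserted into it, without any special no-JS build of the site.

```html
<!-- A polite live region: screen readers announce new text inserted
     here at the next natural pause, without stealing focus. -->
<div id="status" aria-live="polite"></div>

<script>
  // After some dynamic update, describe it in the live region.
  document.getElementById('status').textContent = 'Search results updated';
</script>
```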

              • Victor@lemmy.world · 9 days ago

                I think we’re on the same page here. Your reply seems to me to argue against the people who are completely against JavaScript and who treat its very presence like a complete site-breaking bug. I am not of their opinion either. But I do sympathize with the sentiment that it is being used for evil.

                • neclimdul@lemmy.world · 9 days ago

                  Yeah, I don’t think that’s what the screenshot shows though since there’s no content at all 😅

            • puppinstuff@lemmy.ca · 9 days ago

              Mostly marketing and informational websites for the public. Businesses, tourism spots, local charities and nonprofits, etc. Nothing that’s going to change the world but hopefully makes somebody’s day a little easier when they need to look something up.

          • neclimdul@lemmy.world · 10 days ago

            Also the EU and technically a lot of US sites that provide services to or for the government have similar requirements. The latter is largely unenforced though unless you’re interacting with states that also have accessibility laws.

            And honestly a ton of sites that should be covered by these requirements just don’t care or get rubber stamped as compliant. Because unless someone actually complains they don’t have a reason to care.

            I kind of thought the EU requirements, which have some actual penalties, would change this indifference, but other than some busy accessibility groups helping people who already care, I haven’t heard much about enforcement that would suggest it’s actually changed.

      • Victor@lemmy.world · 11 days ago

        If you want to zoom into a graph plot, you want each wheel scroll tick to be sent to the server to generate a new image and a full page reload?

        How would you even detect the mouse wheel scroll?

        All interactivity goes out the door.
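
        A sketch of why this particular interaction needs client-side code: the wheel event only exists in the browser, so without JS there is nothing to even observe (redrawPlot is a hypothetical renderer).

```html
<canvas id="plot" width="400" height="300"></canvas>

<script>
  let zoom = 1;
  const plot = document.getElementById('plot');
  // passive: false is required so preventDefault() can stop page scrolling
  plot.addEventListener('wheel', (event) => {
    event.preventDefault();
    zoom *= event.deltaY < 0 ? 1.1 : 0.9; // one tick in or out
    // redrawPlot(plot, zoom); // re-render client-side, no page reload
  }, { passive: false });
</script>
```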

      • cerothem@lemmy.ca · 11 days ago

        That would make the website feel ultra slow, since a full page load would be needed every time. Something as simple as a slide-out menu needs JavaScript and couldn’t really be done server side.

        And if you said to just send the parts of the page that changed, that dynamic content loading would still be JavaScript. Maybe an iframe could get you somewhere, but that’s a hacky workaround, and you couldn’t interact between different frames.

        • Limonene@lemmy.world · 10 days ago

          a slide out menu needs JavaScript

          A slide-out menu can be done in pure CSS and HTML. Imho, it would look bad regardless.
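
          For what it’s worth, a minimal sketch of such a menu using the classic checkbox hack (class names are made up): the hidden checkbox’s :checked state drives the slide, no JS involved.

```html
<style>
  #menu-toggle { display: none; }
  .menu {
    position: fixed; top: 0; left: -220px;
    width: 220px; height: 100%;
    transition: left 0.3s ease;
  }
  /* When the checkbox is checked, slide the sibling menu in */
  #menu-toggle:checked ~ .menu { left: 0; }
</style>

<input type="checkbox" id="menu-toggle">
<label for="menu-toggle">☰ Menu</label>
<nav class="menu">
  <a href="/">Home</a>
  <a href="/about">About</a>
</nav>
```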

          When if you said just send the parts of the page that changed, that dynamic content loading would still be JavaScript

          OP is trying to access a restaurant website that has no interactivity. It has a bunch of static information, a few download links for menu PDFs, a link to a different domain to place an order online, and an iframe (to a different domain) for making a table reservation.

          The web dev using javascript on that page is lazy, yet also creating way more work for themself.

        • expr@programming.dev · 11 days ago

          https://htmx.org/ solves the problem of full page loads. Yes, it’s a JavaScript library, but it’s a tiny JS library (14k over the wire) that is easily cached. And in most cases, it’s the only JavaScript you need. The vast majority of content can be rendered server side.

          • cerothem@lemmy.ca · 11 days ago

            While fair, now you have to have JavaScript enabled on the page, which I think was the point. It was never about having only a little bit; it was that you had to have it enabled at all.

          • XM34@feddit.org · 11 days ago

            So, your site still doesn’t work without JS, but you get to forgo all the convenience React brings to the table? Boy, what a deal! Maybe you should go talk to Trump about those tariffs. You seem to be at least as capable as Flintenuschi!

        • Sir_Kevin@lemmy.dbzer0.com · 11 days ago

          Something as simple as a slide out menu needs JavaScript and couldn’t really be done server side.

          I’m not trying to tell anyone how to design their webpages. I’m also a bit old fashioned. But I stopped making animated gimmicks many years ago. When someone is viewing such things on a small screen, in landscape mode, it’s going to be a shit user experience at best. That’s just my 2 cents from personal experience.

          I’m sure there are examples of where JS is necessary. It certainly has its place. I just feel like it’s overused. Now if you’re at the mercy of someone else who demands x, y, and z, then I guess you gotta do what you gotta do.

    • Frostbeard@lemmy.world · 11 days ago

      Stop, I can only get so erect. Give me that, please, instead of the bullshit I have to wade through today to find information. When is the store open? Email address/phone? Like fuck if I want to “engage”.

      • Victor@lemmy.world · 11 days ago

        😆 F—ck, I hear you loud and clear on that one. But that’s a different problem altogether, organizing information.

        People suck at that. I don’t think they ever even use their own site or have it tested on anyone before shipping. Sometimes it’s absolutely impossible to find information about something, like even what a product even is or does. So stupid.

    • corsicanguppy@lemmy.ca · 11 days ago

      “nah bruh this site is considered broken for the mere fact that it uses JavaScript at all”

      A little paraphrased, but that’s the gist.

      Isn’t there an article just today that talks about CSS doing most of the heavy-lifting java is usually crutched to do?

      I did webdev before the framework blight. It was manual php, it was ASP, it was soul-crushing. That’s the basis for my claim that javascript lamers are just lazy, and supply-chain splots waiting to manifest.

      • Victor@lemmy.world · 11 days ago

        CSS doing most of the heavy-lifting java is usually crutched to do

        JavaScript you mean? Some small subset of things that JavaScript was forced to handle before can be done in CSS, yes, but that only goes for styling and layout, not interactivity, obviously.

        I did webdev before the framework blight. That’s the basis for my claim that javascript lamers are just lazy

        There is some extremely heavy prejudice and unnecessary hate going on here, which is woefully misdirected. We’ll get to that. But the amount of time that has passed since you did web dev might put you at a disadvantage when making claims about web development these days. 👍

        Anyway. We JavaScript/TypeScript “lamers” are doing the best with what we’ve got. The web platform is very broken and fragmented because of its history, and that’s not something regular web devs can do much about. We use the framework or library that best suits the task at hand and the resources we are given (time, basically). It’s not like every project is your dream unicorn project where you get to decide the infrastructure from the start, or get to invent a new library or a new browser to target that does things differently and doesn’t have to be backwards compatible with the web at large. Things don’t work that way.

        Don’t you think we sigh all day because we have to monkey patch the web to make our sites behave in the way the acceptance criteria demand? You call that lazy, but we are working our knuckles to the bone to make things work reasonably well for as many people as we can, including accessibility for those with reduced function. It’s not an easy task.

        … “Lazy.” I scoffed in offense, to be honest with you.

        It’s like telling someone who made bread from scratch they’re lazy for not growing their own wheat, ffs.

        Let’s see you do better. 👍👍👍👍👍👍

    • A_norny_mousse@feddit.org · 10 days ago

      It’s not about using js or not, it’s about failing gracefully. An empty page instead of a simple written article is not acceptable.

    • MotoAsh@lemmy.world · 11 days ago

      Ehhhhh it kinda’ depends. Most things that are merely changing how something already present on the page is displayed? Probably don’t need JS. Doing something cool based on the submit or response of a form? Probably don’t need JS. Changing something dynamically based off of what the user is doing? Might not need JS!

      Need to do some computation off of the response of said form and change a bunch of the page? You probably need JS. Need to support older browsers simply doing all of the previously described things? Probably need JS.

      It really, really depends on what needs to happen and why. Most websites are still in the legacy support realm, at least conceptually, so JS sadly is required for many, many websites. Not that they use it in the most ideal way, but few situations are ideal in the first place.

      A lot of this is just non-tech savvy people failing to understand the limitations and history of the internet.

      (this isn’t to defend the BS modern corporations pull, but just to explain the “how” of the often times shitty requirements the web devs are dealing with)

      • Victor@lemmy.world · 10 days ago

        Of course it depends, like all things. But in my mind, there’s a few select, very specific types of pages that wouldn’t require at least a bit of JavaScript these days. Very static, non-changing, non-interactive. Even email could work/has worked with HTML only. But the experience is severely limited and reduced, of course.

      • Victor@lemmy.world · 9 days ago

        Not sure that was the issue. I mean more that if you use only HTML and CSS all you’ll be able to create would be static sites that only change the contents of the page by full reloads. 🙂

        • NigelFrobisher@aussie.zone · 9 days ago

          There’s this ancient thing called the LAMP stack. Most of the web runs it, and what it does will blow your mind.

    • owsei@programming.dev · 10 days ago

      That site is literally just static content. Yes, JS is needed for interactivity, but there’s none here.

      • Victor@lemmy.world · 10 days ago

        If you have static content, then sure, serve up some SSR HTML. But pages with even static content usually have some form of interactivity, like searching (suggestions/auto-complete), etc. 🤷‍♂️

        • Limonene@lemmy.world · 10 days ago

          Search is easier to implement without JavaScript than with.

          <form method="GET" action="/search">
            <input name="q">
            <input type="submit">
          </form>
          
          • Victor@lemmy.world · 10 days ago

            Does that little snippet include suggestions, like I mentioned? Of course it’s easier with less functionality.

            • humorlessrepost@lemmy.world · 10 days ago

              Back in my day, we’d take that fully-functional form and do progressive enhancement to add that functionality on top with js. You know, back when we (or the people paying us) gave a fuck.
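
              A sketch of that progressive-enhancement pattern, building on the plain GET form from the earlier comment (the /suggest endpoint is hypothetical): without JS the form still submits normally; with JS it gains live suggestions.

```html
<form method="GET" action="/search">
  <input name="q" list="suggestions" autocomplete="off">
  <datalist id="suggestions"></datalist>
  <input type="submit" value="Search">
</form>

<script>
  const input = document.querySelector('input[name=q]');
  const list = document.getElementById('suggestions');
  input.addEventListener('input', async () => {
    // Hypothetical endpoint returning a JSON array of suggestion strings
    const res = await fetch('/suggest?q=' + encodeURIComponent(input.value));
    const words = await res.json();
    list.replaceChildren(...words.map((w) => {
      const opt = document.createElement('option');
      opt.value = w;
      return opt;
    }));
  });
</script>
```

              If the fetch never runs, nothing breaks: the browser just submits the form to /search as before.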

    • mrgoosmoos@lemmy.ca · 10 days ago

      it sounds like you’re saying there’s an easy solution to get websites that don’t have shit moving on you nonstop with graphics and non-content frames taking up 60% of the available screen

      it’s crazy that on a 1440p monitor, I still can’t just see all the content I want on one screen. nope, gotta show like 20% of it and scroll for the rest. and even if you zoom out, it will automatically resize to keep proportion, it won’t show any of the other 80%

      I’m not a web dev. but I am a user, and I know the experience sucks.

      if I’m looking at the results of a product search and I see five results at a time because of shitty layout, I just don’t buy from that company

      • Victor@lemmy.world · 10 days ago

        I had a bit of trouble following that first paragraph. I don’t understand what it is that you say it sounds like I’m saying.

        Either way, I don’t disagree with anything you wrote. I feel the same. Bad design does not elicit trust.

    • BackgrndNoize@lemmy.world · 10 days ago

      A lot of this interactivity is complete bullshit, especially on sites that are mostly just static data, like news articles; the JS is there for advertising, analytics, social media, and other bullshit.

      • humorlessrepost@lemmy.world · 10 days ago

        News site dev here. I’ll never build a site for this company that relies on JS for anything other than video playback (yay HLS patents; and they won’t let me offer MP4 as an alternative because preroll pays our bills, despite everyone feeling entitled to free news with no ads).

    • Hellfire103@lemmy.ca (OP) · 10 days ago

      I flip back and forth between Brave and Tor Browser, depending on which one appears less fingerprintable; and I’ve disabled all of the analytics.

      • moseschrute@lemmy.world · 10 days ago

        The more things you block, the more unique and fingerprintable you become. Blocking JavaScript altogether may mitigate some of that, but you can be fingerprinted even without JS.

        Tor is a little better because they make your browser blend pretty well with other Tor browsers, so instead of being unique 1 of 1 you’re more like 1 out of all Tor users.

        I haven’t looked into this in a couple years, but that is my takeaway last time I went down the privacy/fingerprint rabbit hole.

        • Hellfire103@lemmy.ca (OP) · 9 days ago

          I know, and I’m still researching the best way to mitigate this. So far, I’ve come away with the impression that Tor Browser and Brave do the best jobs of minimising fingerprinting, otherwise I would have just disabled JS in Vanadium and called it a day.

        • LifeInMultipleChoice@lemmy.world · 10 days ago

          (Not talking about a specific browser, just in general.) Maybe I’m misunderstanding, but when the VPN makes a request for the page, the request isn’t forwarding the browser information, is it? So wouldn’t most of that be mitigated there?

          As in, the VPN’s server making the request should show up when they scrape that information, not the end user. Maybe I’m not understanding that though.

          • moseschrute@lemmy.world · 10 days ago

            A VPN doesn’t alter the requests your browser is making. It just masks your IP address. So any information about your browser is still sent. The exception would be if your VPN provides some sort of tracker/ad blocking feature where it can block certain requests. But it’s not really a magic switch that prevents websites from tracking you.

      • _stranger_@lemmy.world · 10 days ago

        it’s still owned by a homophobe that loves crypto, and is likely an antivaxxer.

        He was run out of Mozilla after only eleven days as CEO, and he helped found it!

        the guy is an asshole, and he’s very likely using brave money for evil shit.

  • MonkderVierte@lemmy.zip · 11 days ago

    Skill issue - on the devs’ side.

    A lot of pages even fail if you only disable 3rd-party scripts.

    I consider them broken: the platform’s job is to render a Document Object Model; scripting is secondary functionality, and no fallbacks is bad practice. Imagine if a PDF/EPUB behaved that way.

    • Spice Hoarder@lemmy.zip · 11 days ago

      Personally, I love server-side rendering; I think it’s the best way to ensure your content works the way YOU built it. However, offloading the processing to the client saves money, and makes sense if you’re also planning on turning it into an Electron app.

      I feel it’s better practice to use a DNS resolver that blocks traffic to known telemetry and malware domains.

      Personally, I used to blacklist all scripts and turn them on one at a time until I had the functionality I needed.

    • Possibly linux@lemmy.zip · 11 days ago

      All modern browsers have Javascript enabled by default. A good dev targets and tests for mainstream systems.

    • katy ✨@piefed.blahaj.zone · 11 days ago

      wild thing is that with modern css and local fonts (nerdfonts, etc), you can make a simple page with a modern grid and nested css without requiring a single third party library or js.

      devs are just lazy.
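
      A sketch of what that can look like with modern CSS only (no libraries, no JS): a responsive grid plus native CSS nesting.

```html
<style>
  .cards {
    display: grid;
    /* As many 20rem columns as fit, collapsing to one on narrow screens */
    grid-template-columns: repeat(auto-fit, minmax(20rem, 1fr));
    gap: 1rem;

    /* Native CSS nesting, supported in current browsers */
    & article {
      padding: 1rem;
      border: 1px solid #ccc;
    }
  }
</style>

<div class="cards">
  <article>First card</article>
  <article>Second card</article>
</div>
```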

    • corsicanguppy@lemmy.ca · 11 days ago

      no fallbacks is bad practice.

      This is how you know they’re extra lazy – no “please enable javascript because we suck and have no noscript version”.

      • oddspinnaker9295@lemmy.world · 11 days ago

        It reminds me of flash when it first gained popularity.

        “Please enable flash so you can see our unnecessary intro animation and flash-based interface” at, like, half of local restaurant websites

    • JustARaccoon@lemmy.world · 11 days ago

      But they’re not PDFs/EPUBs; they’re live pages that support changing things in the DOM dynamically. I’m sorry, I’m not trying to be mean, but people not wanting scripting on their sites are a niche within a niche. In terms of prioritising fixes, that’s a very small audience with a very small ROI, and serving them might require a huge rewrite. It’s just not financially feasible, for not much of a reason other than puritan ones.

      • MonkderVierte@lemmy.zip · 10 days ago

        Simpler websites have some advantages: less work to maintain, plus responsiveness and accessibility by default.

        Sure, that applies to what already exists. For new projects, it starts with choosing the frameworks.