- cross-posted to:
- html@programming.dev
- web_design@lemmy.ml
Got to disagree there. Websites should work without JS. Sure, it shouldn’t have fancy animations or whatever, but I should be able to read it.
E S O T E R I C
I remember when webservers served content, and didn’t just pass me megabytes of bloated spaghetti and say “here, YOU run this.”
Static pages are fine if you don’t want to interact with them. Books have been around since the 1400s.
But they won’t let you search a whole book for a particular name, place, or term. Or take your input and calculate answers for you. Or let you create music or art. Etc., etc.
You don’t need client-side JS for that kind of search. You just send the search query to the server and get the response back.
Yes, they let you search the term, it’s called asking the librarian to tell you which page.
A form sends a POST request to the server, and the server serves you the page with the answer; that’s how it works. Ajax is cool, sure, but don’t tell us lies, and don’t talk with confidence without knowing.
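The form-then-full-page-response flow described above can be sketched without any client-side code at all; all the work happens server-side. This is a minimal illustration, with hypothetical names (`renderSearchPage`, the page list) standing in for a real server’s routing and data:

```javascript
// The browser submits a plain <form method="get" action="/search">;
// the server runs the match and renders a complete results page.
// No client-side script is involved at any point.
function renderSearchPage(query, pages) {
  // Case-insensitive title match, done entirely on the server.
  const hits = pages.filter((p) =>
    p.title.toLowerCase().includes(query.toLowerCase())
  );
  const items = hits.map((p) => `<li><a href="${p.url}">${p.title}</a></li>`);
  return `<h1>Results for "${query}"</h1><ul>${items.join("")}</ul>`;
}
```

Any server framework (or plain CGI, as in the pre-Ajax days) can wire a query-string parameter into a function like this and send the string back as the response body.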
You ask the librarian how often the word “arrow” is in Lord of the Rings, and they have to tell you?
Not sure what your point is, but that functionality could be built into a website without running any code client-side.
My whole comment was that I’m not sure what the point is, because that’s not how it works in reality. So if you actually want to know, you’ll have to ask the author of the comment above me.
I was trying to make the best of what I could with the bad example they provided…
But as they already responded to you: before Ajax was a thing, term searches were done via forms. I still maintain that it has its uses, but let’s not pretend the universe was born with JavaScript.
But if the best of what you can do with it doesn’t make sense, then why do it? It’s not like it’s helping, just distracting from the parts that DO work.
And kind of the same thing again: The universe wasn’t born with forms, either.
When we did a project to redesign our web app at work and I showed stuff to the UX designer, he said: “I see this was designed by a programmer. Because a programmer says: But it works.”
And indeed, that is how all of the comments I read here feel: that things like a debounced search are just irrelevant toys, instead of a solid part of the toolkit of a professional modern developer.
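For context, a debounced search fires the query only after the user pauses typing, instead of on every keystroke. A minimal sketch in plain JS, no framework assumed:

```javascript
// Returns a wrapped version of `fn` that only runs after `delay` ms of
// silence: each new call cancels the previously scheduled one.
function debounce(fn, delay) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);                      // cancel the pending call, if any
    timer = setTimeout(() => fn(...args), delay);
  };
}

// Hypothetical usage in a page:
// input.addEventListener("input", debounce(runSearch, 300));
```

The point is that a few lines like this, not a framework, is what “debounced search” actually costs.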
Y’all forget about forms? And, uh, programs?
I remember doing indexes in html with hyperlinks
I hate this cursed timeline where even finding out the opening hours of a restaurant needs to load half a dozen frameworks.
I hate the padded aesthetic of everything. Like I use an old.reddit clone on lemmy cuz I just wanna access the content ffs.
Good developers can write websites that have non javascript fallbacks.
For example, a form to save settings with a save button, but when javascript loads it hides the save button and makes it automatically save when you toggle options.
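The auto-save enhancement described above can be sketched as a small function. This is a hypothetical shape, with plain objects standing in for real DOM nodes (in a page you would pass the actual button and toggle elements):

```javascript
// Progressive enhancement: without JS the form submits normally via the
// Save button; once this runs, the button is hidden and every toggle
// change saves in the background instead.
function enhanceSettingsForm(form) {
  form.saveButton.hidden = true;                        // JS loaded: no manual save needed
  for (const toggle of form.toggles) {
    toggle.onchange = () => form.submitInBackground();  // auto-save on change
  }
}
```

If the script never loads, nothing here runs, and the user still has a working Save button; that is the whole trick.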
So if for any reason you can’t use JS (outdated browser, outdated system, text-based browser, JS disabled by an admin, JS won’t load, assistive technology) then… it’s your fault?
- outdated browser - your fault, and a foolish thing to use considering the modern, online threat landscape
- outdated system - your fault, and like above you probably shouldn’t connect to the internet with that thing
- text-based browser - your fault, and it shouldn’t surprise anyone that websites break when you use a browser that lacks major functionality
- JS disabled by an admin - your admin’s fault. Go complain to them, not to anyone else
- JS won’t load - depends on reason for why JS fails to load
- assistive technology - depends on the reason; your assistive software may be broken or misconfigured, or the website fails to follow best practice
Braille interpreters (think a row of nubs that raises the relevant letters as it “reads” the page) used to have issues with some webpages unless you accessed them via text-based browsers. No idea if they still struggle as much, but text-based browsing will always have a function and a place.
So, if a webpage fails to load on Firefox, it’s our fault for not using Chrome? Following your logic.
Well, if we’re following my logic, like you claim, then it depends on why the page is failing to load in Firefox: Are you using an outdated version of Firefox, or an outdated system? Is Firefox missing major functionality? Has your admin disabled major functionality in Firefox? Won’t some part of the website load in Firefox, and if so, why? Are you using assistive technology in Firefox, and if so, is it broken, misconfigured, or does the website not follow best practice?
If it is for another reason, then it obviously depends on that reason
Nono, you expect people to use the most used versions of the tools. Firefox has such a low usage that using the “Firefox version” of the “browser” tool can be interpreted as using an “outdated tool”. You clearly don’t, and neither do I, but some people put the line in a different place than you do and I don’t think it’s fair to say it’s their fault for it.
Sure, for webpages whose objective is advanced functionality I do get it, but news/blog posts, documentation, and government pages should be as robust as possible… There are places where accessibility from “outdated” tools must be considered.
According to your logic, I can’t blame you for believing that “news/blog posts, documentation, government pages […] should be as robust as possible”, but you also can’t blame anyone else for interpreting objectives and functionality differently and drawing the line elsewhere. Your post is rhetorically self-defeating, and there is no point continuing this line of argument.
You missed the point completely, though - you should not expect every user to have a shiny updated browser on a shiny new machine when using your website, and blame them if they don’t.
If you want your website to be usable by most users, you’d better cover all the use cases. That includes providing non-JS fallbacks. Not doing that and blaming the users instead is just ridiculous.
Or what if you don’t have internet? Is it your fault you can’t access the website?
Also, if your browser does not support latest TLS and does not have latest root certificates, it’s your fault. /s
Let me load HTTP without the S if I want to.
Let me load HTTP without the S if I want to.
No, that lets companies man-in-the-middle you.
ISPs literally couldn’t help themselves injecting ads and other scripts that lagged and broke everything on every website, all to chase a few bucks.
HTTPS prevents them from doing that.
This woman is part of the problem with the current internet. There are only a few sites where it makes sense to work only with JS enabled, and federated social media is NOT one of those. Wanna know why? Because all the JS bullshit is just to make shit “pretty”. The data doesn’t - rather, should NOT - reside entirely in the JS.
EDIT - To make matters worse, the site in question is this - https://bestestmotherfucking.website/ ; which is “inspired” by Motherfucking Website and Better Fucking Website. I’m thinking this is just trolling and we fell for it
What do you have to check to say that federated social media doesn’t need JS?
Do you need javascript to fetch content? No.
Do you need javascript to send content (comments, posts, reactions)? No.
Does federated social media require immediate page updates without refreshing the entire page? No. old.feddit.org works without JavaScript, and the user experience isn’t any worse than on the main frontend.
piefed is mostly prerendered.
It still uses JavaScript for votes and stuff. It’s possible to use forms and an endpoint that returns 204 No Content, but there’s not much feedback there.
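The no-JS voting idea mentioned above works because browsers stay on the current page when a form submission gets a 204 No Content response. A rough sketch of what such a server handler could look like; the request shape and names here are hypothetical, not piefed’s actual API:

```javascript
// The vote button is a plain <form method="post" action="/vote">.
// Replying 204 means "done, don't navigate anywhere" - which is exactly
// why there's so little feedback for the user.
function handleVote(request, tallies) {
  const { postId, direction } = request.body;
  tallies[postId] = (tallies[postId] || 0) + (direction === "up" ? 1 : -1);
  return { status: 204, body: "" };
}
```

The trade-off is visible in the sketch: the vote is recorded, but nothing on the page changes until the next full reload.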
To make it “pretty”? That’s more CSS, I would say; JS is more about UX.
Which I haven’t seen anybody here mention. Which is kinda giving antisocial nerds with hobby projects? I mean, in professional development you learn very quickly how UX correlates with helpdesk tickets.
Like, we can talk about technical-purity exercises all day, but code doesn’t exist for its own sake.
Nope. Fuck your site.
So… there’s a practical difference between rendering markup, which is handled by the browser engine and generally benign, and running executable script, which is frequently malicious.
Allowing your website to load JavaScript means that I’m allowing you to execute arbitrary code on my hardware. Hopefully the potential blast radius of any malicious code is limited by safety precautions in my web browser, but a web browser is not a security barrier and should not be relied on to protect the local system from malicious code downloaded from the Internet. The most pernicious and seemingly unavoidable behavior of JavaScript on most websites is device fingerprinting, and to get a better understanding of how much of a problem that is check out https://coveryourtracks.eff.org/
The simplest step to prevent a lot of this malicious behavior is to block executable script. This is not really a new thing on the Internet, as extensions like NoScript have been around for 2 decades and have millions of users. This should be anticipated by the web developer as a completely normal use case.
Competent web developers understand that they have privacy-conscious users who block external executable script as a matter of course. Your website(s) should be designed to account for this, and should at least render and display information in a readable way without needing to execute your un-vetted code on the user’s system. Maybe some dynamic functions of the website don’t work, but that’s OK as long as the majority of the site is accessible. A JavaScript-dependent website is no better than a Flash-dependent website, in terms of security, privacy, and professionalism.
NoScript frames this as a consent issue, and that’s probably valid:
NoScript enables consensual browsing: your browser, your choice!
Counterpoint: If I host a website on my server, I can do whatever the fuck I want (within legal limits). Unless NoScript users are a sizable fraction of my userbase or target market, it makes no financial sense to spend time or resources on developing a fall-back without Javascript.
Isn’t JavaScript a major source of vulnerabilities in software? Enabling it is a security threat. This is victim blaming by whoever this woman is.
JavaScript has been my favorite language for a decade. Still, I try to make websites server-rendered so that they can be read if my code fails to load or execute. For example, there are power outages in Ukrainian cities for most of the day because of the war. When there’s no power, there’s still 4G for a while, but it switches to economy mode and slows to a crawl. The websites of the monopolist energy company require a lot of JavaScript, and it often fails to load for me during an outage. It’s also not keyboard-accessible because of how its JS is implemented (I don’t imagine I’d do better; they have a team while I’m a solo programmer, but I try and they don’t). To see when and where there will be electricity, and plan where to go to study and work, I have to rent a VPS, scrape their website, and show myself a static table that doesn’t require JS to load. Some code to see what I mean: https://codeberg.org/nykula/powerup
JavaScript has been my favorite language for a decade. Still, I try to make websites server-rendered so that they can be read if my code fails to load or execute
Have you tried Astro? It’s good for exactly this. You write Astro components that look a little bit like React components, but they’re all rendered either during the build (when using static site generation) or server-side.
You get the developer experience of a modern JS framework, with the output of a static site with minimal JS.
Yes, I tried multiple popular SSR frameworks and use one at work. As a hobby, I’ve been making my own SSR framework that is much more minimal, based on Preact, Valibot, Vite, node:sqlite, URLPattern, gettext.js and a few companion libraries. (But components look more like old-school Mithril than React because no JSX extension, just standard JS.) I want its node_modules to stay below 200 MB and to pick such dependencies that the apps built with it can be included in Debian repositories and potentially FreedomBox. Hopefully I’ll be ready to make a fedi post about it next month.
JavaScript was my first language because my initials are JS. After spending some time on programming.dev and seeing how many people bitch about JavaScript, I wrote a Python templating engine to convert Markdown into static HTML with CSS. I have like 10 lines of JavaScript that pre-populate a selector based on the URI’s query string, but that’s it. I got a perfect score on my Lighthouse report (and learned it gives you confetti when you do).
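The “10 lines of JavaScript” described above can be sketched with the standard `URLSearchParams` API, which exists in both browsers and Node. The parameter name `theme` here is a hypothetical example, not the commenter’s actual code:

```javascript
// Read a value from the query string (e.g. "?theme=dark") and fall back
// to a default when the parameter is absent.
function selectedFromQuery(search, fallback) {
  const params = new URLSearchParams(search);
  return params.get("theme") || fallback;
}

// Hypothetical usage in the page:
// document.querySelector("select").value =
//   selectedFromQuery(location.search, "light");
```

Everything else on such a site can stay static HTML; this is the only part that genuinely needs to run client-side, since the query string isn’t known at build time.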
It took some creative problem solving, but I discovered that I didn’t need like 99% of the JavaScript or PHP that I was using. What I needed was mostly to get good at CSS.
There’s a difference between a website and a web app. Websites indeed should not require JavaScript to function. Web apps are a different beast where, yes, disabling JavaScript means you are opting yourself out of being able to use the app.
Yes yes a thousand times yes! The web exists to enable interaction more than cracking open a book or magazine. If all you’re after is blogs, sure, turn it off.
What is the site?
For like, a blog, I think that’s an OK complaint to have.
I suspect it’s this - https://bestestmotherfucking.website/
I also suspect both the site and that message are trolling
Without JS, most webpages couldn’t do 1/10th of what they do. There’d just be text and pictures. OK for fairytale books.
Most websites are just text and pictures…
How wrong you are. Most webpages don’t have to do client-side computation. It’s all fancy Ajax anyway.
Small world-view moment