hckrnws
JavaScript-heavy approaches are not compatible with long-term performance goals
by luu
This title is very misleading; it should be "Why React is not compatible with long-term performance goals"
And I do agree generally. React uses an outdated rendering method that has now been surpassed by many better frameworks; Svelte/SvelteKit, Vue, and Qwik are the best examples.
People relying on bloated React packages is obviously not great, but that has nothing to do with JavaScript itself.
The JS engines are all relatively fast now. And the document model of the current web provides major accessibility to both humans and indexing tools (SEO, and now GEO). JS is not my favorite language; I would rather the web were based on a statically typed language with better error-handling practices, like Go. But this will most likely not happen any time soon, as it would require every level of the ecosystem to adapt: browsers, frameworks, developers, etc.
Google would have to take the lead and implement this in Chrome, then enough developers would have to build sites using it and force Safari and Firefox to comply. It just isn't feasible.
If you want faster webapps, just switch to SvelteKit, Vue, or Qwik. But often the people choosing the framework for a project haven't written much code in years; they know React is a safe option used by everyone else, so they follow along. If it gets slow, it's a "bug" causing it, since the apps they built before using it were "good enough".
> Google would have to take the lead and implement this in Chrome, then enough developers would have to build sites using it and force Safari and Firefox to comply. It just isn't feasible.
They already tried. It was called Dart and for a while there was an experimental flag to enable it directly in Chrome. It was cancelled and Dart was relegated to transpiling to JS/WASM.
There was a complete fork to support it, Dartium.
https://chromium.googlesource.com/dart/dartium/src/
Dart is only around because it was rescued by the AdWords team, which had just migrated from GWT to AngularDart and wasn't happy to do yet another transition.
Most of the key names with a Smalltalk/Self background who were part of Dart 1.0 left the team at that point.
It managed to stay around long enough to get the attention of the Flutter team; originally their prototype was done in JavaScript. That is also the reason why Dart 2.0 was a complete reboot of its type system, from dynamically typed with optional typing to a strongly typed language with type inference.
And it isn't as if Flutter is a great success, Fuchsia never made it big, the Android team counter attacked with Kotlin's adoption, JetPack Compose, and Kotlin Multiplatform.
> JetPack Compose, and Kotlin Multiplatform
More like JetBrains dragged Google kicking and screaming into multiplatform, given what a piece of shit the setup is.
Google is the only OS vendor that outsources all their development setup, I am surprised that by now they haven't yet acquired JetBrains.
> And it isn't as if Flutter is a great success, Fuchsia never made it big, the Android team counter attacked with Kotlin's adoption, JetPack Compose, and Kotlin Multiplatform.
Sorry but I can't find my way through the double negatives in this (does "isn't as if" apply just to Flutter or the whole sentence?), and I'm not so familiar with that ecosystem that I know myself. Are you saying that Flutter isn't a great success and has instead been superseded by Kotlin? Or that it was a success and that's why Android team counter attacked?
I think I figured out that the parent meant:
> Flutter is not a great success; Fuchsia never made it big, and the Android team counter attacked with Kotlin's adoption, JetPack Compose, and Kotlin Multiplatform.
(Despite being much derided when output by LLMs, a semicolon would've really helped me here!)
Partly my confusion came from having heard of both Flutter and Fuchsia but having no idea they were connected. In fact I only knew of Flutter working on Android (and other non-Google OSes).
Anyway, it's true that Flutter is not as widely used as an Android replacement would be (if Fuchsia had turned out to be that) but for a UI framework it's reasonably well used and pretty great at cross mobile/desktop development.
Agreed that Dart was an effort to do exactly this.
To be a bit pedantic... "relegated to transpiling to JS" makes it sound like that was something they had to do later, but in fact that was an up-front part of the transition plan. The idea was: write your code in Dart and it would work natively in browsers that (eventually) supported it, but would still work in everything else via JS transpilation. You're right that it was relegated to only working that way once it was rejected.
The problem was (if the Wikipedia article is to be believed) that developers took offense to one company unilaterally deciding on the future scripting language of the web. A little ironic given that's how JS evolved in the first place, but fair enough. Later, WASM was created instead, with the idea that no particular language had to be chosen (and frozen) as the JS replacement. Dart compilation to WASM followed.
The reason I'm a backend dev at the moment is that I looked at the React model and decided I didn't want anything to do with this insanity.
I've been appalled by how long and how broadly the mass hysteria lasted.
And what's crazy is that with AI, the percentage of apps developed using React compared to all other frameworks has INCREASED over the last 3 years.
It is truly mass hysteria. I would say that 95% of developers, project managers, and CTOs do not truly understand how these systems work under the hood, or at the very least are too scared and comfortable to try other systems. They just repeat the same things they hear and tell each other: "react has a big ecosystem", "react is industry standard", "everyone uses react", "react was developed by facebook", "we will use react", "developers only know react, how could we hire for a different framework?"
In my mind it's clear that the alternatives are massively better. When I visit certain websites I often get a random tingle that they use Svelte, because they're way faster and have better loading and navigation than other sites, and when I check the devtools I'm almost always correct.
I also get the same feeling sometimes when I hit a laggy, slow webapp, open the devtools, and clearly see it's a Next.js app.
A dev team could be trained on Svelte or Vue in literally 3 days, and so long as they actually understand how HTML, JS, and CSS work under the hood, they would increase their speed massively. People just don't want to take the risk.
There's 3x more React libraries and code out there to reference. AI agents do _a lot_ better with React than, say, SolidJS. So productivity > performance is a tradeoff that many companies seem to happily take.
There's so many React libraries because the framew... I mean library is extremely barebones.
That is, in my view, the main issue with React (rendering model notwithstanding): every React application is composed of tens of tiny dependencies and since there are several competing libraries for even basic things like routing, no two projects are the same.
And the end users pay the price!
Umm, are we really taking "being AI friendly" into account when choosing a framework now? I've seen this mentioned often enough lately that it's making me uncomfortable. I fear this will become an epidemic. It would literally halt progress in improving and adopting better frameworks.
As a solo dev who picked SolidJS, yes, it is a big factor. Things like tldraw, node editors, wysiwyg editors, etc. Having to reimplement them is a huge time sink.
> AI agents do _a lot_ better with React
Do you really think this is a solid argument in this particular discussion (about it being slow and introducing a mass of unnecessary complexity)?
Some people need ways to cope with the drawbacks of what they use, even if irrelevant to the conversation.
To add, the other dominant force is Angular, very popular in enterprise settings. I don't understand why per se, it's very... verbose, every component needing a number of files. It (and also React) gets worse if you add a state system like RxJS.
My theory is that it's popular with back-end / OOP / Java / C# developers, because they're used to that kind of boilerplate and the boilerplate / number of files / separation of concerns gives them comfort and makes it "feel" good.
But also, like React, it's easier to find developers for it, and on a higher level this is more important than runtime performance etc. And continuity beats performance.
(I can't speak of its runtime performance or bundle size though)
Angular has been my main framework for the past 9 years, so I guess I can shed some light on this:
AngularJS came before React and, being a Google product, gained enough inertia in large organizations that, given the choice, the decision makers preferred to migrate to Angular 2+, which at the very least had a passing similarity to the previous iteration, instead of jumping to a wholly new framework like React.
The very last AngularJS to Angular 2+ migration I participated in happened in 2020, when it was already known that the former would reach end of life by the end of 2021. That is how reluctant corporations are to rewriting the entire frontend codebase.
Mind you, I've used other frameworks like Vue in an Angular-oriented company, but the project was: greenfield, small and underfunded, so we had some more liberty in choosing what to use.
This whole thing is the problem. AngularJS was released in 2010. If in 2010 I'd known that the damn thing would die in 2021, and that I would have to rewrite it all by that date, I would not have used it in the first place.
I also at some point inherited an app written in Vue 2. By the time I got it, Vue 3, completely different from Vue 2, was already out. Rewriting was not an option, so I had to create a Docker image that can compile the damn thing offline, because if some part of the supply chain breaks, well, that's it.
Ten or eleven years is not a super long time in enterprise software. Having to keep upgrading and changing libraries just because the devs of the libraries get bored should not be a thing.
>> To add, the other dominant force is Angular, very popular in enterprise settings.
I work at a Fortune 200 company. I worked in development for 5 years before moving into accessibility, and I still have good friends working on dev teams here. When I left, people were saying how shitty Angular was and how React was the way of the future.
5 years later and they're rebuilding several enterprise apps that were built with React and going back to Angular. I guess one of the directors, in an all hands team meeting basically said, "The ReactJS experiment is officially OVER."
It's anecdotal evidence, but it seems large projects and teams are going back to Angular for familiarity, ease of debugging, and maintainability - not necessarily speed and performance, which was all that mattered back when I was still in dev circles.
It kind of feels like another changing of the guard in the industry right now.
> ease of debugging
I had thought to start trying to contribute to the Grafana project and it turns out the whole frontend is React and some completely impenetrable flavor of it to boot. How do people work like this? It’s insane.
My current company uses Angular, and uses it reasonably well. Prior to working there, I'd never used Angular, so I feel well-equipped to comment on this.
We've also recently absorbed a vanilla(-ish) JS codebase largely developed by one dev which provides a point of comparison.
Angular has plenty of boilerplate and idiosyncrasy, but it being opinionated and "pattern-y" has advantages when you want devs who infrequently touch it to be able to jump in and make changes with some level of consistency.
Additionally -- and this is anecdotal, but I suspect it's a common part of working with a plain JS codebase -- tracking the flow of data through the Angular application is usually soooo much more straightforward, even when it involves navigating through many layers and files. The Angular codebase only has N types of things, and they tend to relate to each other in the same ways, and they're often quite explicit (such as explicitly defined component element names in HTML templates). In contrast the JS app has whatever it has built up with very few constraints keeping it consistent. Obviously that could be improved with discipline and structure, but reducing that requirement is one of the things a framework gets you.
I can't comment too much on React as my one experience working in a proper React codebase was in a team who were largely as clueless as me :)
My experience of working with Angular (admittedly several years ago when Angular versions 2-4 were current) was that it was pretty close to the React model (which was a huge improvement over the old angular.js), but also slightly worse than it in pretty much every area.
Most of the problems were the framework being too opinionated and not getting out of your way, whereas it is generally easy to "break out" of the abstraction when needed with React.
> It (and also React) gets worse if you add a state system like RxJS.
I don't understand this sentiment.
Client-side applications often have a need for global state. It could be as simple as whether the user is logged in, or as complex as, say, the shapes you have drawn on an Excalidraw canvas and the drawing tool you have selected. State changes over time, and so it seemed plausible to model it as a stream of states over time (hence RxJS). There are more robust ways to deal with the state, which is why Angular has now moved to signals; but what is so bad about modeling state this way?
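The "stream of states over time" idea can be sketched without RxJS in a few lines. This is a from-scratch toy in the spirit of a BehaviorSubject plus a reducer; the names (`stateStream`, `next`) are illustrative, not the RxJS API:

```javascript
// Each update pushes the next state snapshot to every subscriber, and
// late subscribers replay the latest value on subscription.
function stateStream(initial) {
  let state = initial;
  const listeners = [];
  return {
    value: () => state,
    next(reducer) {
      state = reducer(state); // derive the next state from the previous one
      listeners.forEach((fn) => fn(state));
    },
    subscribe(fn) {
      fn(state); // replay the current state to the new subscriber
      listeners.push(fn);
    },
  };
}

const session = stateStream({ loggedIn: false, tool: "select" });
const seen = [];
session.subscribe((s) => seen.push(s.tool));

session.next((s) => ({ ...s, loggedIn: true }));
session.next((s) => ({ ...s, tool: "rectangle" }));
// seen is ["select", "select", "rectangle"]: the subscriber observed the
// state at every point in time, which is the appeal of the model.
```

Every consumer sees a consistent snapshot at each step, which is what makes the stream framing attractive for things like login status or editor tool selection.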
RxJS is far more robust than signals (things like backpressure operators, scheduling operators, and more). Angular's move to signals is just another sign of what Angular was doing wrong with RxJS all along: Angular could have used the robustness of RxJS to make a better framework, but at many decisions along the way it chose not to trust RxJS to do its job, skipping operators for the sake of leaking imperative state back out into the world and/or treating RxJS like heavyweight Promises (and indeed they are heavy and unwieldy in that context). Signals enshrine the imperative leaks as the preferred and final path forward, further crippling the robustness RxJS could have offered, and yet they don't entirely eliminate RxJS as a dependency.
Angular was born broken in how it made use of RxJS. A lot of developers have a bad impression of RxJS simply because Angular led them to so many RxJS worst practices, bad performance, and taught them that RxJS was just "heavy Promises".
RxJS has an intrinsic diamond problem (multiple events fired when several sources update simultaneously), which led to occasional glitches. Signals are guaranteed not to have that problem. This is a strong argument in their favor.
But yeah; I am amazed and amused by how, during the past couple of years, interest in RxJS has plummeted. No more conference talks on the topic. And even though Observables have been released natively, there is no activity in the rxjs repo to update the library accordingly.
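The diamond glitch described above is easy to reproduce with a from-scratch sketch: one source feeds two derived cells, and a naive combine of the two fires once per input, briefly pairing a fresh value with a stale one. Toy code for illustration only; this is not RxJS:

```javascript
// A minimal push-based cell: set() notifies subscribers synchronously.
function cell(initial) {
  let value = initial;
  const subs = [];
  return {
    get: () => value,
    set(v) { value = v; subs.forEach((fn) => fn(v)); },
    subscribe(fn) { subs.push(fn); },
  };
}

const source = cell(1);
const a = cell(source.get() + 1);   // a = source + 1
const b = cell(source.get() * 10);  // b = source * 10
source.subscribe((v) => a.set(v + 1));
source.subscribe((v) => b.set(v * 10));

// Naive combineLatest: emits on EVERY input change.
const log = [];
const combine = () => log.push([a.get(), b.get()]);
a.subscribe(combine);
b.subscribe(combine);

source.set(2);
// log is [[3, 10], [3, 20]]: two emissions for one logical update, and the
// first pairs the new a (3) with the stale b (10). That intermediate pair
// is the glitch; a signal graph schedules recomputation so a consumer only
// ever observes the consistent [3, 20].
```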
As with most RxJS problems, the diamond dependency problem can often be fixed with the right combination of even more advanced operators/combinators. Often the easiest solution is `combineLatest`, but as mentioned that will over-fire depending on your update patterns. Sometimes a simple `throttle` is all the backpressure fix you need to reduce diamond over-firing, especially in combination with a cool scheduler such as `animationFrameScheduler` [0]. However, there are also other domain-specific replacements for `combineLatest` if you know the general event-firing pattern, such as `withLatestFrom`, `zip` [1], `join` [2], or `and/thenDo/when` [3] (in increasing complexity). (ETA: Not to mention that many diamond dependency problems are just fan-out problems that can be solved by refactoring computations to an earlier point in the pipeline, before the fan-out, rather than trying to fan back in.)
> And even though Observables have been released natively, there is no activity in the rxjs repo to update the library accordingly.
I believe you have this reversed: "native" Observables, the WICG Observables proposal [4], is still marked as a Draft Group Report (not even officially a Proposal, still, ouch). The issues in the RxJS repo still seem to indicate that everything is in a holding pattern waiting for that to move.
I still cynically think this is yet another case of the WICG holding the football in front of RxJS and, just as Lucy does to Charlie Brown, pulling it away again right as they try to kick it. The WICG has acted like there is renewed interest in native Observables, but my cynicism says they are just rehashing drafts to appease the Observables crowd long enough to claim more interest in (and easier implementation of) native Signals, and to get a native Signals proposal passed without good Observables interop. (But again, I'm a cynic here.)
[0] https://rxjs.dev/api/index/const/animationFrameScheduler
[1] https://reactivex.io/documentation/operators/zip.html
[2] https://reactivex.io/documentation/operators/join.html
[3] https://reactivex.io/documentation/operators/and-then-when.h...
RXJS is also 18 MB which is irritating in a backend but completely bonkers in a frontend.
The entire npm package is 4.5 MBs uncompressed, and that includes a lot of files that won't be included in a modern bundle. A simple `npx esbuild --bundle ./node_modules/rxjs/dist/esm/index.js --outfile=./vendor/rxjs.js --format=esm` gives me a 141 kb file before compression.
If you are seeing 18 MBs in your bundle from npm you may have more than one copy, or something else wrong.
I will complain that RxJS still doesn't have a proper ESM build in 2026 for even better tree-shaking because that is apparently waiting on the WICG Observables "Proposal" to complete, and cynically I still don't think it is going to happen and I would love real ESM from RxJS someday like yesterday.
> RXJS is also 18 MB
This is impossible. I think you are off by about a factor of 1000.
Like all languages, C#, Java, etc. have cargo-cultist developers who use them badly, in the case of OOP languages doing things like overdoing the separation of concerns.
However, speaking as someone who actually uses C# day in and day out and understands the trade-offs of different levels of (or even no) separation of concerns, it's not done for us to "feel" good.
On very large projects running for a very long time with many developers coming and going with varying levels of competency, an appropriate amount of separation of concerns and decent "plumbing" can be the difference between a code base enduring decades or becoming essentially unmaintainable due to unchecked entropy and complexity.
It is very hard to avoid when you have Vercel doing partnerships with SaaS cloud providers that end up supporting only React and Next.js on their SDKs.
Even if other programming stacks are supported, they tend to be 2nd or 3rd class versus anything React.
Look into the ecosystem of headless products that have jumped into the whole MACH architecture hype cycle.
Then you have to justify to upper management, and partner support when problems arise why you are not using their SDK.
Why on earth would your cloud provider dictate your front end stack? My mind is in tatters trying to understand this thought process.
Because they have these things called SDKs to extend their product, for specific programming languages.
Here is an example, https://www.sanity.io/studio
Want to extend the SaaS product for your backoffice users?
Learn React or make Studio from scratch.
Frontend stacks are also backend stacks in cloud products.
This looks like a SaaS vendor not a cloud provider?
I wrote "SaaS cloud providers", I am yet to see a Saas solution on-prem.
Because with Next.js, Vercel was able to turn the frontend stack into also a really shitty backend stack. And it's particularly shitty at being deployed, so they're in the business of doing that for you.
People have no idea who they are hiring, so they hire people who have no idea what they are doing.
> they actually understand how HTML, JS, and css work under the hood they would increase their speed massively.
Indeed, Svelte is built directly on web standards so knowing it strengthens your command of core JS and the web-platform itself - not just a silo-specific layer like React.
Also Svelte acts as a compiler: it outputs just clean, highly optimized JavaScript with no virtual DOM, that uses native web-APIs and results in minimal runtime overhead, smaller bundle sizes and overall excellent performance and DX.
I managed to avoid learning React long enough to become a manager.
I'm not saying I won't ever end up in engineering and won't ever have to learn it, but at least right now, it feels kinda like I got away with something.
It lasted because React genuinely did introduce a useful new paradigm - of UI being a (mostly) pure function of state. This has actual benefits. Unfortunately the way this was implemented was (very) suboptimal, and now they're locked in.
I agree; this stubborn clutching to that original idea is a shame, given that better options achieving the same paradigm have arisen in the meantime.
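The paradigm in its purest form is tiny: the same state always produces the same markup, so rendering is just recomputing view(state) and letting the framework reconcile the result. A toy sketch, not React:

```javascript
// A pure view function: no hidden state, output depends only on the input.
const view = (state) =>
  `<ul>${state.items.map((item) => `<li>${item}</li>`).join("")}</ul>`;

const markup = view({ items: ["a", "b"] });
// markup === "<ul><li>a</li><li>b</li></ul>"
```

All the implementation complexity in real frameworks lives in efficiently turning one such output into DOM updates; the paradigm itself is this simple.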
Unfortunately it is hard to avoid when doing backend development with SaaS and iPaaS products, where JavaScript is in many cases the only extension language for backend widgets.
I am kind of alright with Next.js, as it is quite similar to using SSR with more traditional backend stacks.
Also at least node's C++ AddOns give me an excuse to occasionally throw C++ into the mix.
I remained unemployed after I was laid off until a change of career happened. I refused to go back to Angular/React/Vue. I was not going to shovel shit anymore for people who were clearly incompetent and really entitled about it.
Where did your career change take you instead?
> Google would have to take the lead and implement this in Chrome, then enough developers would have to build sites using it and force Safari and Firefox to comply. It just isn't feasible.
This is not something you really want to happen for the health of the web tech ecosystem. I am surprised to see actual developers nonchalantly suggesting this. A type system for the web is not worth an IE 2.0
I would want this to happen personally as there is no other way that the tech can advance at this current point.
It's clear that there is no global browser federation that works together to make standards. Every browser does things themselves and implements things differently. So the only way that it is feasible is if the browser that is used by 75%+ of sessions is the one that implements it.
Would I like it to be open source? Of course. Would I like it to be a separate project not directly controlled by Google? Yes. But this does not change the fact that it would ONLY be possible if they accepted it into Chrome.
The alternative is to use JavaScript forever and never advance or change things.
Tech moves quickly and it could be possible that a new browser could take market share in 10 years but it is inconceivable now and would take some groundbreaking shift.
It feels like current browsers can never change their language; change will only come once a new platform becomes standard, like Vision/glasses or a voice AI assistant.
It could have been possible if Huawei had been allowed to stay in the US market, as they are making their own kernel, operating systems, and browsers, but they are now excluded. So only Google and Apple are present to control how things operate.
Over time things must adapt if they become better, there is always pushback when change occurs, but change is necessary for growth.
Almost had this with Dart.
> The JS engines are all relatively fast now.
That's only if you're limiting yourself to laptops. In the smartphone space, the median device (even in the US) is an Android running 2-4 MediaTek cores with a 10-year-old design. On those devices React is unusable: it can take 5-10 seconds just for the initial parsing of all the React code.
Didn't the React team present an exploration of performance?
- https://www.youtube.com/watch?v=uAmRtE52mYk
It showed that it is fast already, faster with the compiler, and could even be way faster than signal based approaches, and even faster if it dropped some side-effects it does. Pulling from memory here.
It's not really an unbiased view. React's team has made very clear that they won't even start considering a different approach, and this (among other things) is just their narrative to support a stance that never seemed under discussion. It borders on gaslighting the community, which still has to deal with at least two giant issues:
1) knowing why something rendered is almost impossible; 2) even for skilled developers who know what they are doing, it's quite easy to introduce performance regressions. In other words, it's not a pit of success performance-wise.
Meanwhile (and this is also never addressed by the React team) if you use other frameworks some issues (for one: effect dependencies) simply are not issues
Why is Vue, Svelte, Qwik etc. faster?
React done poorly can be slow. Bad code can just force the virtual DOM to diff unnecessarily 1000s of times and your app will still work fine for the most part. That said, if you are intelligent about when you force React to reconcile state changes with the DOM it is very efficient.
All of these frameworks' purpose is to synchronize the state of your app with the state of the DOM. There is no getting around that; that is why they exist. Vue uses a virtual DOM with diffing as well; I am not sure how Svelte/Qwik do it, but I imagine they have a similar mechanism? Like you can't get around the fact that at the end of the day all of these frameworks exist to answer: "The data in the user's app updated, what DOM updates do we need to make to reflect that?"
Anyway, that was a tangent, but I think React is perceived as "slow" or buggy because the vDOM & reconciler are so flexible that they allow people to do extremely dumb things while coding and it will still work.
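A concrete example of "extremely dumb things that still work": React decides whether a memoized child can be skipped by shallow-comparing props by reference, so rebuilding an object or callback literal on every parent render silently defeats memoization. A plain-JS sketch of that comparison (roughly what `React.memo` does by default; this is not React's source):

```javascript
// Shallow prop comparison: values are compared with Object.is, i.e. by
// reference for objects and functions.
function shallowEqual(a, b) {
  const ka = Object.keys(a);
  const kb = Object.keys(b);
  if (ka.length !== kb.length) return false;
  return ka.every((k) => Object.is(a[k], b[k]));
}

// Parent "render" that rebuilds its props each time:
const render = () => ({ user: { name: "Ada" }, onClick: () => {} });
const prev = render();
const next = render();
shallowEqual(prev, next); // false: child re-renders though nothing changed

// Hoisting (or memoizing) the values restores reference equality:
const user = { name: "Ada" };
const onClick = () => {};
const stable = () => ({ user, onClick });
shallowEqual(stable(), stable()); // true: child can be skipped
```

Both versions render correctly, which is exactly why the wasteful one survives code review.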
If you don't understand why these are faster then you need to look at the source code and benchmarks.
Qwik is faster because it splits every single reactive element into a loader that only hydrates exactly what is needed when it's used; that's why it's good in low-ping environments but can be very bad in high-ping environments regardless of download speed.
Svelte is faster because it uses signals under the hood, has tons of optimizations, doesn't use a virtual DOM, and has a much better diffing algorithm that only updates what is needed.
Yes, you can make React "fast enough", no one is denying that; it's used by millions of websites. That does not change the fact that all of these other frameworks are faster than it at doing the same operations/diff changes.
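The "signals under the hood" mechanism is small enough to sketch from scratch. Reads register the currently running effect as a subscriber, so an update notifies exactly the code that depends on it, with no tree-wide diff. A toy illustration, not Svelte's (or Solid's) actual implementation:

```javascript
// The effect currently being run, so signal reads know who to subscribe.
let currentEffect = null;

function signal(initial) {
  let value = initial;
  const subscribers = new Set();
  return {
    get() {
      if (currentEffect) subscribers.add(currentEffect); // track the reader
      return value;
    },
    set(next) {
      value = next;
      subscribers.forEach((fn) => fn()); // re-run only dependent effects
    },
  };
}

function effect(fn) {
  currentEffect = fn;
  fn(); // the first run registers its dependencies via get()
  currentEffect = null;
}

const count = signal(0);
let runs = 0;
effect(() => {
  count.get();
  runs++;
});

count.set(1);
count.set(2);
// runs === 3: one initial run plus one per update, and nothing else in the
// app is touched -- this is the fine-grained update model.
```

Real implementations add batching, cleanup of stale subscriptions, and derived/computed values on top, but the core idea is this small.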
Do you consider Angular to have a better rendering system? Or is it similar to React?
Asking because I use Angular and want to learn other frameworks in case Angular is just as bad for long term.
I analyzed all the major frameworks years ago and went with Svelte when Svelte 5 came out. It is all I use now; the development team is highly responsive and cooperative with the community, and the founder, Rich Harris, has a pragmatic approach and tries to keep it as compatible with HTML/JS as possible while still providing great DX and massive efficiency. It natively supports server-side rendering that turns into a SPA once it hits the browser.
Angular is probably more efficient than React, but the DX is going to be worse than Svelte or Vue.
If you want the fastest apps in environments with low ping, use Qwik.
If you want the best overall experience, great customizability, compatibility with any JS package, great efficiency in most situations, and a great developer experience, just go with Svelte (or maybe Vue).
Angular might be fine, I don't know; I never used it extensively. But I do know that Svelte is the only framework that I like using now.
Angular uses directives for rendering, which allows compilers to optimize rendering.
React's model uses plain JavaScript expressions, which are harder to optimize.
Do some research, experiment. Make up your own mind.
Asking people / reading opinions on the internet is part of research though.
I keep enjoying SSR with Java and .NET frameworks as much as possible since the 2000's, no need for Go alone.
React is alright, when packaged as part of Next.js, which basically looks like React, while in practice it is SSR with JavaScript.
> React uses an outdated rendering method that has now been surpassed by many better frameworks. Svelte/Sveltekit, Vue, and Qwik are the best examples.
I strongly disagree with this. Svelte/Solid/Vue all become a twisted mess eventually. And by "eventually" I mean "very very soon".
The idea to use proxies and automatic dependency discovery looks good from the outside, but it easily leads to models that have random hidden interdependencies.
React's rendering model is simplistic ("just render everything and then diff the nodes"), but it's comprehensible and magic-less. Everything is explicit, with only contexts/providers providing any "spooky action at a distance".
And the recent React versions with Suspense neatly add the missing parts for async query/effects integration.
> If you want faster webapps just switch to sveltekit or vue or qwik.
If you want even worse webapps then switch to Vue and forgo being able to ever maintain them.
I've worked on large React and Solid codebases and don't agree at all. You can make a mess of either one if you don't follow good practices. Also, dynamic dependency management is not just a nice-to-have; it's actually critical to why Solid's reactive system is more performant. Take a simple example of a useMemo/createMemo which contains conditional logic based on a reactive variable, in one branch a calculation is done that references a lot of reactive state while the other branch doesn't. In React, the callback will constantly be re-executed when that state changes even if it is not being actively used in the calculation, while this is not the case in Solid, because dependencies are tracked at runtime.
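The branch-dependent behavior described above comes from re-collecting dependencies on every run: a branch that isn't taken contributes no subscriptions. A from-scratch sketch of that mechanism (illustrative only, not Solid's actual implementation; `useHeavy`/`heavy` are made-up names):

```javascript
// The effect currently running; reads register it as a dependent.
let active = null;

function signal(value) {
  const subs = new Set();
  const read = () => {
    if (active) {
      subs.add(active);        // register the running effect
      active.deps.push(subs);  // remember the link so it can be dropped
    }
    return value;
  };
  const write = (next) => {
    value = next;
    [...subs].forEach((e) => e.run());
  };
  return [read, write];
}

function effect(fn) {
  const e = {
    deps: [],
    run() {
      e.deps.forEach((s) => s.delete(e)); // drop deps from the previous run
      e.deps = [];
      active = e;
      fn();
      active = null;
    },
  };
  e.run();
}

const [useHeavy, setUseHeavy] = signal(false);
const [heavy, setHeavy] = signal(0);
let runs = 0;
effect(() => {
  runs++;
  if (useHeavy()) heavy(); // `heavy` is only a dependency when this branch runs
});

setHeavy(1);                    // branch not taken: effect does NOT re-run
const runsWhileInactive = runs; // still 1
setUseHeavy(true);              // re-runs; now tracks `heavy` as well
setHeavy(2);                    // re-runs again
// runs === 3, runsWhileInactive === 1
```

The React equivalent, with dependencies declared statically in a deps array, would re-run the callback on every `heavy` change regardless of the branch.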
> Take a simple example of a useMemo/createMemo which contains conditional logic based on a reactive variable, in one branch a calculation is done that references a lot of reactive state while the other branch doesn't.
Then you create two useMemos and recalculate based on the branch. Or you can cache the calculation in a useRef variable and just not re-do it inside the memo callback. There's also the React Compiler, which attempts to do static analysis for automatic memoization.
> it's actually critical to why Solid's reactive system is more performant.
I've yet to see a large Solid/Vue project that is more performant than React.
> In React, the callback will constantly be re-executed when the state changes even if it is not being actively used in the calculation, while this is not the case in Solid because dependencies are tracked at runtime.
And in Solid the dependency tracker can't track _all_ the dependencies, so you end up using "ref.value" all the time, often leading to code that has a parallel type system just for reactivity. While in React, you just use regular objects.
Vue's model is more constrained: sharing variables to child components, signals to notify parent components. I found this a lot simpler to reason about than React.
Saying that go has good error handling is… bold.
Letting Google implement a new language for the web would probably result in something even worse than JavaScript.
I wouldn't call it good, but Go was designed for applications performing large numbers of concurrent processes. It is really easy to just wrap a block of imperative code in a try/catch, but it becomes much more difficult when you have a bunch of different goroutines communicating over channels, producing outputs at indeterminate times. You need bespoke error handling for every use case; you can't just say "400, try again later".
Python has async and try blocks, what's the problem with having both exactly?
> Letting google implement a new language for the web would probably result in something even worse than javascript.
Why? JavaScript was famously written in ten days, was intended for casual scripting, and has never made up its mind whether it wants to be a Java or a Scheme. What could Google invent that would be worse?
Because Go is a poorly designed language as well, with terrible error handling.
I'm tired of HN complaining about React for dumb reasons. To be clear, performance and bundle size matter for some things - if that's your complaint, whatever, I have no issue. My issue is there is a whole host of use cases for which React is perfect: it has a functional declarative style which favours human understanding and correctness. This is a good thing. This is what the core of modern React gives you with functional components and hooks. You can do a lot with the core these days. Svelte and Vue are inferior for the same reason: React is functional and those other choices are imperative. I could go on, but stop trying to make Svelte a thing, no one is buying.
I come from the native desktop/mobile world more so than from web, but I’d strongly contest the idea that declarative/functional is always better than imperative/OOP.
My experience is that declarative starts getting mind-bendy and awkward past a certain point of complexity. It’s nice when it’s simple enough for everything to fit on a single screen without too much scrolling, but past that you need to start breaking it out into separate files. That’s not too bad initially, but eventually you end up with an infinite-Matryoshka-doll setup which sucks to navigate for anybody who doesn’t know the codebase because they have to jump through 7 files to drill down to the code they’re actually interested in and makes it more difficult to get a thousand-foot view of it all.
Also, the larger the project the more likely it is that you’re going to have to pull off acrobatics to get the desired behavior due to the model not cleanly fitting a number of situations.
Declarative is solid for small-to-medium projects, but for more “serious” desktop class software with complex views and lots of panes and such I’d be reaching for an imperative framework every time. Those require more boilerplate and have their own pitfalls, but scale much better on average for a dev team with a little discipline and proper code hygiene.
I like functional code, but part of the issue with React in that regard is that it likes to hide state from you.
Eg all those providers that are inputs to your component but get hidden away.
It really diminishes the value proposition of functional code because the implicit inputs make the output feel non-deterministic. Eg you can have bugs because of state in a provider that is a transitive dependency of a downstream component. You never pass it through explicitly, so it’s not readily apparent that that state is important and needs to be tested.
I find imperative to be a mess because of bugs in teardown (eg someone added a div, that div isn’t properly removed when re-rendering, problems only appear when there’s 3 or more left over divs). Unless you tear everything down and rebuild it on view change, which sounds pretty functional to me (though probably not pure, but what is outside of Haskell?)
I could see where imperative might be more of a mess on the web compared to other platforms, but there’s probably ways to alleviate that. One that comes to mind is never using bare HTML primitives and only ever using components with proper teardown/management built in, of which could be enforced with some combo of linters and tests.
EDIT: Thinking about this some more, I suspect that intermingling of low level primitives and high level components is the root of a lot of problems on the web. It’s convenient in the moment but the mismatch in models is a recipe for trouble.
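The "components with proper teardown built in" idea can be sketched as a contract where every mount returns a dispose function, so the teardown-bug class described above (leftover divs, leaked timers) can't be forgotten. Toy code, with a plain array standing in for the DOM container:

```javascript
// Imperative component contract: mount returns a dispose function
// that undoes everything mount did.
function mountCounter(container) {
  const el = { tag: 'div', text: '0' }; // stand-in for a DOM node
  container.push(el);
  let n = 0;
  const timer = setInterval(() => { el.text = String(++n); }, 1000);
  return function dispose() {
    clearInterval(timer);                       // no leaked timer
    container.splice(container.indexOf(el), 1); // no leftover div
  };
}

const view = [];
const dispose = mountCounter(view);
console.log(view.length); // 1
dispose();
console.log(view.length); // 0
```

A linter rule requiring that every mount's return value eventually be called is one way to enforce the contract mechanically.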
Alright, I'll bite. What about imperative programming fixes the scaling issues you're describing?
Typically programming scale is regarded as a benefit of FP over imperative programming, as the lack of mutable state results in fewer moving parts to reason about.
Part of it is solved by how longer files are much more readable in imperative setups and can benefit from things like an editor/IDE being more able to jump around within the file since it’s not a big tangled up deeply nested chunk and developers can add things like titled section dividers for the editor to latch onto.
And no language will fix all these woes on its own, but problems take a lot longer to crop up with a disciplined team in an imperative app compared with a declarative one.
FP probably does scale better when we’re talking about the nuts and bolts under the hood, but I don’t believe it’s well suited for UI except in the simplest of cases. There’s too much in that domain that’s at odds with the nature of a declarative/functional approach.
React is not actually functional - hooks are not pure functions, they're magic functions that smuggle state.
The main problem with react is that it's, ironically, not reactive. It greedily re-renders, whereas Vue actually reacts to state change.
React's model is fine, it's just the wrong language. React implemented in Rust for example can be much faster, although you pay the cost differently for wasm <-> js communication.
I've skipped the whole 'modern' web stack. I've stayed SSR first, hrefs/forms/url as the only routing primitives, progressive enhancement, vanilla-JS islands only where absolutely necessary. It's great. Apps never randomly break. No build hell. UX is easy to debug (just look at the HTML). No random performance degradations. No client has ever said it feels dated. Just finished my first app-like PWA, also SSR. Getting great compliments on the UI and slick interactions using just native browser transitions. The vanilla web stack gets better and better! Honestly don't know what people think they are gaining with a heavy frontend.
Likewise, I never liked JS much, nor the frontend dev experience.
I started out with the Seaside framework, but I've done several variations on that theme in different languages along the way.
It goes something like this: A typed server side DOM with support for native callbacks, generates HTML and hooks up callbacks. Changes are submitted to the server similar to traditional HTML forms, but using JSON. Changes to the DOM generate JS that's returned from the submit.
One headache with this approach is that documents need to stick around or callbacks will fail, and you need to hit the same server to get the document.
It should be doable to put a serialized version of the DOM on the client and passing that to callbacks to be rebuilt on the server.
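A toy sketch of that callback scheme (in-memory registry only; a real Seaside-style server would also serialize and expire documents as described above — ids, payloads, and the patch format here are made up for illustration):

```javascript
// Server-side render registers callbacks in memory, embeds only
// their ids in the HTML; a client submit runs the callback and
// gets back JS to patch the page.
const callbacks = new Map();
let nextId = 0;

function registerCallback(fn) {
  const id = String(nextId++);
  callbacks.set(id, fn);
  return id;
}

// Rendering a button hooks up its server-side click handler.
function renderButton(label, onClick) {
  return `<button data-cb="${registerCallback(onClick)}">${label}</button>`;
}

function handleSubmit(cbId, payload) {
  const fn = callbacks.get(cbId);
  if (!fn) return { js: 'location.reload()' }; // document expired
  return fn(payload);
}

let count = 0;
const html = renderButton('inc', () => {
  count += 1;
  return { js: `document.querySelector('#n').textContent = ${count}` };
});
const cbId = html.match(/data-cb="(\d+)"/)[1];
console.log(handleSubmit(cbId, {}).js);
```

The expiry branch is where the "documents need to stick around" headache shows up: once the registry is gone, all the server can do is force a reload.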
> Honestly don't know what people think they are gaining with a heavy frontend.
True, but (I repeat myself here), it depends on what kind of website we are talking about. For instance, a data-heavy SPA that workers use the whole day (like a CRM) is at least perceptually faster and more user friendly compared to the same thing with traditional whole-page reloads. There's plenty of middle ground here; you don't need fancy frameworks to do partial reloads.
I am open for suggestions, but anything wanting to give a desktop like experience is going to be complex. Like the user clicks a button, now widget a1 » a1.3 » a1.3.2 » a1.3.2.2 should be in an "open state", while widget b1 » b1.2 » b1.2.1 needs to be in "disabled state" and widget c3 » c4 » c5 shows a status message.
Sure, and the further you go in that direction, the more you're building a traditional desktop GUI experience, which was always a bad fit for the web.
So yes, if that's really what you want to do, React or similar is probably what you want.
If.
A cogent article. But I think the biggest problem is that the DOM was built for documents, not apps. We know how to build a performant UI architecture: Qt, Java/Swing, Cocoa all have pretty similar architectures and they all ran fine on much poorer hardware than a modern browser on an M1. But unless you use WebAssembly, you can't actually use them on the browser.
When the industry shoehorns something into a tool designed for something else, yeah, performance suffers and you get a lot of framework churn with people trying to figure out how to elegantly cut steaks with spoons.
But most apps are documents; they are built to render data and text fields in a nice way for the consumer to use.
You most certainly shouldn't be building graphs with table elements but JS has canvas and svg which make vectors pretty efficient to render.
The document model provides good accessibility and the ability for things like SEO and GEO to exist.
If you are making a racing simulator, then using HTML in no way makes sense, but for the apps that most of us use documents make sense.
It would be nice if browsers implemented a new interpreted, statically typed language with direct canvas/viewport rendering that was more efficient than JavaScript, but Chrome would need to adopt it, and then developers would need to actually build things with it. It seems like it would currently have to come from within the Chrome team directly, and they are the only ones that can control something like this.
The irony is that CSS works fairly okay for the small number of UI elements and web games that are decidedly not documents. Or perhaps that's not so much irony as filling in the gaps.
Fairly okay is the key phrase.
Everything works fairly okay on modern hardware. I'm sure someone could build a 3d rendering engine using only table elements and css and it would run decently well.
There are hundreds of tools in the belt, people can use any of them to tighten down the screw, but it doesn't mean that they are the most efficient or best to use.
I would also say that a lot of web games are closer to documents than you think. A chess board could be seen as a document, it has tables and rows, the characters are just shaped different than the characters we write with.
Something like a racing sim again could be implemented in css but someone who actually understands how to use canvas is going to have a more efficient way to represent it.
While I don't have the performance bottleneck numbers of React, I don't think it's about Javascript vs. WASM here.
I've seen/built some large Qt/QML applications with lots of JavaScript, and they all performed much better than your average React webapp. That's despite the fact that V8 and the other browser JavaScript engines have JITs while the QML engine didn't.
Comparing QtQuick/QML + JS to HTML + JS - both GPU accelerated scenegraphs, you should get similar performance in both. But in reality it is rarely the case. I suspect it might be the whole document oriented text layout and css rules, along with React using a virtual DOM and a lot of other dependencies to give us an abstraction layer.
I'd love to know more about this from someone who did an in depth profiling of the same/similar apps on something like QtQuick vs. React.
It's not about Javascript vs. WASM; it's the DOM. DOMless apps like Figma are much faster.
The slowness in a React app is not the DOM. Typical apps spend most of their time running component code long before any DOM mutations are committed.
If you look at DOM benchmarks it's extremely fast. Slow web pages come from slow layers on top.
It depends on the app. If the app has many hundreds or even thousands of DOM elements, there's no way to make that work smoothly with React or any other library or framework. Then a canvas based solution can fix things. I know, I've had to fix slow React apps.
You can easily prove if the DOM is a performance bottleneck. DOM performance means one of two things: lookup/access speed and render speed. Render speed is only a DOM concern with regard to quantity of nodes and node layering though, as the visual painting to display is a GPU concern.
To test DOM access speed simply compare processing speed during test automation of a large single page app with all DOM references cached to variables versus the same application with no such caching. I have done this and there is a performance difference, but that performance difference cannot be noticed until other areas of the application are very well optimized for performance.
I have also tested the performance of node layering and it's also not what most people think. To do this I used an application that was like a desktop UI with many windows that can be dragged around, each with their own internal content. I found things slowed down considerably when the page had over 10000 nodes displayed across a hundred or so windows. I found this was slower than equivalent desktop environments outside the browser, but not by much.
Most people seem to form unmeasured opinions of DOM performance that do not hold up under tests. Likewise they also fail to optimize when they have the opportunity to do so. In many cases there is active hostility against optimizations that challenge a favorite framework or code pattern.
> To test DOM access speed simply compare processing speed during test automation of a large single page app with all DOM references cached to variables versus the same application with no such caching. I have done this and there is a performance difference, but that performance difference cannot be noticed until other areas of the application are very well optimized for performance.
This was my instinct. Remember, the author is a hammer. His role is to find performance issues in the frontend. His role is not to find performance bottlenecks in the whole system.
> I think the biggest problem is that the DOM was built for documents, not apps.
The world wide web was invented in 1989, Javascript was released in 1995, and the term "web application" was coined in 1999. In other words: the web has been an application platform for most of its existence. It's wrong to say at this point that any part of it was primarily designed to serve documents, unless you completely ignore all of the design work that has happened for the past 25 years.
Now, whether it was designed well is another issue...
> the biggest problem is that the DOM was built for documents, not apps
I don't see the difference. They're both text and graphics laid out in variable-sized nested containers.
And apps today make use of all the same fancy stuff documents do. Fonts, vector icons, graphics, rounded corners, multilingual text including RTL, drop shadows, layers, transparency, and so forth.
Maybe you think they shouldn't. But they do. Of all the problems with apps in web pages, the DOM feels like the least of it.
> Java/Swing
> performant UI architecture
Not sure if it’s a joke or something.
It is very possible to make lightning-fast React web UIs. DOM sucks, but modern computers are insanely fast, and browsers, insanely optimized. It is also very possible to make sluggish-feeling Qt or Swing applications; I've seen a number.
It mostly takes some thinking about immediate reaction, about "negligibly short" operations introducing non-negligible, noticeable delays. Anything not related to rendering should be made async, and even that should be made as fast as possible. This is to say nothing of avoiding reflows, repeated redraws, etc.
In short, sloppy GUI code feels sluggish, no matter what tools you use.
> but modern computers are insanely fast, and browsers, insanely optimized
I think these facts have been used as excuses to shit up the app layer with slow mal-optimized js code.
An example of a recent high-performance app is Figma, which blows normal JS-only apps out of the water. And it does so by using C++ compiled to WASM plus WebGPU for its more demanding parts, which is most of it
I think we have to let go of the "just get more ram" approach and start optimizing webapp code like figma does
Exactly my point: if we're not sloppy, we can achieve amazing performance.
> I think we have to let go of the "just get more ram" approach and start optimizing webapp code like figma does
I think you're underselling how much work went into making Figma that way. You're talking about a completely different realm of optimization that most companies won't spend money doing, and most web programmers won't know how to do.
As long as there's no incentive to do otherwise, companies will slap together whatever they can and ship as many features as possible.
True. This is Charlie Munger's "show me the incentive and I'll show you the outcome" at play.
I would like more apps to be optimized like Figma. I wonder how incentives could be realigned to get companies to do that.
For most apps React is never going to be an issue when it comes to performance, unless you use it wrong or you use it for something that is not standard, like rendering fractals. It makes sense to analyse performance if your website is meant to reach absolutely everyone, like old smartphones. There's the issue that a single-page app can be bloated from the start, but that's also in the using it wrong category
> modern computers are insanely fast, and browsers, insanely optimized.
You made the reverse point: if you need modern hardware to run a React app with the same performance as a Svelte/Vue/Solid app on low-end hardware, something is fundamentally wrong.
> Anything not related to rendering should be made async,
The thing with DOM interaction is that if you try to make it synchronous then it gets really fucking slow (reflow and friends). So you want it linearized for sanity reasons, but probably not sync.
Not everyone can afford the “modern computers” you are talking about.
This is the first sane comment in this entire comment section. So much nonsense in here, but I guess I should expect that when JavaScript is in the post title.
To me, the main problem is that inevitably, any SPA with dozens of contributors will grow into a multi-megabyte-bundle mess.
Preventing it is extremely hard, because the typical way of developing code is to write a ton of code, add a ton of dependencies, and let the bundler figure it out.
Visualizing the entire codebase in terms of "what imports what" is impossible with tens of thousands of files.
Even when you do splitting via async `import()`, all you need is one PR that imports something in a bad way and it bloats the bundle by hundreds of kilobytes or megabytes, because something that was magically outsourced to an async bundle via the bundler suddenly becomes mandatory in the main bundle via a static import crossing the boundary.
The OP mentions it here:
> It’s often much easier to add things on a top-level component’s context and reuse them throughout the app, than it is to add them only where needed. But doing this means you’re paying the cost before (and whether) you need it.
> It’s much simpler to add something as a synchronous top-level import and force it to be present on every code path, than it is to load it conditionally and deal with the resulting asynchronicity.
> Setting up a bundler to produce one monolithic bundle is trivial, whereas splitting things up per route with shared bundles for the common bits often involves understanding and writing some complex configuration.
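The boundary problem above is easy to show in miniature. Below, `node:zlib` stands in for a heavy dependency like an XLSX or charting library, and actual chunking depends on your bundler; the point is only the static-vs-dynamic import difference:

```javascript
// A static `import { gzipSync } from 'node:zlib'` at the top of any
// file in the entry graph would drag the module into the main chunk.
// A dynamic import() lets bundlers (webpack, Rollup, esbuild) split
// it into a lazily-loaded chunk instead:
async function exportReport(rows) {
  const { gzipSync } = await import('node:zlib'); // loaded on demand
  return gzipSync(Buffer.from(JSON.stringify(rows)));
}

// Nothing from the heavy module is loaded until someone exports:
exportReport([{ id: 1 }]).then((buf) => console.log(buf.length > 0));
```

One PR that re-adds a static import crossing this boundary silently undoes the split, which is exactly why this regresses so easily across hundreds of PRs.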
You can prevent that by having strong mandatory budgets on every PR, which checks that the bundle size did not grow by more than X kB.
But even then, the accumulation of 100s/1000s of PRs each adding 1KB, bloats the bundle enough to become noticeable eventually.
Perf work is thankless work, and having one perf team trying to keep things at bay while there are dozens of teams shipping features at a fast pace is not gonna cut it.
It makes sense to have separate entry points for landing pages, no need for fancy providers and heavy imports.
In practice people are more than willing to wait for an SPA to load if it works well (figma, gmail/gdocs, discord's browser version used to be pretty good too, and then of course there are the horrible counter-examples AWS/GCP control panels, and so on).
Exactly. To avoid dependency explosion, you need at least one of 1) most libraries built internally 2) great discipline across teams 3) significant performance investment/mandate/enforcement (which likely comes from business requirement eg page load time). I have rarely seen that in my limited experience.
My 2 cents: I am not an experienced React dev, but the React compiler came out recently with React 19, which is supposed to do the same thing as Svelte's - eliminate unnecessary DOM modifications by explicitly tracking which components rely on what state - thus making useMemo() unnecessary.
Since the article still references useMemo(), I wonder how up-to-date the rest of the article is.
React has always been tracking what component relies on what state, independent of the compiler. That is one of the reasons the rules of hooks have to exist - React tracks whether a component calls useState() and thus knows the corresponding setState function that manipulates that particular state.
Idk why people claim React is bloat, especially since you can switch to Preact (4kb) most of the time without changes if filesize is an issue for you.
> Idk why people claim React is bloat
Because it's very hard to control its rendering model. And the fact that multi-billion-dollar startups and multi-trillion-dollar companies hiring Ivy League charlatans still have bloated, low-performing websites written in React (that don't even need React...) clearly shows the issues aren't that trivial.
> React has always been tracking what component relies on what state, independent of the compiler.
This still needs complete parsing and analysis of your components and expressions, which is why no single high-performing UI library can avoid directives.
As someone that has spent many days on performance, I will tell you that bundle size has minimal impact on performance. In most cases the backend dominates any performance discussion.
Apps are slow because of missing caching, naive pagination, poorly written API call waterfalls, missing db indexes, a bad data model, the database choice, or a slow serverless environment. An extra 1MB of bundle adds maybe 100ms to the one-time initial load of the app, which can be mitigated trivially with code splitting.
I didn't mention bundle size at all. We're focusing on rendering.
You can write slow, unperformant code in whatever framework and language you like. React is not particularly slow, nor particularly bloated, compared to the competition.
Because people make shitty React apps and people think React is slow because of it.
It's definitely not slow here. It's within 20% of Svelte. That's not nothing, but it's not the huge monster people claim it is
https://krausest.github.io/js-framework-benchmark/2025/table...
When 99% of React apps (websites tbh, with very few needs for complex reactivity...) including those built by multi billion/trillion $ companies, struggle to produce non-bloated, non-slow, accessible websites, you know that the library is simply too complex to tame.
Of course React can be performant, but still, you're getting a PhD in its internals and hooks to get what you have in vue/nuxt/svelte whatever out of the box.
It does not matter what framework you chose, if you carelessly throw money at something it is not guaranteed to be good or fast. A framework doesn't replace company culture.
Plus, in my experience, PhDs do not make good developers - they often fall into the trap of thinking too abstract vs. being pragmatic.
I say all of this as someone that would rather use Svelte, and doesn't really care what's going on in the React ecosystem. This is purely about its raw performance.
> you're getting a PhD in its internals and hooks to get what you have in vue/nuxt/svelte whatever out of the box.
You can look at React, Svelte, and Vue benchmarks here: https://krausest.github.io/js-framework-benchmark/2025/table...
Outside of Swap Rows, React is like 15% slower than Vue, 30% slower than Svelte. That's not nothing, but it's definitely not as catastrophic as you're implying. And these aren't hyper optimized benchmarks, these are pretty naive implementations.
You know what is actually slow? Alpine JS. It's slow as fuck. But people talk about it like it's some super lean framework.
> When 99% of React apps (websites tbh, with very few needs for complex reactivity...) including those built by multi billion/trillion $ companies, struggle to produce non-bloated, non-slow, accessible websites, you know that the library is simply too complex to tame.
Ignoring the nonsense 99% figure, I don't think entire sites should be React. I am talking about its rendering because that's what all of your comments are about. Not whether an entire site should be generated with just JS.
I can't explain to you what tomfoolery billion-dollar companies conduct to get such garbage. I can tell you it's not unique to React. Apple Music is dreadful, and it's Svelte. There were plenty of slow jQuery websites. You are seeing more bad React websites because you are seeing more React websites in general. It's the PHP problem.
useMemo is for preserving calculated values across renders; renders are generally caused by state changes. React always tries to track state changes, but complicated states (represented by deep objects) can cause recalculation too often - not sure if React 19 improves this, don't think so from reading the documentation.
on edit: often useMemo is used by devs to cover up mistakes in rendering architecture. I believe React 19 improves stuff so you would not use useMemo, but not sure if it actually makes useMemo not at all useful.
React compiler adds useMemo everywhere, even to your returned templates. It makes useMemo the most common hook in your codebase, and thus very necessary. Just not as necessary to write manually.
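What that inserted memoization conceptually does can be sketched in plain JS. This is not React's actual implementation (React stores these cells per hook slot internally), just the recompute-only-when-deps-change idea:

```javascript
// A memo cell: recompute only when the dependency list changes,
// compared element-wise with Object.is (as useMemo does).
function createMemoCell() {
  let deps, value;
  return (compute, nextDeps) => {
    const changed =
      !deps ||
      deps.length !== nextDeps.length ||
      deps.some((d, i) => !Object.is(d, nextDeps[i]));
    if (changed) {
      deps = nextDeps;
      value = compute();
    }
    return value;
  };
}

const memo = createMemoCell();
let calls = 0;
const expensive = (n) => { calls++; return n * 2; };

memo(() => expensive(2), [2]); // computes
memo(() => expensive(2), [2]); // same deps: cache hit
console.log(calls); // 1
memo(() => expensive(3), [3]); // deps changed: recomputes
console.log(calls); // 2
```

The compiler's job is essentially to generate the dependency lists for cells like this so you don't have to write them by hand.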
> I’ll focus on React and Redux in some of my examples since that is what I have the most experience with, but much of this applies to other frameworks and to JS-heavy approaches in general.
That's not a fair assumption. Frameworks like Svelte, Solid, Vue etc have smaller bundle sizes and rendering speeds that approach the baseline vanilla-js cost.
I'm all for criticising Javascript, but moving everything to the server isn't a real solution either. Instead of slow React renders (50ms?), every interaction is a client-server round trip. The user pays the cost of the paradigm on each interaction instead of upfront with an initial JS payload. Etc.
Yeah, this article is only about React. But it makes sense that someone would think this way, because many devs think JS web apps == React.
The problem is React is "good enough" for most cases, and the performance degradations happen slowly enough that the devs/project leads don't see it until it's too late and they are already too invested in their project; switching would be too complicated/costly for them.
Svelte/kit and properly optimized packages solve almost all of the "problems" this article tries to bring up.
To be fair though, even with Svelte et al bundle sizes can balloon quickly when using UI toolkits and other libraries.
Yes, but this again has nothing to do with Svelte.
This is just how Node works: it's not a compiled language, and you are not tree-shaking all the components from libraries, so this is bound to occur.
You can solve this by making your own libraries or just copying the individual source code for specific components into your project instead of including the whole package.
With SvelteKit (and maybe Next.js as well) you are only serving the necessary files to the client instead of including the full library source code.
> Instead of slow React renders (50ms?), every interaction is a client-server round trip.
This is true only if you use zero javascript, which isn't what the article is advocating for (and even with zero javascript there's quite a bit of client-side interactivity built-in to CSS and HTML). Besides: in practice most interactions in an SPA also involve network round trips, in addition to that slow react render.
Plus Redux is horrible for performance, slows things down and overcomplicates everything.
I never understood why state management is overcomplicated in React. For some reason most people use something like Redux which has a very unintuitive API. There are other state management packages available that are so much easier to use and understand, like MobX.
To shine a light on the mystery: before React had (a) hooks and (b) a stable context API and (c) tanstack-query/react-query or GraphQL, state handling WAS a mess. That's when Redux/MobX etc. made more sense. Try to build something with a pre-2019 version of React and you will understand the need.
Redux is an implementation of the elm architecture (Tea) which is used for UI state in a lot of languages and frameworks. JS/TS is just not a very ergonomic language for it so it becomes painful quickly.
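The Elm architecture the parent mentions boils down to a pure reducer folding messages into state, plus a dispatch loop. A minimal sketch (close to, but not exactly, Redux's actual createStore API):

```javascript
// A pure reducer: (state, message) -> new state, no mutation.
function reducer(state, msg) {
  switch (msg.type) {
    case 'increment': return { ...state, count: state.count + 1 };
    case 'reset':     return { ...state, count: 0 };
    default:          return state;
  }
}

// The store just applies the reducer and notifies subscribers.
function createStore(reduce, initial) {
  let state = initial;
  const listeners = [];
  return {
    getState: () => state,
    dispatch: (msg) => {
      state = reduce(state, msg);
      listeners.forEach((l) => l(state));
    },
    subscribe: (l) => listeners.push(l),
  };
}

const store = createStore(reducer, { count: 0 });
store.dispatch({ type: 'increment' });
console.log(store.getState().count); // 1
```

In Elm the compiler enforces that every message type is handled; in JS/TS you only get that discipline by convention (or exhaustiveness checks on a TS union), which is part of why it feels less ergonomic.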
Is this 2019?
I think digressions about React dilute the message. Ok, we get it, react bad; but what is the actionable message here? What are the robust alternatives? There is a section on how changes to react (the introduction of hooks and of the concurrent mode) necessitated significant code changes almost amounting to rewrites; but which alternatives, other than vanilla web platform, can promise long-term stability? Which alternatives are better suited for which kinds of applications (roughly, an e-commerce site vs excalidraw)?
I’ve been very happy using SvelteKit for some side projects. At this point I wouldn’t label myself as frontend developer anymore, but when done right, it’s probably the closest thing to modern, performant and reactive that I know of. It also works really well if the JS just breaks.
Github was usable and fast, now it is slow. Guess what changed...
Acquisitionbombed by Microsoft! Instead of usability and performance it has AI now!
It's crazy slow. But it's also a closed source, Microsoft platform for Open Source, so it belongs in the trash anyway.
It was re-written in React and performance was annihilated. The React Virus annexed another victim and we have one more zombie web-site.
Yeah, but some project or program manager sure got a raise for delivering this nonsense "update".
That was 100% a project sold to upper management as a benefit and modernization.
They mentioned React is missing timing information in the devtools profiler. This was actually added in React 19.2 which should be helpful for debugging.
https://react.dev/blog/2025/10/01/react-19-2#performance-tra...
I am the first to admit that SSR has worked fine since PHP. But there are two points I like to make:
1) choose boring technology; 2) it is the library, stupid
First question should be: what type of web application are we talking about? If your blog does not render or your e-commerce site stops working without JavaScript, shame on you indeed.
If we are talking about SPAs, aka those replacing traditional desktop applications, and you want to throw out React, then we have something to discuss. Because, how are you going to replace MUI/MUI(x)? I keep hearing about the coolness of alternatives, and I am certainly open to it, but in the end it can't just be better in some aspects or even fundamentals while completely lacking in practical matters. A simple benchmark I keep going back to is to see if $hotness is able to replace [1], which shows you just one component of many in a cohesive system.
I don't contend with the principles in the article, but I do with conclusions like "ripping out react would be the solution".
This seems to be more about React than Javascript (or indirectly about the DOM). React sitting on top a browser-style DOM would be slow in any language, while Javascript itself can be surprisingly fast, especially when sticking to the right subset. It "only" needs a big and complex JS engine to get to that kind of performance.
As an aside, I like their use of blockquotes+details/summary blocks for inserting "afterthoughts/addendums".
It's a nice touch, and it works pretty well. Partly because, as a design choice, it forces you to add the afterthought between paragraphs, so the interruption in reading flow is minimal.
For websites I use regularly, I can identify those which render server side because I trust that when they load on a slow connection my state is still there. When I see a loading icon on heavy client side JS "apps" I anticipate something breaking and me having to reload and re-enter whatever I was working on.
I am so grateful to the author for writing this article. For years I've been fighting a series of small battles with my peers who seem hell-bent on "upgrading" our e-commerce websites by rewriting them in React or another modern framework.
I've held the line, firm in my belief that there is truly no compelling reason for a shopping website to be turned into an SPA.
It's been difficult at times. The hype of new and shiny tools is real. Like the article mentions, a lot of devs don't even know that there is another way to build things for the web. They don't understand that it's not normal to push megabytes of JavaScript to users' browsers, or that displaying some text on a page doesn't have to start with `<React><App/></React>`.
That's terrifying to me.
Articles like this give me hope that no, I'm not losing my mind. Once the current framework fads eventually die out - as they always do - the core web technologies will remain.
I think shopping gets this in spades too because not all shopping sites are meant to be particularly sticky.
It's one thing to browse the catalog at my leisure on gigabit networking, a 5k display and 16 CPU cores. It's another thing when I'm standing in Macy's or Home Depot and they don't quite have the thing I thought they have and I'm on my phone trying to figure out if I can drive half a mile to your store and get it. If you want to poach that sale your site better be fast, rather than sticky.
Let them have React and SSR everything? Didn't the NYT do this?
I've gone all SSR (server-side render) with JSX using Astro or Elysia. If I need frontend logic, I just sprinkle in a little Htmx or roll my own inline js function. It makes debugging 1000% easier and the pagerank scores are usually amazing out of the box.
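The "sprinkle in a little Htmx" bit can be as small as this (the endpoint and element IDs here are made up for illustration): the server returns an HTML fragment, and htmx swaps it into the page on click, no client-side framework needed.

```html
<!-- Hypothetical endpoint /fragments/cart-count returns a snippet of
     HTML; htmx fetches it and swaps it into #cart-count. -->
<button hx-get="/fragments/cart-count"
        hx-target="#cart-count"
        hx-swap="innerHTML">
  Refresh cart
</button>
<span id="cart-count">3 items</span>
```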
JavaScript isn't good for performance? Could've told you that 20 years ago.
agree w the long-term perf point, and i'd add: js-heavy apps also tend to hide the 'true' critical path (hydration + data + layout) so teams ship regressions without noticing. do you have a rule of thumb for when a page has crossed the line (tti/cls budgets, max js kB, etc)? and do you see islands/partial hydration as a real middle ground or just a temporary patch?
"Now’s a good time to figure out whether your client-side application should have a server-side aspect to it, to speed up initial renders."
My how the tables have turned!
everything old is new again
This seems like it's more about React. Javascript is fine. React is garbage
In other news: water is wet. I genuinely don't understand how anyone is still pretending otherwise. Server-side rendering is so much easier to deliver in a performant way, yet it feels like it's being increasingly forgotten — or worse, actively dismissed as outdated. Out of convenience, more and more developers keep pushing logic and rendering onto the client, as if the browser were an infinitely capable runtime. The result is exactly what this article describes: bloated bundles, fragile performance, and an endless cycle of optimization that never quite sticks.
Server-rendered HTML, HTML-fragment endpoints, and jQuery's .load() were always the sweet spot for me - McMaster-Carr[0] does the same thing behind the scenes and utterly destroys every "modern" webapp in existence today. Why did everything have to become so hard?
Pure client-side rendering is the only way to get max speed with the lowest latency possible. With SSR you always have bigger payloads or extra network round trips.
You're literally downloading a bunch of stuff first just to do a first paint, versus just being sent already built and styled HTML/CSS. Not only is client-only rendering technically slower, it's also perceptibly slower.
That’s a laughable claim. SSR is objectively faster, since the client does nearly zero work other than downloading some assets. If the responses are pre-computed and sitting in server memory waiting for a request to come along, no client side rendering technique can possibly beat that.
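The "pre-computed responses sitting in server memory" idea can be sketched in a few lines (all names here are hypothetical): render a page once, keep the HTML string in a map, and serve the cached copy for every later request.

```javascript
// Hypothetical sketch: render each page once, keep the finished HTML in
// memory, and answer subsequent requests with a plain map lookup.
const pageCache = new Map();

function servePage(slug, render) {
  if (!pageCache.has(slug)) {
    pageCache.set(slug, render(slug));
  }
  return pageCache.get(slug);
}

// Usage: the render function only ever runs once per slug.
let renders = 0;
const html = (slug) => { renders++; return `<h1>${slug}</h1>`; };
console.log(servePage('home', html)); // <h1>home</h1>
console.log(servePage('home', html)); // <h1>home</h1> (cached)
console.log(renders); // 1
```

On a cache hit the client receives finished HTML, so the only remaining costs are the network transfer and the browser's parse/paint.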
Of course there are cases where SSR makes sense, but servers are slow; the network is slow; going back and forth is slow. The browser on modern hardware, however, is very fast. Much faster than the "CPU"s you can get for a reasonable price from data centers/colos. And they're mostly idle and have a ton of memory. Letting them do the work beats SSR. And since the logic must necessarily be the same in both cases, there's no advantage to be gotten there.
If your argument is that having the client do all the work to assemble the DOM is cheaper for you under the constraints you outlined then that is a good argument.
My argument is that I can always get a faster time to paint than you if I have a good cluster of back end services doing all that work instead of offloading it all to the client (which will then round trip back to your “slow servers over a slow network”) anyway to get all the data.
If you don’t care about time to paint under already high client-side load, then just ship another JS app, absolutely. But what you’re describing is how you deliver something as unsatisfying as the current GitHub.com experience.
Still living in the early 2000s, eh? Pretty much all interactive, responsive apps are 100% client-side rendered. Your claim about SSR being objectively faster looks like a personal vendetta against client-side rendered apps. Or JavaScript. Happy days!
It was faster then and it’s still faster now. Of course, you’d have to learn how a computer works to know that I’m right, but that would be a bridge too far for JavaScript casuals! Just add one more library bro! Then you’ll be able to tell if a number is even or odd!
> objectively faster
> provides zero evidence
Some pretty compelling evidence is history: we had dynamic and interactive web pages 20 years ago that were faster on computers that were an order of magnitude slower.
I don’t really need to provide “evidence”. I told you why SSR is faster and tbh idc if your time to paint is trash.
People in this thread hating on React seem to miss the crucial point that 2016 React was a godsend compared to just about every other option available. Vue only picked up steam quite a bit later, as React became more and more bloated or had its development pushed down a certain road by Vercel. Angular was THE framework to avoid working on, and people hated having to define multiple files for a simple component. The timing for React was just right back in the day.
Given how FEs are re-written every 7-10 years there is ample room for other frameworks to knock React off its throne, but before asking "but why not X" you also have to consider that organizations by now have almost a decade of experience building React apps and this plays a major role when deciding on which UI framework to rely on.
Wasn't WebAssembly going to change all this?
Somehow that hasn't really been achieved.
DOM access is not in Wasm.
This is well known. You can have full reactivity without being JS heavy. https://www.youtube.com/watch?v=8W6Lr1hRgXo is a perfect example
Only a single passing mention of web components?
> You can use isolated JS scripts, or other approaches like progressively-enhanced web components
How would one use "progressively enhanced" web components? Maybe I misunderstand the intention behind this statement, but web components are either supported or not. There doesn't seem to be some kind of progression.
Given custom elements are pretty widely supported by browsers now, I assume you are referring to js being turned off.
In terms of designing for that situation - you can follow a pattern where your custom element wraps ( <custom-ele><stdelement></></> ) the element you want to enhance. If js is turned off, then the custom element defaults to rendering its contents.
https://simonwillison.net/2022/Apr/21/web-components-as-prog...
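A sketch of that wrapper pattern (the element name here is made up): with js off, the plain button inside still renders and works as ordinary HTML; with js on, the custom element upgrades it with extra behavior.

```html
<copy-link><button>Copy link</button></copy-link>

<script>
  // Hypothetical <copy-link> element: it enhances the plain button it
  // wraps. With JS disabled, the custom element is just an unknown tag
  // and the browser falls back to rendering the button inside it.
  customElements.define('copy-link', class extends HTMLElement {
    connectedCallback() {
      const button = this.querySelector('button');
      if (!button) return;
      button.addEventListener('click', () =>
        navigator.clipboard.writeText(location.href));
    }
  });
</script>
```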
Yep, that's the ideal approach for decent browsers. A curious caveat is that IE 8 and below will interpret that example HTML as <custom-ele></><stdelement></> (i.e. as siblings, not parent and child) and therefore not apply any component-scoped styles. Not ideal.
Of course nobody uses those browsers anymore, the same caveat applies to non-custom HTML5 elements, and the bad behavior has long been preventable with JavaScript [0]. But anyone (else) with an extreme backwards compatibility mindset might consider if they could instead bootstrap from <div class="custom-ele"><stdelement></></> and (if needed and in window) a coordinating MutationObserver.
[0] https://web.archive.org/web/20091031083341/http://diveintoht...
Wonderful article. I do maintenance programming, and many of the problems you mention with typical React apps are also code-maintenance nightmares. Managing a large number of fast-moving third-party dependencies will destroy your developer budget, but devs can't see it because they're "Best Practices".
Why would performance be a goal?
If a native desktop app crashes, your users will gnash their teeth and curse your name. A web app? They shrug and refresh the page. They have been trained to do this for decades now, why would anyone believe this isn't intended behavior?
No one has a reasonable expectation of quality when it comes to the web.
> No one has a reasonable expectation of quality when it comes to the web.
And that's exactly why I want native desktop apps to make a resurgence.
Eh, this argument falls apart for many reasons:
- His main example of bloated client-side dependencies is moment.js, which has been deprecated for five years in favor of smaller libraries and native APIs, and whose principal functionality (the manipulation and display of the user's date/time) isn't possible on the server anyway.
- There's an underlying assumption that server-side code is inherently good, performant, and well crafted. There are footguns in every single language and framework and library ever (he works for WordPress, he should know).
- He's right to point out the pain of React memoization, but the Compiler now does this for you and better than you ever could manually
- Larger bundle sizes are unfortunate, but they're not the main cause of performance issues. That'd be images and video sizes, especially if poorly optimized, which easily and immediately dwarf bundle downloads; and slow database queries, which affect server-side code just as much as browser-side code.
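On the moment.js point above, the built-in Intl API does cover much of what that library was typically pulled in for. A sketch, assuming a fixed UTC date and the `en-US` locale:

```javascript
// Locale-aware date formatting with no library at all.
const date = new Date(Date.UTC(2025, 0, 15));

const formatted = new Intl.DateTimeFormat('en-US', {
  year: 'numeric', month: 'long', day: 'numeric', timeZone: 'UTC',
}).format(date);

console.log(formatted); // January 15, 2025

// Relative times ("3 days ago") are native too:
const rtf = new Intl.RelativeTimeFormat('en-US', { numeric: 'auto' });
console.log(rtf.format(-3, 'day')); // 3 days ago
```

And because Intl runs in the user's browser, it sees the user's actual locale and time zone, which is exactly the part that can't be done on the server.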
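For readers outside React land, the memoization being automated in the point above is essentially this idea (a plain-JS sketch, not React's actual API): re-run a computation only when its inputs change, otherwise return the cached result.

```javascript
// Hypothetical single-slot memoizer. React's useMemo and the Compiler
// apply the same principle per computed value inside a component.
function memoizeOne(fn) {
  let lastArgs = null;
  let lastResult;
  return (...args) => {
    const same = lastArgs !== null &&
      args.length === lastArgs.length &&
      args.every((a, i) => Object.is(a, lastArgs[i]));
    if (!same) {
      lastArgs = args;
      lastResult = fn(...args);
    }
    return lastResult;
  };
}

// Usage: the expensive function runs once for repeated identical inputs.
let calls = 0;
const expensive = memoizeOne((n) => { calls++; return n * n; });
console.log(expensive(4), expensive(4), calls); // 16 16 1
```

What the Compiler does for you is decide where to apply this, which is the part humans get wrong by hand.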
> There's an underlying assumption that server-side code is inherently good, performant, and well crafted.
I didn't read it that way. I believe the underlying assumption is that the server-side code won't run in a power-constrained computer, thus having more performance headroom.
> There's an underlying assumption that server-side code is inherently good, performant, and well crafted
To me it’s an assumption that server side code is going to be running on a server. Which is a known quantity and can be profiled to the nth degree. It’s extremely difficult to profile every possible device your site will run on, which is crucial with low powered mobile devices.
> Larger bundle sizes are unfortunate, but they're not the main cause of performance issues. That'd be images and video sizes
Not really, no. Large bundle sizes prevent the initialisation of the app, which means the user can’t do anything. By comparison images and videos download asynchronously and get processed on a separate thread. JS bundles also need to be parsed after being downloaded; if you pair a crappy Android phone with a 5G connection, the parsing can literally take longer than the download.
> Larger bundle sizes are unfortunate, but they're not the main cause of performance issues. That'd be images and video sizes, especially if poorly optimized, which easily and immediately dwarf bundle downloads; and slow database queries, which affect server-side code just as much as browser-side code.
In network terms JS tends to be downloaded at a higher priority than both images and video so has a larger impact on content appearing
JS is also the primary main thread blocker for most apps / pages… profile any NextJS app and you’ll see what a horrendous impact NextJS has on the main thread and the visitor’s experience
Interop has entered the chat.
Our webapp is nearly instant, and it's built on raw React with some sprinkling of Tanstack (their Local Collection DB is a masterpiece).
And our stack is intentionally client-heavy. We proactively synchronize all the relevant data and keep it in IndexedDB, with cross-tab coordination. The server is used only for mutating operations and for some actions that necessarily require server-side logic.
The issue with dependencies and the package size is valid, but it's also really not a big deal for performance unless you go WAAAY overboard. Or use crappy dependencies (Clerk, I'm looking at you, 2.5 MB for an auth library!?!?).
As for hydration, I found that it often slows down things. It's a valid approach when you're doing query-based apps, but when you have all the data locally, it just makes little sense. It also adds a ton of complexity, because now your code has to automagically run in multiple environments. I don't think it can really ever work reliably and safely.
I was never a big React fan myself. As someone who has used a lot of different JavaScript frameworks over many years, I can say confidently that it's not the best JS framework; especially nowadays due to bloat.
Yet it's better than anything available in any other programming language, on any other platform in existence.
Never bet against JavaScript. People have done this over and over and over again since it was invented. So many haters. It's like every junior dev was born a JavaScript hater and spends most of their career slowly working themselves to a state of 'tolerating JavaScript'.
JS was designed to be future-proof and it kept improving over the years; what happened is it improved much faster than people were able to adjust their emotions about it... These people even preached the JS hate to a whole generation of juniors; some of whom never even experienced JavaScript first-hand. It's been cool to hate JavaScript since I can remember.
JavaScript does have some bad parts, as does any other programming language, but the good parts of JS are better than anything else in existence. People keep trying to build abstractions on top (e.g. TypeScript) to try to distance people from JavaScript, but they keep coming back over and over again... And these people will never admit to themselves that maybe the reason they keep coming back to JavaScript is because it's pretty darn great. Not perfect, but great nonetheless.
It's hilarious that now we have WebAssembly, meaning you can compile any language to run in the browser... but almost nobody is doing that. Do people realize how much work was required to bring WebAssembly to the browser? Everyone knows it exists, it's there for you to use, but nobody is using it! What does that say? Oh, it's because of the bundle size? Come on! Look at React, React bundles are so bloated! But they are heavily used! The excuse that JavaScript's success is a result of its browser monopoly is gone!
Enough is enough! The fact is; you probably love JavaScript but you're just too weak to admit it! I think the problem is that non-JS developers have got a big mouth and small hands...
> but nobody is using it! What does that say?
It’s impossible to replace JS with WebAssembly because all state-mutating functions (DOM tree manipulation and events, WebGL rendering, all other IO) are unavailable to WebAssembly. They expect people to do all that using JavaScript glue.
Pretty sure if WebAssembly were designed to replace JS instead of merely supplementing it, we would have little JS left on the web.
> but the good parts of JS are better than anything else in existence
What are you talking about?! I can't think of a single thing in Js that I could say is good.
Okay, two big corporations have invested a lot of money and effort into making V8 and TypeScript, and now it's useful. But I don't consider it exactly part of Js.
You lost me at Typescript. Typescript is great not because it abstracts away any javascript functionality (it doesn't), but because it allows IDE integrations (mainly LSP) to better understand your code, enabling go-to-definition, hover docs, autocomplete, semantic highlighting, code actions, inline error messages, etc.
But I agree many people are jumping on the javascript hate train without really understanding the modern web landscape.
Do people really hate JavaScript, or do they just hate the design choices and results that it seems to be correlated with?
At the end of the day I’m using SaaS tools that are apparently written in React and I get astounded by how slow and heavy they are. If you are editing a page on our company's cloud-based wiki, I’ve seen my Chrome RAM balloon from 3GB to 16GB. A mistake was made somewhere, that I know.
Try this: https://mapjoy.app/, username and pass johnsmith.
All of JS with client side routing, blazing fast.
React is utterly amazing.
I’m still baffled why this language is the universal browser standard. There should be hundreds competing.
I'm still baffled why you don't see that JavaScript itself is fast; how the page is rendered is the issue.
I’m baffled you don’t believe in benchmarks, but, uh.
Speaking of benchmarks:
Node vs Lua shows Node faster than Lua. https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
Node vs C, only 4 to 5 times slower. https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
Considering its roots, it is not too bad for a JIT-compiled language.
“Only” 5 times slower? Imagine if every website that takes 10 seconds to load instead took 2 seconds…
You know that CPU-bound performance would account for a really small fraction of those strawman 10 seconds spent loading a page, right?
If you think 10 seconds is a strawman, clearly you haven’t used sites like GitHub on a cheap device.
Case in point: GitHub is server-side rendered. I don't think it uses any client-side framework.
Try opening it with JavaScript disabled and you will see exactly how much is server-side rendered.
But that's not the javascript, it's how it's used (react or whatever)
You can write terrible non-performant code in assembly and C, and perfectly speedy pages in html and javascript (say the page you're on at the moment).
You wouldn't write a new cutting edge video codec in javascript - not one you wanted to use anyway - but given that we had performant guis running on 286s at 12MHz, Javascript being 1/10th the speed of C isn't the problem.
> “Only” 5 times slower? Imagine if every website that takes 10 seconds to load instead took 2 seconds…
All you need to know about the level of knowledge of average JS hater/muh native performance.
life finds a way ?
Lost me at "React is a Framework" assertion. The key difference between a "framework" and a "library" is the inversion of control that exists in a framework.
React is a library - your app still maintains control of application state and drives the main workings of the application. It is just simply using the React library to render that application state.
Fully agree. Inversion of Control is not something which alone defines a framework. Along the architectural axis, React is architecturally unopinionated, thus it has library-level scope but framework-like control semantics. If framework = anything that uses inversion of control, then a lot of things suddenly become frameworks, including things nobody calls frameworks. One can call it a "rendering framework", but calling it a "web-application framework" is not factually correct.