I am one of those people.
I personally found web development frustrating until I discovered the classical architecture of the web and fell in love with HTML and modern CSS.
But, you might say, as many have before: HTML is great, but I want to build a web application, not a website.
The delineation between a website and a web application (if there really is one) is not sufficient to determine which technology we should use.
I think if we take the time to think about it, we can agree that there are valid use cases for many different approaches. If I am dealing with a blog or a site with a lot of documentation, it makes sense to render HTML statically so that we can get the benefits of static routing and caching and so much more. This is the reason Jamstack, the “new” way of building websites, is rising in popularity.
And on the other end of the spectrum, there are a lot of use cases where it makes a lot of sense to pull business logic into the client. I would be the last to suggest that it would be a good idea to build an SVG editor in a browser using only server-side rendered HTML.
The majority of applications are not that extreme and fall somewhere in the middle of this spectrum. In that situation, we as software developers decide on a specific technology stack not only based on the business requirements, but also based on what we know and what we are comfortable with.
To be clear: I find it perfectly legitimate to choose a technology stack based on personal preference, as long as we do not endanger the success of the product by doing so. What bothers me is when I have the feeling that someone is arguing that their choice is based on technological superiority when in reality it is just “I like doing things this way”.
I’m cognizant of my bias.