
Rendering on the client makes a lot more sense. By sending the user a template followed by the data necessary to fill it out, you only send the structural code once. If the user views 100 pages, that's 99 times you haven't had to send the HTML required to display the content. That's good news from a bandwidth point of view, an environmental point of view, and, if you're clever about how you code it, a rendering speed point of view, because you only have to replace the content in some DOM nodes rather than repainting the entire template. In some situations that can display faster than swapping in server-rendered content (particularly if the changes are small text elements and you have a virtual DOM available, e.g. React).
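To make that concrete, here's a minimal sketch of the idea (the element IDs and the /api/stories endpoint are made up):

    // The structural HTML is already in the DOM from the first load;
    // viewing another story only costs a small JSON fetch.
    async function showStory(id) {
      const res = await fetch('/api/stories/' + id);  // hypothetical endpoint
      const story = await res.json();
      // Swap only the text nodes; the template itself is never re-sent.
      document.getElementById('story-title').textContent = story.title;
      document.getElementById('story-body').textContent = story.body;
    }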

If you're doing something very complicated and your users have old or underpowered devices then rendering on the server is sensible, but on the modern web it generally isn't necessary.




> That's good news from a bandwidth point of view, an environmental point of view, and, if you're clever about how you code it, a rendering speed point of view because you only have to replace the content in some DOM nodes rather than repainting the entire template.

It's just intuition on my part, but I think the environmentally friendly option is probably actually sending a couple extra packets (server-side rendering) vs cooking the CPUs of however many clients you have with JS (client-side rendering).

Also, I've never experienced an application where parsing a blob of JSON and manipulating the innerText and values of HTMLElements with JavaScript was more efficient than setting innerHTML.


> It's just intuition on my part, but I think the environmentally friendly option is probably actually sending a couple extra packets (server-side rendering) vs cooking the CPUs of however many clients you have with JS (client-side rendering).

The difference isn't between sending a couple of extra packets versus rendering on the client. It's between rendering on the server PLUS sending a couple of extra packets versus rendering on the client. You have to do the rendering somewhere. Assuming the code that does the rendering is essentially the same whether it's server-side or client-side, the only difference is sending the extra structural layout data when you render on the server. For most sites it'll make no real difference but if you're at Facebook scale I'd guess a couple of extra packets really adds up, especially considering you could cache the client-side templates between sessions.


I should have mentioned the assumption you'd be caching the render on the server. My bad.

So I imagined it as 1 render vs potentially thousands.

Another thing to note: you'd (probably) have to try real hard to find a server-side templating solution that would render as slowly and use as much CPU as JavaScript on the client side.


There's no magic bullet. As soon as your template references user-specific (private) data, the caching advantage almost disappears, because each user requires their own cached rendering. You might as well cache it on the client and save the round-trip, and avoid any additional network/server performance issues.

Rendering public, read-only content such as a tweet, however, makes total sense to cache on the server.


JSON is still rendered. It's not "render or send JSON", it's "render a string that contains formatted display output A, or a string that contains structured data B".

Facebook probably requires a lot of per-user effort, true. A lot of sites (GitHub, for example) probably don't. It's probably limited to a couple of areas of the header generally, or an A-or-B conditional render for an ownership page.

There are always exceptions to every rule, though.


I don't think I agree. It's quite possible to make a single-page application with server-side rendering. Just create API endpoints on the server for rendering partials, and then the client-side router/controller need only be capable of replacing pre-rendered sections of pages with new ones. If these partials are caching-friendly, then you save a lot of processing on both the front end and the back end. Sure, rendered partials might be more expensive over the wire than JSON, but that's pretty negligible.
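As a rough sketch (the /partials URL scheme and data attribute are invented), the client side of that pattern can be this thin:

    // Fetch a server-rendered HTML fragment and swap it into the page;
    // all templating stays on the server.
    async function loadPartial(name) {
      const res = await fetch('/partials/' + name);  // hypothetical endpoint
      const html = await res.text();
      document.querySelector('[data-partial="' + name + '"]').innerHTML = html;
    }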


I've done this. I've done Angular. I've done Ajax calls to templates on the client side. I've done Knockout.

Give me client side data binding (2 way!) any day of the week vs sending the partial back from the server side. You invariably end up with a bunch of shitty glue code that's dealing with the actions AND the display.

Frameworks (Angular, Ember, KO, Backbone) all allow you to separate this spaghetti nightmare into manageable chunks.


In enterprise Java there are quite a few tools/frameworks designed to do this. While I don't particularly like the one they use at my company (a bloated, quirky IBM code-gen tool), it makes me see the merits of this approach (everything is basically a single page app & sections of the page are only replaced as necessary -- it is a Portal solution). Between that & seeing the JSF/PrimeFaces components my friend is always messing around with, I see the breakdown as 2 camps really -- people who like to write js vs. people who don't...

For the people who do, pushing data & logic into the client is fun: working with js, working on the actual "single-page app" & updating logic/css/html simultaneously.

For people who are more into static typing / web back-end, & don't particularly like the process of writing js, there is a nice movement toward dynamic pages where the updates are concocted through data-binding & behind-the-scenes ajax (generated by the component libs...). For a lot of enterprise Java devs, the dream of having an ultimate single source of domain logic in the Java layer somewhere lends itself well to this solution. You code typical OO paradigm and just try to integrate those models naturally into the front-end binding.


I like writing JS just fine (okay, maybe not as much as I like writing Scala). And I think that doing a good server-rendered experience would likely require something at least as sophisticated as Backbone. But I just think JS should be concerned mostly with managing client-side UI and continuity.

For the project I'm working on, pages are generally composed of large blocks of content, and to the extent that there is SPA going on, it's mostly swapping these blocks. I can definitely see the advantage of client-side rendering when model data tends to be spread out across the whole page. But there's no reason one can't have their cake and eat it too.


This is basically what I do sometimes in Rails apps backing some light jQuery frontend. They are traditional web apps with a few pages doing many front end tasks (think about adding, editing, deleting entries in an address book). They are done by sending $.ajax calls to the server. Some of them get JSON responses but some of them just get HTML and replace the DOM of some elements. That HTML is rendered using the very same partials used to generate the page when it was loaded first (think about the <tr>s in that address book's <table>). This is DRY. Obviously having all the templates in the client would be also DRY but I don't want to have some templates in the client and some on the server. It's either all or nothing and, at least for me, it's just easier to do some of those things on the server.
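For concreteness, the glue for that address book amounts to something like this (the route and selectors are illustrative):

    // The server renders the same row partial it used on the initial
    // page load and responds with a <tr> fragment.
    $.ajax({
      url: '/contacts',  // hypothetical Rails route
      method: 'POST',
      data: $('#new-contact-form').serialize(),
      dataType: 'html'
    }).done(function (rowHtml) {
      // Append the server-rendered row; no client-side template needed.
      $('#address-book tbody').append(rowHtml);
    });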


Doing these things on the server also allows you to take advantage of server-side caching. For some applications, you cannot do this as they are too tightly tied to the current user. However, most applications have at least some sections that can be rendered and cached across user segments.


Does this require clients of your server to all support HTML so that they can screen-scrape partials to understand the results of their operations, or does your server support two output formats?

Two output formats is a reasonable solution, but one that you don't have to maintain if you go with a framework (such as Angular) that can render JSON back into HTML. It really boils down to whether you want to support a more complicated server or a more complicated client (or drop support for any clients that aren't yours).


It's a traditional web app, so it works with HTML. We don't plan to have anything else any time soon so the priority was on the time to have the HTML ready to go.

I did a Rails app serving JSON exactly one year ago, to a small Angular front end. It was a little disheartening to see an empty page loading and then making JSON calls, and thinking that by then a server-rendered page would have already fully loaded. That's why I said that it's all or nothing: either a single-page app or many html pages. A few single-page-app pages don't do well, unless they are preloaded with server-rendered content.


Sending incomplete pages means those of us that don't run javascript[1] don't actually see your site. First impressions matter, and when your page is:

    SomeSite

    {{storyTitle}}

    {{storyBody}}
    {{curPage}} of {{numPages}}
...the common interpretation is "broken site". This current fad of being too lazy to implement progressive enhancement is a regression. Rendering on the server so you serve up an actual page is trivial, and you can still provide javascript that loads the next pages faster (see the sketch below). Serving up only a template (or worse: an empty body tag) is insane.
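That enhancement layer can be thin; a minimal sketch (the link class and container id are made up), where without JS the links still work as ordinary navigation:

    // Intercept next-page links and swap in just the story content;
    // non-JS users get normal full-page loads instead.
    document.addEventListener('click', function (e) {
      var link = e.target.closest('a.next-page');  // hypothetical class
      if (!link) return;
      e.preventDefault();
      fetch(link.href)
        .then(function (r) { return r.text(); })
        .then(function (html) {
          var doc = new DOMParser().parseFromString(html, 'text/html');
          document.querySelector('#story').innerHTML =
            doc.querySelector('#story').innerHTML;
          history.pushState(null, '', link.href);
        })
        .catch(function () { location.assign(link.href); });  // fall back to a full load
    });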

The usual counter, that "javascript is always available", not only ignores the risks; I suspect the claim is also based on bad data. How do you know how many people disable javascript? We aren't going to show up in most analytics...

[1] for the numerous security and privacy reasons. Running arbitrary instructions in a Turing complete language is a bottomless pit of problems, and "analytics" is still spyware. Google shouldn't get to build a log of every page we visit.


> How do you know how many people disable javascript?

In this day and age, most businesses don't care about this type of user. I have no sympathy for those who intentionally cripple the web and don't care to cater to them. You aren't worth it; progressive enhancement isn't worth the effort. It's cheaper to presume Javascript and ignore users like you altogether; don't forget this is business, our motives are profit, not doing things "right".


So, so misguided. What happens when your javascript request fails? Or a broken build throws a script error? Progressive enhancement makes sure your content is accessible no matter what the conditions are.


> what happens when your javascript request fails?

You hit 'refresh'

> Or a broken build throws a script error?

The same thing that happens when a broken build returns a 500: the user can't use that service until the developer fixes it.

Progressive enhancement is a theoretically good idea that, in practice, actually adds a lot of overhead for developers (because every layer of progression is its own UI, with its own user experience and considerations).


"You hit 'refresh'"

_You_ don't, the customer does.

Would you care to name the sites you work on so I can stay away?


"Me" in this context is the customer. I'm talking about pages I see in the wild; when pages can't talk to their backends, they throw a "We crashed; please refresh" dialog, and I do what I'm told. ;)

Pages I write generally try their best to wrap calls that can fail in a reasonable retry envelope with some intelligent discernment of what response codes can be retried (429, the occasional 420 if someone thought Twitter was cute, the VERY occasional 500 if I just happen to know that the service in question is flaky) and only failing the error back to the user if the client can't retry it. In contrast to the non-JavaScript forms-only sites I've used, which tend to just surface their 429s and 500s straight to the user and expect them to know what a "back" button is (and whether it's safe to resend a form in this context), it's a better user experience.

(Incidentally: I do find myself having to re-invent that "retry envelope with a success handler, fail handler, and filter to determine if the response should be retried" boilerplate over and over as I move among frameworks; if anyone's built a smooth request wrapper for that, it'd be nice to have.)
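In case it helps, the shape I keep re-writing is roughly this (the retry policy numbers are arbitrary):

    // A generic retry envelope: retry while shouldRetry approves the
    // status code, with doubling backoff, then fail back to the caller.
    function fetchWithRetry(url, opts, shouldRetry, attempts = 3, delay = 500) {
      return fetch(url, opts).then(function (res) {
        if (res.ok) return res;
        if (attempts <= 1 || !shouldRetry(res.status)) {
          return Promise.reject(res);  // surface the failure
        }
        return new Promise(function (r) { setTimeout(r, delay); })
          .then(function () {
            return fetchWithRetry(url, opts, shouldRetry, attempts - 1, delay * 2);
          });
      });
    }

    // Usage: retry 429s and (knowingly flaky) 500s only.
    fetchWithRetry('/api/data', {}, function (s) { return s === 429 || s === 500; })
      .then(function (r) { return r.json(); })
      .then(console.log, console.error);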


What happens when the server-side templating code fails? That doesn't seem to be an issue with client vs server side HTML generation.


The administrator gets a report immediately. Also, the server environment is much more under your control. Been there, done that; the author has a lot of good points. Partial server rendering is rock solid and so much faster. Not looking back.


I use the noscript tag to tell people with JS disabled that the site requires JS, end of interaction. Enable js or go away, I have more important things to do than bother with people who intentionally break their browsers. They aren't the target market and aren't worth catering to.

This is business, not academia, right and wrong are judged by profit and loss and opportunity cost, not by what is ideal given unlimited resources. Work on feature X or double my work so a few people a day who break their browsers can still use the site... one is practical, the other is not.


I actually would like to get HN's take on this. Maybe someone should submit an HN poll. I'm still of the opinion disabling javascript is an extreme measure and those that do it need to come to terms with whatever broken internet experience they get. While I do think it's worth it to display to the enduser something like "Looks like javascript isn't enabled; you'll need to turn it on for this site", is it truly reasonable to spend development time to make your site functional without client-side javascript? Probably depends on your audience. I bet sites that are more commonly accessed through Tor have people more likely to have javascript turned off.

But yeah, anyone here work webdev where your webapp is expected to fully work without client-side javascript?


I've worked on projects where that was the expectation; I've set that expectation for projects.

The reasoning has both been concrete/practical and philosophical (but still practical):

* Mobile processing time is still costly in terms of battery life and performance. And the number of http calls (and their latency) also makes a difference in performance; SPAs tend to have smaller requests but larger numbers of them and it seems to me that's actually the opposite profile of what 3/4G cellular networks are good at. (And while this is all less true on the desktop I'm starting to find it annoying that we're nevertheless finding ways to make things choppy and slow on 2 GHz machines with operations not more complex than scrolling).

* This is more vague, but I find there's a discipline imposed in starting the conception of the app in terms of plain HTML/HTTP that seems to keep things better organized, while projects that start with a focus on a rich/heavy UI devolve into overspecific yet mixed concerns more quickly. This doesn't work for everything, since some apps just aren't about resources and media types. But honestly, your app probably is. :)

* Being able to debug/autotest with something like curl is pretty nice.


Nope. To me complaining that an app doesn't work with JS disabled is like complaining that the layout is busted with CSS disabled. It's not clear to me why I should worry about this case as a developer. You can turn shit off if you want to but don't complain that it doesn't work now.


I don't make that many web apps, mostly websites for clients, and I worry over IE8/9 users more than I do about people who turn their javascript off. It's a non-issue for us. If they have javascript turned off at such an aggressive level then they are used to things being broken all over the place.


I think it is not about web apps; it's more about casual "one-time visit" browsing. There is usually zero incentive to enable JS for a random website where you just want to read an article or look at images. JS there is usually (my impression) used for things that distract or try to lure me into other content. I like distraction-free consumption. There is a high incentive to enable JS for a dynamic web app or a site you regularly visit (eg HN), though.

Good browser controls for JS and other requests make this quite convenient for an (expert) user. I like uMatrix a lot.


When I'm developing a website, if I have time to do things the right way, I do my best to present the user with a usable website both without CSS and without JS. Good semantic markup usually provides a good user interface. And, as crawlers usually don't like JS, making a non-JS version helps them crawl the site anyway.


I browse with NoScript. Most sites are usable without JavaScript, maybe not as good looking, but who cares. A few sites are not usable and I don't like them. If I really care about their content I enable some of their scripts. After a while, one gets good at recognizing the obvious candidates to enable and the obvious tracking scripts to keep disabled. This and that if I want to see videos (maybe I don't want), this and that if I want to hear audio, not that one because it's the ad script, etc. If I don't really care, I go somewhere else. The same content is usually available elsewhere.


What would be better is the ability to selectively enable individual scripts. I want the javascript for your webpage to run normally, but I don't want the javascript for that ad that moves around the screen, or for that tracker bug, to run at all. This was much easier when ads were mostly in Flash.


Just FYI, Firefox removed the setting to disable JavaScript from its preferences dialog a while ago. Now the only way to disable it is through about:config. This means that, to a second or third approximation, 100% of Firefox users will have JS enabled. I expect other browsers to follow suit. JS is now a standard part of the web. Additionally, remember that JS is not required to track you. An invisible GIF works just as well to know which pages you visited.

BTW, how do you feel about apps running on your smartphone? Is that much different? Have you seen forecast.io? Well, probably not, since you don't run JS, but check it out. Some people develop fantastic mobile apps in JS instead of Objective-C/Java, and with that approach there is no option to render things server-side.


I install noscript on every firefox install that I can. Most people like that it makes a lot of sites run significantly faster.

I don't use a smartphone - they are a pathological platform entrenched firmly on the wrong side of the War On General Purpose Computing.

For the record: I do run some javascript - on a carefully selected whitelist basis. A big part of my point is that a website that works is something that I might whitelist to access better features. The problem is showing a broken page instead of showing a basic page and progressively enhancing in the fancier features.


I hope you don't drive a car either: it has many computers which are even further on the wrong side of the War On General Purpose Computing. How do people even come up with the idea that anything with a CPU inside should be able to do General Purpose Computing :(


> Sending incomplete pages means those of us that don't run javascript[1] don't actually see your site.

Sure, and that's a choice that every website owner has to make. Building a working no-JS app is non-trivial for all but the simplest things. Increasingly, businesses I've worked with have found that "doesn't allow Javascript" is shorthand for "won't buy things online or share useful data due to security worries", so they're paying less and less attention to your needs. Things I build fall back to a simple no-JS version that prompts the user to phone in orders or turn on JS. I would expect that to become the norm over the next 2 or 3 years.


Unfortunately, you are in the vast minority of people, probably even on Hacker News (I would guess less than 1% of the overall population [1]).

So the decision has to be made, just like whether you want to support IE7 users, whether or not you want to put in the extra time to support those edge cases.

And as always, it depends on the type of site you are running. If it's Amazon, that 1% matters a shit ton. If it's a side project or SaaS startup for example, it makes sense to hold off on supporting those 1% in favor of more pressing features.

[1] https://gds.blog.gov.uk/2013/10/21/how-many-people-are-missi...


>Serving up only a template (or worse: an empty body tag) is insane.

No, it's rational if you're serving a web application instead of a web document. Most pages written with AngularJS are web applications.


You are almost alone, dude. Javascript-enabled users are the norm. People don't care about "running arbitrary instructions in a Turing complete language"; they care about buying things and watching videos faster.


I think it's reasonable to assume that either JavaScript is available or the user has made the conscious decision to turn off JS; if so, that user is probably savvy enough to realize the issue and decide whether to enable JS for the particular site.

Far less reasonable is faulting a JS framework for assuming it can use JS.

Still, I agree that if JS isn't essential to the functionality of the site, a non-JS fallback should be available.


> Google shouldn't get to build a log of every page we visit.

Why not block Google at your router?


    $ cat /etc/hosts | grep google-analytics
    0.0.0.0 google-analytics.com
    0.0.0.0 www.google-analytics.com
    0.0.0.0 ssl.google-analytics.com
Unfortunately, the problem is dynamic; any blacklist is always going to be outdated. A whitelist approach is the only blocking method that works. Javascript spyware has gotten a lot worse in the last ~year. A mainstream news site I happened to test recently wanted to issue HTTP requests to no less than 34 unique hosts, just to render a typical static news article. And that wasn't the ads (those were already blocked by Adblock Edge).


I think you should distinguish between web applications and web sites. It is nice when content sites work without js; an application really doesn't have to.


Not at all. Even when you are serving up an "application", sending a broken page isn't a good idea. I've written a couple of very heavyweight "web apps" myself (for in-house use) and even those are careful to always send full pages, even though page-rewriting was used most of the time. Given that this was easy to do in rails, I fail to see why rendering the template on the server during the first request is hard. It's pure laziness by these newer frameworks.

If "app" means some pure-javascript game or similar, the very least you can do is provide a proper page that indicates that the game requires javascript. A warning message conveys useful information - a borken template or empty body tag conveys "bugged website".


People who intentionally disable javascript in the year 2015 should also disable css, then close their browsers, then remove themselves from the network, then turn off their computers, then go live in a cave.


> If the user views 100 pages

Most sites are not going to have users viewing 100 pages within a single session / before the cache expires. If you can reuse a template that many times then SPAs make sense certainly. But for many sites (especially content focused ones) the math is harder than that.


Enterprise developer here (well mostly). I see this in most of our apps -- the same "template" is called hundreds of times in a session. An accountant auditing invoices is a perfect example.

Now, for a random public website? Depends on the content. In the blogs I've administered, I feel that first-page rendering speed is KEY to keeping traffic (i.e. statically cache the html). However, new visitors, when given a quick way to get to new articles with a minimal amount of friction, tend to stay on the site longer.


I don't know, it still feels alien to get a response from the server with an incomplete page and then wait for it to be completely rendered. https://groups.google.com is a good example of this.


I wouldn't be at all surprised if the loading problems with Google Groups are a legacy of Deja News code still floating about in there somewhere. There's no good reason why a single page app has to look horrible while it's loading data - a well written app should display a nice looking holding page that gracefully adds content as it's received over the wire.


The previous (post-Deja) iteration of Google Groups was much better than the current one, which makes me think it isn't a Deja-inherited issue.


I agree with you. I hate hate hate having every single operation on a website take 1-2 seconds in good conditions, and 5-6 seconds in average conditions. I've developed a habit of tab-clicking every hyperlink I think I might need, just to navigate a little faster. Having the browser act like an application, and then having that application query data from the server, fixes the performance problem... well, at the least it makes the performance problem fixable.

The problem is, just because the client should download the application to work with your server does NOT mean that the application has to be written in an insanely complicated framework with a year-long spin-up time and constantly changing semantics that make it impossible to debug anything and obscure the performance of your application. Those two things are different.


I'm sure rendering on the server makes sense in some cases, but it's not as simple as you think. If you have a million users looking at the same content, you can render the template once on the server or a million times on the clients. Obviously, that's not always the case, but with things like ESI [1] the content does not have to be exactly the same to take advantage of caching (rendering the template once).

Also, keep in mind that content is compressed so the actual transfer time is often negligible, and the JS libraries could be bigger than the total html for the session.

[1] http://en.wikipedia.org/wiki/Edge_Side_Includes


Then what do you think about Twitter switching back to server-side rendering? [1]

[1] https://blog.twitter.com/2012/improving-performance-on-twitt...


That kind of makes sense. How often do you view Tweets on a desktop? Even when I'm actively using a desktop browser, I usually check Tweets on my phone. Why? Because that's where I get notified first. Fits the "underpowered devices" category, right?


I suppose the focus should be on who your users are, and what kind of devices they have, to help one decide between client-side and server-side rendering.

Reminds me of this article by Joel Spolsky: http://www.joelonsoftware.com/articles/FiveWorlds.html


The advantage of server-side rendering is only on the first load of the page. Once everything is cached on the client, client-side rendering is way more efficient. You only need to fetch the data you need. And if you have server-side push, all the better.


Did you ever compare the gzipped JSON you're sending vs the gzipped HTML rendered from the same JSON blob?

You'd be surprised: the difference between the two will be less than 1%.
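It's easy enough to measure yourself; a quick Node sketch with toy data (the payloads are made up):

    // Compare wire sizes of the same content sent as JSON vs as
    // rendered HTML, both gzipped.
    const zlib = require('zlib');

    const data = { title: 'Hello', body: 'lorem ipsum '.repeat(200) };
    const json = JSON.stringify(data);
    const html = '<article><h1>' + data.title + '</h1><p>' +
                 data.body + '</p></article>';

    console.log('gzipped JSON:', zlib.gzipSync(json).length, 'bytes');
    console.log('gzipped HTML:', zlib.gzipSync(html).length, 'bytes');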


Another thing people tend to overlook about server-side rendering is the battery usage on mobile from keeping the radios alive for things that would otherwise not require it. Mobile OSes go to great pains to keep battery use low by turning off hardware when not in use, but constant round-trips for server-side views defeat that optimization.


You still have a round-trip. The question is whether using Angular can significantly reduce the bandwidth required. In my (limited) experience with any major js rendering on the client, the answer is most likely no, it won't.


And yet we are able to scale browser rendering performance just by sending plain old HTML + CSS.



