As a web developer, I am often confronted with the dilemma of how to handle users without JavaScript. Should I just ignore them? That depends. Should I stop using JavaScript completely? Of course not. Should I say 'screw you' to users without JS? Certainly not!

I've seen many websites which, when visited without JavaScript enabled, confront you with an error page bearing messages that practically insult the user, such as:
"Come back when you have JavaScript enabled - It's 2014!" or "Try installing a modern browser."
Oftentimes these are sites that could easily remain functional without any scripts. JavaScript was created as a tool to enhance the browsing experience by adding behavioral and interactive features. When did we reach the point where the lack of this additional functionality means the entire process should fail? That's like your car refusing to start because you disabled the radio and the air conditioner.

So should I care about users who have disabled JavaScript in their browsers? The answer is not a simple one. To address it, we should begin by answering the following questions:

  • What is a user?
  • Why do some users choose to disable JavaScript?
  • Why might JavaScript be unavailable even to users who have not disabled it?
  • How should a web page be designed to accommodate all users?

What is a user?

This may seem like a stupid question, but it's fundamental if one wants to fully understand web design and development. Most developers and designers would probably agree with the definition of 'user' I found in the Oxford dictionary and on Wikipedia: "a person who uses or operates something", but I prefer Merriam-Webster's definition: "a person or thing that uses something". Note that the key difference between the two definitions is the inclusion of 'thing' in the list of what can be a 'user'.

People visiting websites are typically the only users we ever care about when designing a website, but that overlooks a very important user: the crawler. A crawler is a user too! We can't guarantee that crawlers will interpret JavaScript at all, and yet we very often design pages with navigation and links handled solely through JavaScript and AJAX calls. Even worse, I've seen many sites which flat out refuse to serve content to users with JavaScript disabled. Not a very good user experience for a crawler.

Why do some users choose to disable JavaScript?

According to a study from Yahoo, somewhere between one and two percent of users do not have JavaScript enabled. While this may seem like a very small percentage, if you have a high-traffic site it can still be significant. Putting crawlers (users who do not view websites in browsers) aside for the time being, there are many reasons why some users might choose to disable JavaScript.

Performance:
Fancy JavaScript-enhanced visuals and frequent AJAX requests can have a major impact on how quickly a page loads and on how many requests it makes. A mobile user may want to limit bandwidth usage and the number of requests made because of slow connections and data caps. On my phone, JavaScript and image loading were both disabled by default! As the computing capabilities of modern computers advanced, we began to offload more of the processing onto users to reduce the load on our servers, but with the rise of smartphones we have to keep in mind that many of our users could be on devices with very weak hardware and poor internet speeds.

Usability and Accessibility:
Some people say that the lack of JavaScript support in screen readers (software which allows blind users to browse the web) is a common reason for users not to have JavaScript enabled. Based on studies which have been conducted on the topic, however, I find this to be a myth. Nevertheless, it should be considered as a possibility.

Along with usability comes the fact that many people simply dislike JavaScript, or rather the ways it gets used: loading ads, popups, moving and flashing content, and so on. Because of these annoyances, some users selectively enable scripts to support only the particular features they want.

Security:
JavaScript can be very dangerous. For one, it opens up possibilities for XSS, CSRF, and session-hijacking attacks. Additionally, adware combines well with scripts to track your activity, and JavaScript can be used to gather saved form data (e.g. credit card numbers, emails, addresses, etc.). Most users who have add-ons such as NoScript disable scripts globally and selectively allow scripts only from trusted sites.

Why might JavaScript be unavailable even to users who have not disabled it?

Whether or not you're willing to acknowledge any validity in the reasons for choosing to disable scripts, some users simply never receive your scripts, or the scripts fail to execute, even though scripting is enabled in their browsers.

Consider this simple code:

alert(1);                      // runs as expected
if (x[3] == 1) { alert(3); }   // ReferenceError: x is not defined
alert(2);                      // never reached; the error aborts the script

The variable x does not exist, yet we try to access it, which causes an error that breaks this script: alert(2) will never be executed. When your site has a lot of very complicated scripting, it's easy to miss something, or a user may encounter an error that never appeared in your own browser. These errors can prevent scripts from executing, which can render the site unusable if no fallback is implemented.
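One way to limit the damage is to isolate each enhancement so a failure in one doesn't take down everything after it. A minimal sketch, where riskyEnhancement stands in for any hypothetical optional feature:

try {
    riskyEnhancement();   // hypothetical optional feature that may throw
} catch (e) {
    // Swallow the error: the page keeps working without the enhancement.
}
alert(2);                 // still runs even if the enhancement failed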

Now consider another possibility. Here is a simple experiment:

test.html:

<script src='http://localhost/test.js'></script>

test.js:

alert(1);

When visiting http://localhost/test.html, you get an alert box as expected. Now try https://localhost/test.html (note the https). This time nothing happens. If you look at your console, you should see something like:

Blocked loading mixed active content "http://localhost/test.js"


or

[blocked] The page at 'https://localhost/test.html' was loaded over HTTPS, but ran insecure content from 'http://localhost/test.js': this content should also be loaded over HTTPS.

This is because of a relatively new security feature (enabled by default since Firefox 23) which prevents HTTPS pages from loading active content over plain HTTP. It is, of course, easily repaired by not specifying the protocol when including the script; however, this change to the default settings by browser makers happened suddenly enough that many websites had problems because of it. We can't count on the environment in which users view our pages to stay consistent.
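For example, a protocol-relative URL makes the browser fetch the script over whatever scheme the page itself was loaded with:

<script src='//localhost/test.js'></script>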

In addition to these possibilities, scripts can also be blocked by ad-blocking browser add-ons. Due to overly broad filters, or changes to the blocking filters introduced by updates, legitimate scripts can be mistaken for ads and prevented from loading.
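To give a hypothetical example, a filter rule matching 'ad' in file names could silently block an innocently named script:

<script src='/js/ad-rotator.js'></script>  <!-- may match an ad-blocker filter and never load -->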

How should a web page be designed to accommodate all users?

Now that we have some understanding of who our users are and why they might not be reaping the benefits of JavaScript, we need to decide what to do about it. When you search online for information about how to handle noscript users, you find a lot of misguided comments such as:

"It's 2013. If someone has js disabled, they know their web experience is going to suck. Not just on your site, but everywhere. It's less than 1% of users, so don't worry about it. You can/should basically assume js is enabled."

It makes me feel old to say it, but I don't remember the Internet sucking back before we relied so heavily on JavaScript; why does it have to now?

"I think sacrificing functionality for 99% of users to accommodate 1% is sheer bloody mindedness."

Fortunately, we don't have to. We can design our sites using a technique called Progressive Enhancement. Progressive Enhancement is a strategy for web design which uses "web technologies in a layered fashion that allows everyone to access the basic content and functionality of a web page, using any browser or Internet connection, while also providing an enhanced version of the page to those with more advanced browser software or greater bandwidth". Basically, you should first design your website so that it provides all fundamental functionality, and then add layers on top of that to enhance the experience (such as AJAX requests and overlays).

For example, navigation buttons should have real links behind them in the markup, with JavaScript layered on top to change the default behavior to use AJAX requests instead, as sketched below. This way any user can navigate the page regardless of disabled scripts or errors. This improves not only your user experience but also your SEO (a side effect of improved UX for crawlers).
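A minimal sketch of that layering (the /news URL and the content element are assumptions for illustration): the link works entirely on its own, and the script, when it runs, upgrades clicks into AJAX requests.

<a href='/news' id='news-link'>News</a>

<script>
// Enhancement layer: if this script fails or never loads,
// the link above still performs a normal page navigation.
document.getElementById('news-link').addEventListener('click', function (e) {
    e.preventDefault();                      // intercept only when JS actually works
    var xhr = new XMLHttpRequest();
    xhr.open('GET', this.href);
    xhr.onload = function () {
        // Assumption: the server returns the main content for this URL
        document.getElementById('content').innerHTML = xhr.responseText;
    };
    xhr.send();
});
</script>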

But remember, at the beginning of this article I answered "Should I just ignore them?" with "That depends". On what? If your page is, for example, a news website, then you should not ignore those users. If your page is an online game, then JavaScript might be a core tool necessary for the application to function at all. In this second case you have no choice but to require JavaScript. But that doesn't mean you should insult your users with messages telling them to get with the 21st century. Just inform them of the need for scripts and politely ask them to enable scripting.
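A polite fallback can be as simple as a noscript block (the wording here is only an example):

<noscript>
    <p>This game requires JavaScript to run. Please enable it for this site and reload the page.</p>
</noscript>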

Additionally, don't abuse scripts. There's no reason to use scripts to style your page when we have CSS for exactly that purpose. As Abraham Maslow said, "I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail." We have a variety of tools at our disposal, and each has its purpose.
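To illustrate with a classic case, a hover effect needs no script at all:

<!-- Scripted styling: fragile and unnecessary -->
<button onmouseover="this.style.background='#ddd'"
        onmouseout="this.style.background=''">Menu</button>

<!-- The same effect with CSS alone -->
<style>
    button:hover { background: #ddd; }
</style>
<button>Menu</button>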

Sources:
Progressive Enhancement
Why do people disable javascript?
How many users have JavaScript disabled?
Screen Reader Survey