I browse the web with cookies off by default. This is a pretty uncommon configuration, I believe, but one that is explicitly supported by browsers and web servers.
I whitelist them only on sites where I wish to persist data, such as a login. All other cookies attempting to be set in my browser are silently dropped.
It’s fairly common these days to build client-side rendered web applications. One thing I’ve come to notice is that almost all of them break when cookies are disabled, because one or more of the popular frameworks assume they can read window.localStorage. With cookies off, any read or write of this attribute throws an exception:
DOMException: Failed to read the 'localStorage' property from 'Window': Access is denied for this document.
This kills the entire render of the page. Usually the result is a blank white page; sometimes (e.g. https://www.vaultproject.io) it’s a generic error page.
The worst part is that sometimes the page actually loads fine and flashes up for an instant before the client-side JS fails to catch the exception and blanks it. I’m tempted to say that the W3C’s decision to specify that browsers must throw an exception when localStorage access is disabled was misguided: it breaks the fundamental web ideal of graceful degradation. Sure, it’s the framework’s fault for not catching the exception, but the net result is that thousands and thousands of webpages are now hard broken for anyone browsing with cookies (and thus localStorage) off.
Michal Zalecki has a good article on how to fix it.
If you’re not testing your website in a browser with cookies disabled, you’re not testing your website. This is one of the bare minimum of tests you should be performing.
Aside: You should also be testing your web application via Tor, to ensure that no dependent service prevents a privacy-conscious user from using it like anyone else.
These sites are completely broken in my browser with localStorage disabled.