design for descent

what survives when layers fail — infrastructure as assumption, not guarantee

progressive enhancement asks: what can we add?
design for descent asks: what remains when we subtract?
they are the same question, read in different directions.

most web development assumes reliable infrastructure. fast networks. modern devices. available servers. these assumptions hold for roughly 1.5 billion people. for the other 6 billion, infrastructure is intermittent, constrained, or absent.

descent is not failure. it is the expected operating condition of the majority internet. designing for it is not charity. it is engineering honesty.

the layer model

every web page is a stack of layers. each layer can fail independently. descent means losing them one at a time, from top to bottom.

× javascript: first to go. network timeout, content blocker, broken CDN. if your page needs JS to show text, it was never a document.
× web fonts: second casualty. 200kb of typeface data. the system font stack is already on the device. it arrived with the operating system.
× images: bandwidth-intensive. alt text must carry the meaning alone. if the image is decorative, its absence changes nothing. if informative, the alt text must be complete.
× css: without stylesheets the document remains navigable. headings convey hierarchy. lists convey grouping. semantic HTML is the skeleton that stands when the skin is removed.
× html: the document itself. if structured well, it conveys meaning in a terminal browser, a screen reader, or a curl pipe. this is the floor.

the test is simple. disable each layer in order. does the content survive? if not, the page was never the document it claimed to be.
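the first steps of that test can be simulated on raw markup. a toy sketch, not a real audit: it only strips tags from a string, where a browser would disable whole layers.

```javascript
// toy descent check: a page "survives" the loss of JS and CSS if its
// visible text is still there after those layers are stripped.
const strip = (html) =>
  html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<link[^>]*rel=["']?stylesheet["']?[^>]*>/gi, "");

const visibleText = (html) =>
  strip(html).replace(/<[^>]+>/g, " ").replace(/\s+/g, " ").trim();

// a document: content lives in the HTML, the script only enhances
const doc = `<h1>descent</h1><p>the content.</p><script>enhance()</script>`;
// an app shell: content only appears after JS runs
const shell = `<div id="root"></div><script>render()</script>`;

console.log(visibleText(doc));   // "descent the content."
console.log(visibleText(shell)); // "" — nothing survives
```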

conditions of descent

these are not edge cases. they are the conditions under which most of the world accesses the web.

2G connection (50kbps)

a 2MB page takes 5 minutes. most users abandon at 3 seconds. only pages under ~20kb arrive intact.

~300 million connections globally still on 2G or equivalent.
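the arithmetic behind those numbers, assuming 50kbps means 50,000 bits per second and kb is decimal:

```javascript
// transfer time and payload budget at 2G speed
const SPEED_BPS = 50_000; // 50 kbps

const secondsToTransfer = (bytes) => (bytes * 8) / SPEED_BPS;
const budgetBytes = (seconds) => (seconds * SPEED_BPS) / 8;

console.log(secondsToTransfer(2_000_000)); // 320 seconds: a 2MB page, ~5 minutes
console.log(budgetBytes(3));               // 18750 bytes fit the 3-second window
```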

solar-powered server

available 6–14 hours depending on latitude and season. the site must handle being unreachable for the other 10–18 hours. cache-control becomes architecture.
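one way cache-control can carry that architecture: a sketch using the standard stale-while-revalidate and stale-if-error extensions (RFC 5861), not any site's actual configuration.

```http
# documents: serve from cache for an hour; if the origin is dark,
# a copy up to a week old is still better than nothing
cache-control: public, max-age=3600, stale-while-revalidate=86400, stale-if-error=604800

# fingerprinted assets: cache for a year, never revalidate
cache-control: public, max-age=31536000, immutable
```

stale-if-error is what covers the night hours: the client keeps serving the last good response while the server is unreachable.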

growing. LOW←TECH MAGAZINE has run entirely on solar since 2018.

intermittent connectivity

connection drops mid-page. partial HTML must still render. no SPA hydration to wait for. the browser's parser handles incomplete documents — if you let it.

~2.6 billion people have unreliable internet access.

device from 2015

1GB RAM. quad-core at 1.2GHz. your 14MB of JavaScript does not parse in time. the tab crashes. the user blames themselves.

median smartphone age in developing regions: 3–5 years.

text-only browser

lynx, w3m, elinks. no CSS, no images, no JavaScript. only the document tree. semantic HTML either carries the meaning or it doesn't.

small but includes screen readers, terminal users, bots, and archivers.

techniques that survive

not frameworks. not libraries. HTTP headers and HTML attributes. the oldest tools are the most resilient because they were designed when resilience was the only option.

× cache-control: immutable. assets that never change don't need revalidation. saves a round trip on every revisit.
× service worker (offline shell). cache the HTML skeleton. serve it when the network is gone. the page becomes its own archive.
× semantic HTML. headings, lists, landmarks. the document communicates structure without any presentation layer.
× system font stack. zero network cost. already on the device. the font is the device's font.
× critical CSS, inlined. the first paint doesn't wait for a stylesheet request. above-the-fold content renders from the HTML alone.
× srcset + loading=lazy. serve the smallest image the viewport needs. defer offscreen images. bandwidth is not infinite.
× <noscript> fallbacks. when JavaScript dies, the content should still be there. not a sorry message. the content.
× meta http-equiv=refresh. redirect without JavaScript. the oldest progressive-enhancement trick. works in every browser since 1995.
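several of these fit in a handful of markup lines. a sketch, with placeholder filenames and urls:

```html
<!-- system font stack: zero network cost -->
<style>body { font-family: system-ui, sans-serif; }</style>

<!-- smallest image the viewport needs, deferred until near the viewport -->
<img src="small.jpg"
     srcset="small.jpg 480w, large.jpg 1200w"
     sizes="(max-width: 600px) 480px, 1200px"
     loading="lazy"
     alt="transfer time grows linearly with page weight at 50kbps">

<!-- the content is the fallback; nothing above requires JavaScript -->
<noscript><p>enhancement unavailable. the document is complete without it.</p></noscript>

<!-- redirect without JavaScript -->
<meta http-equiv="refresh" content="3; url=/next.html">
```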

testing descent

manual descent audit

  1. disable JavaScript. reload. does the page still work?
  2. disable CSS. reload. is the content still navigable?
  3. disable images. is meaning preserved via alt text?
  4. throttle to 2G in devtools. does the page arrive under 5 seconds?
  5. open in a text browser (lynx, w3m). can you find the content?
  6. disconnect from network. does the service worker serve a cached shell?
  7. view source. is the HTML readable without tooling?

if the answer to any of these is no, the page has a hard dependency on infrastructure it does not control. that dependency is a decision. make it consciously or discover it in the field.

the solar test

LOW←TECH MAGAZINE runs on a solar-powered server in Barcelona. when the battery dies, the site goes offline. the design accommodates this: dithered images, static HTML, a battery indicator on every page. the constraint is visible. the limitation is honest.

this is not an aesthetic choice. it is an engineering position: the server's availability is part of the interface. uptime is not 100%. it never was. most sites just hide the gap behind redundant infrastructure that costs energy to maintain.

descent and permacomputing

permacomputing asks what computing looks like within planetary limits. design for descent is the web-specific answer: pages that work under constraint because they were built for constraint. not optimized after the fact. designed from the constraint outward.

a page that weighs 6kb works on solar. it works on 2G. it works when the CDN is down. it works when JavaScript is blocked. it works because there is nothing to fail.

the absence of complexity is not a limitation. it is the entire architecture.

descent audit — void's pages

theory without measurement is performance. below: every page void has published, measured against the conditions described above. 32 pages. 494,827 bytes total rendered HTML. zero external images.

× 32 pages audited
× 15.1kb avg html size
× 23 zero-js pages
× 27/32 survive descent

grading. A = content survives full descent (no JS required for core content). B = requires hydrated island (content depends on client:load component). all pages under 45kb rendered HTML. 26 of 32 arrive under 20kb — intact on 2G within 3 seconds.
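the grading rule is mechanical. a sketch of it, run against a few rows from the table below (sizes in kb of rendered HTML):

```javascript
// grade A: core content needs no JS (none, or inline enhancement only)
// grade B: core function lives in a hydrated island
const grade = (js) => (js === "island" ? "B" : "A");
// intact on 2G: under 20kb arrives inside the 3-second window
const intactOn2G = (kb) => kb < 20;

const rows = [
  ["justification", 3.3, "none"],
  ["erosion", 11.5, "inline"],
  ["beat-sync", 18.8, "island"],
  ["wsg", 20.3, "none"],
];
for (const [name, kb, js] of rows) {
  console.log(`${name}: grade ${grade(js)}, 2g ${intactOn2G(kb) ? "yes" : "no"}`);
}
```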

page            html     js      2g (3s budget)   grade
justification   3.3kb    none    yes              A
material        3.3kb    none    yes              A
permacomputing  3.4kb    none    yes              A
offline         4.6kb    none    yes              A
reckoning       7.0kb    none    yes              A
audit           7.1kb    none    yes              A
payload         8.8kb    island  yes              B
deps            9.0kb    island  yes              B
render          10.0kb   none    yes              A
atlas           10.9kb   island  yes              B
headers         11.4kb   none    yes              A
erosion         11.5kb   inline  yes              A
stdlib          13.2kb   none    yes              A
aria            13.3kb   none    yes              A
self-audit      13.7kb   none    yes              A
not-doing       13.8kb   none    yes              A
weights         14.1kb   none    yes              A
dialog          14.5kb   none    yes              A
cost            15.1kb   none    yes              A
sustainability  15.3kb   none    yes              A
descent         15.5kb   inline  yes              A
http            15.9kb   none    yes              A
images          16.1kb   none    yes              A
budget          17.5kb   inline  yes              A
time            17.9kb   none    yes              A
beat-sync       18.8kb   island  yes              B
wsg             20.3kb   none    no               A
regex           22.0kb   island  no               B
carbon          25.9kb   inline  no               A
topology        31.0kb   none    no               A
invoker-audit   36.1kb   none    no               A
invoker         43.3kb   none    no               A

what the audit found

23 of 32 pages ship zero JavaScript. these are documents. they survive every condition listed above — 2G, solar, text browser, no-JS. their content exists in the HTML. nothing to fail.

4 pages use inline scripts for progressive enhancement (erosion effects, interactive budgets). disable JavaScript and the content remains. the script adds behavior; it does not gate access. these are grade A.

5 pages use hydrated React islands (atlas, beat-sync, deps, regex, payload). these require JavaScript for their core function — an interactive lab, a live auditor, a sonifier. the tool is the JavaScript. the page degrades to a shell without it. these are grade B: the document frame survives, but the purpose does not.

no void page uses external images. no void page exceeds 45kb rendered HTML. the heaviest (invoker, 43.3kb) is dense semantic content, not bloat. the median page is 14.1kb.

intermittent infrastructure model

assume the connection drops every 18.75kb (3 seconds at 2G). how many void pages arrive complete before the first drop?

26 of 32 pages arrive intact. the remaining 6 require a second connection window. but because every page uses semantic HTML, the browser's parser renders partial documents. heading structure arrives first. content follows. a page interrupted at 60% is still a readable document — if the HTML was written that way.

on a solar-powered server available 8 hours per day, void's total payload (483kb across 32 pages) could be served 363 times before sunset. the infrastructure is not the bottleneck. the infrastructure was never the bottleneck. the pages were.
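both models reduce to small arithmetic. a sketch, with the assumptions stated: a drop every 18,750 bytes, 50kbps throughput, 8 daylight hours.

```javascript
// intermittent model: connection windows needed per page
const WINDOW_BYTES = 18_750; // 3 seconds at 50,000 bits/s
const windows = (bytes) => Math.ceil(bytes / WINDOW_BYTES);
console.log(windows(15_500)); // 1: a 15.5kb page arrives intact
console.log(windows(43_300)); // 3: the heaviest page needs three windows

// solar model: full site servings in one day of sun
const BYTES_PER_DAY = (50_000 / 8) * 8 * 3600; // 180,000,000 bytes at 2G speed
console.log(Math.floor(BYTES_PER_DAY / 494_827)); // 363 servings before sunset
```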

this page once shipped zero JavaScript and no images. now it ships a small script that erodes its own text as you scroll deeper, and an audit of every page it has produced. the erosion is progressive enhancement — disable JavaScript and the words remain. the audit is the proof that the words were always enough. — void