Author's note: This essay was originally commissioned by Alex Steffen for the projected 111th issue of Whole Earth Review, which was to focus on the Singularity. Sadly, WER effectively ceased publication with issue 110, and (the shorter, WER-edited version of) this article is not among the content you can find on their web site. I'm therefore releasing this draft.
I originally wrote this in early 2002. I have not updated the content significantly -- I think it provides a useful historical context -- but have checked and, where necessary, modified the URLs. Where I have made additions to the text, they are noted.
The 18th century utilitarian philosopher Jeremy Bentham's panopticon was a prison: a circle of cells with windows facing inwards, towards a tower, from which jailers could look out and inspect the prisoners at any time, unseen by their subjects.
Though originally proposed as a humane experiment in penal reform in 1785, Bentham's idea has eerie resonances today. One of the risks of the technologies that may give rise to a singularity is that they may also permit the construction of a Panopticon society -- a police state characterised by omniscient surveillance and mechanical law enforcement.
Note that I am not using the term "panopticon singularity" in the same sense as Vinge's Singularity (which describes the emergence of strongly superhuman intelligence through either artificial intelligence breakthroughs or progress in augmenting human intelligence), but in a new sense: the emergence of a situation in which human behaviour is deterministically governed by processes outside human control. (To give an example: currently it is illegal to smoke cannabis, but many people do so. After a panopticon singularity, it will not only be illegal but impossible.) The development of a panopticon singularity does not preclude the development of a Vingean singularity; indeed, one may potentiate (or suppress) the other. I would also like to note that the idea has been discussed in fictional form by Vinge.
Moore's Law observes that the number of transistors on an integrated circuit doubles roughly every couple of years; in practice, that means the price of a given amount of computing power falls exponentially over time. The tools of surveillance today are based on integrated circuits: unlike the grim secret policemen of the 20th century's totalitarian regimes, they're getting cheaper, so that an intelligence agency with a fixed budget can hope to expand the breadth of its surveillance rapidly. In the wake of the events of September 11th, 2001, the inevitable calls for something to be done have segued into criticism of the West's intelligence apparatus: and like all bureaucratic agencies, their response to a failure is to redouble their efforts in the same direction as before. (If at first you don't succeed, try harder.)
It is worth noting that while the effectiveness of human-based surveillance organizations is dependent on the number of people involved -- and indeed may grow more slowly than the work force, due to the overheads of coordinating and administering the organization -- systems of mechanised surveillance may well increase in efficiency as a power function of the number of deployed monitoring points. (For example: if you attempt to monitor a single email server, you can only sample the traffic from those users whose correspondence flows through it, but if you can monitor the mail servers of the largest ISPs you can monitor virtually everything without needing to monitor all the email client systems. Almost all traffic flows between two mail servers, and most traffic flows through just a few major ISPs at some point.) Moreover, it may be possible to expand an automated surveillance network indefinitely by simply adding machines, whereas it is difficult to expand a human organization beyond a certain point without having knock-on effects on the macroeconomic scale (e.g. by sucking up a significant proportion of the labour force).
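As a back-of-the-envelope illustration of why tapping a handful of big mail servers beats tapping millions of desktops, here's a toy simulation in Python (the market-share figures are invented for the sake of the example; only the shape of the result matters):

    import random

    # Invented market shares: a few big ISPs carry most users, plus a long tail.
    shares = [0.30, 0.20, 0.15, 0.10, 0.05] + [0.20 / 200] * 200   # sums to 1.0
    isps = list(range(len(shares)))

    def pick_isp():
        """Pick a random user's ISP, weighted by (made-up) market share."""
        return random.choices(isps, weights=shares, k=1)[0]

    def observed_fraction(n_monitored, n_messages=100_000):
        """Fraction of messages seen if the n largest ISPs' mail servers are tapped."""
        watched = set(range(n_monitored))
        seen = sum(
            1 for _ in range(n_messages)
            if pick_isp() in watched or pick_isp() in watched   # sender or recipient
        )
        return seen / n_messages

    for n in (1, 3, 5, 10):
        print(f"monitor top {n:2d} ISPs -> see roughly {observed_fraction(n):.0%} of mail")

Even with made-up numbers the shape is clear: a handful of taps at the centre of the network sees almost everything.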
Here's a shopping-list of ten technologies for the police state of the next decade, and estimates of when they'll be available. Of necessity, the emphasis is on the UK -- but it could happen where you live, too: and the prognosis for the next twenty years is much scarier.
Availability: today.
The UK leads the world in closed circuit surveillance of public places, with over two [2004: four] million cameras watching sixty million people. Cameras are cheaper than cops, and act as a force multiplier, letting one officer watch dozens of locations. They can see in the dark, too. But today's cameras are limited. The panopticon state will want cheaper cameras: powered by solar panels and networked using high-bandwidth wireless technology so that they can be installed easily, small so that they're unobtrusive, and equipped with on-board image analysis software. A pilot study in the London borough of Lambeth is already using face recognition software running on computers monitoring the camera network to alert officers when known troublemakers appear on the streets. Tomorrow's smart cameras will ignore boring scenes and focus on locations where suspicious activities are occurring.
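To give a feel for how little code the "smart" part of a smart camera needs, here's a minimal face-detection loop using the freely available OpenCV library (detection only -- matching a detected face against a watch-list of known troublemakers is a separate, harder recognition step):

    import cv2   # pip install opencv-python

    # OpenCV ships a stock frontal-face detector (a Haar cascade) with the library.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cam = cv2.VideoCapture(0)   # 0 = the first attached camera
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            # A real system would hand this crop to a recognition stage and
            # compare it against a database; here we just draw a box.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.imshow("watched", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cam.release()
    cv2.destroyAllWindows()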
(Experience suggests that cameras don't reduce crime -- they just move it to places where there's no surveillance, or displace it into types of crime that aren't readily visible. So the logical response of the crime-fighting bureaucracy is to install more cameras ...)
Availability: 1-5 years.
Today's camera networks are hard-wired and static. But cameras and wireless technology are already converging in the shape of smartphones. Soon, surveillance cameras will take over many of the monitoring tasks that today require Police control centres: using gait analysis and face recognition to pick up suspects, handing off surveillance between cameras as suspects move around, using other cameras as wireless routers to avoid network congestion and dead zones. The ability to tap into home webcams, private security cameras, and Neighbourhood Watch schemes will extend coverage out of public spaces and into the private realm. Many British cities already require retail establishments to install CCTV: the Regulation of Investigatory Powers Act (2000) gives the Police the right to demand access to electronic data -- including camera feeds. Ultimately the panopticon society needs cameras to be as common as street lights.
(Looking on the bright side: London Transport is experimenting with smart cameras that can identify potential suicides on underground train platforms by their movement patterns, which differ from those of commuters. So p2p surveillance cameras will help the trains run on time ...)
Availability: now to 5 years.
Ever since the first slow-motion film footage, it's been clear that people and animals move their limbs in unique ways -- ways that depend on the relative dimensions of the underlying bone structure. Computer recognition of human faces has proven to be difficult and unreliable, and it's easily defeated by disguise: it's much harder to change the length of your legs or the way you walk.
Researchers at Imperial College, London, and elsewhere have been working on using gait analysis as a tool for remote biometric identification of individuals, by deriving a unique gait signature from video footage of their movement.
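One simple way of turning footage into a signature -- a sketch of the general idea, not the Imperial College group's actual method -- is to average the walker's segmented silhouette over a stride (a "gait energy image") and compare the result against stored templates:

    import numpy as np

    def gait_signature(silhouettes):
        """Average a sequence of binary silhouette frames (H x W arrays of 0/1)
        over a stride cycle into a single template, returned as a unit vector."""
        gei = np.stack(silhouettes).astype(float).mean(axis=0)
        flat = gei.ravel()
        return flat / (np.linalg.norm(flat) + 1e-9)

    def best_match(signature, database):
        """Name of the closest stored signature by cosine similarity."""
        scores = {name: float(signature @ stored) for name, stored in database.items()}
        return max(scores, key=scores.get)

    # Toy demo: random arrays stand in for silhouettes segmented out of video.
    rng = np.random.default_rng(0)
    suspect_frames = [rng.integers(0, 2, (64, 32)) for _ in range(30)]
    database = {
        "subject_A": gait_signature([rng.integers(0, 2, (64, 32)) for _ in range(30)]),
        "subject_B": gait_signature(suspect_frames),   # the 'right' answer
    }
    print("best match:", best_match(gait_signature(suspect_frames), database))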
(When gait analysis collides with ubiquitous peer-to-peer smart cameras, expect bank robbers to start wearing long skirts.)
Availability: 2-8 years.
Very short wavelength radio waves can be tuned to penetrate some solid and semi-solid surfaces (such as clothing or drywall), and return much higher resolution images than conventional radar. A lot of work is going into domesticating this frequency range, with funding by NIST focussing in particular on developing lightweight short-range radar systems. Terahertz radar can pick up concealed hard objects -- such as a gun or a knife worn under outer clothing -- at a range of several metres; when it arrives, it'll provide the panopticon society's enforcers with something close to Superman's X-ray vision.
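The attraction is straightforward physics: the sharpness of the image you can form with an antenna of a given size scales with the wavelength, as this quick calculation shows (round numbers for illustration, not any particular system's specification):

    # Diffraction-limited angular resolution is roughly wavelength / aperture diameter.
    c = 3.0e8        # speed of light, m/s
    aperture = 0.3   # a 30 cm antenna, say

    for name, freq_hz in [("conventional radar (~10 GHz)", 10e9),
                          ("terahertz imager (~1 THz)", 1e12)]:
        wavelength = c / freq_hz
        resolution_mrad = 1000 * wavelength / aperture
        print(f"{name}: wavelength {wavelength * 1000:.2f} mm, "
              f"angular resolution ~{resolution_mrad:.1f} milliradians")

A milliradian of resolution at a range of a few metres is a spot a few millimetres across -- small enough to pick out the outline of a knife under a jacket.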
(If they can see through walls, why bother with a search warrant?)
Availability: 3-10 years.
Cellphones and their base stations emit microwave radiation at wavelengths similar to those used by radar systems. Celldar is a passive radar system: rather than transmitting its own pulses, it listens for cellphone signals that have been reflected off -- or blocked by -- solid objects. When a solid object passes between a transmitter and a cellphone, it reduces the signal strength at the receiver.
Celldar was originally designed as a military system that would use reflected cellphone emissions to locate aircraft passing above the protected area. However, by correlating signal strength across a large number of cellular transceivers (both base stations and phone handsets) in real time it should be possible to build up a picture of what objects are in the vicinity. Subtract the known locations of buildings, and you've got a system that can place any inhabited area under radar surveillance -- by telephone. (We can already be tracked by cellphone -- the network always knows roughly which cell each handset is in. Now the panopticon society can place us under radar surveillance by phone. And as phones exchange data at ever higher bandwidth, the frequencies will shorten towards the terahertz range. Nude phone calling will take on an entirely different meaning ...)
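To make the idea concrete, here's a toy sketch (loosely in the spirit of what is now called radio tomographic imaging, and emphatically not Celldar's actual algorithm): watch the signal strength on each transmitter-to-receiver path, and when a path suddenly loses signal, mark the patch of the map it crosses:

    import numpy as np

    GRID = 20   # a 20 x 20 map of the monitored area

    def cells_on_path(tx, rx, steps=200):
        """Grid cells crossed by the straight line between transmitter and receiver."""
        ts = np.linspace(0.0, 1.0, steps)
        xs = (tx[0] + ts * (rx[0] - tx[0])).astype(int).clip(0, GRID - 1)
        ys = (tx[1] + ts * (rx[1] - tx[1])).astype(int).clip(0, GRID - 1)
        return set(zip(xs.tolist(), ys.tolist()))

    def shadow_map(links, baseline_dbm, current_dbm, threshold_db=3.0):
        """Score each cell by how many newly-attenuated links pass through it."""
        heat = np.zeros((GRID, GRID))
        for (tx, rx), base, now in zip(links, baseline_dbm, current_dbm):
            if base - now > threshold_db:          # this path is being shadowed
                for x, y in cells_on_path(tx, rx):
                    heat[x, y] += 1
        return heat

    # Toy example: three transmitter-to-handset paths, one of them newly obstructed.
    links = [((0, 0), (19, 19)), ((0, 19), (19, 0)), ((0, 10), (19, 10))]
    baseline = [-60.0, -62.0, -58.0]   # dBm measured when the area is empty
    current = [-60.5, -61.8, -70.0]    # the third path has just lost 12 dB
    heat = shadow_map(links, baseline, current)
    x, y = np.unravel_index(heat.argmax(), heat.shape)
    print(f"strongest shadow near grid cell ({x}, {y})")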
Availability: 1-5 years.
Radio Frequency ID chips are used for tagging commercial produce. Unlike today's simple anti-shoplifting tags in books and CDs, the next generation will be cheap (costing one or two cents each), tiny (sand-grain sized), and smart enough to uniquely identify any individual manufactured product, by serial number as well as type and vendor. They can be embedded in plastic, wood, food, or fabric, and by remotely interrogating the RFID chips in your clothing or possessions the panopticon society's agencies can tell a lot about you -- like, what you're reading, what you just ate, and maybe where you've been if they get cheap enough to scatter like dust. More insidiously, because each copy of a manufactured item will be uniquely identifiable, they'll be able to tell not only what you're reading, but where you bought it. RFID chips are injectable, too, so you won't be able to misplace your identity by accident.
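For a sense of how much a single tag gives away, here's a toy decoder for an EPC-style 96-bit identifier (the field split below roughly follows one published EPC layout -- 8-bit header, 28-bit manufacturer, 24-bit product class, 36-bit serial -- and the tag value itself is made up):

    FIELDS = [("header", 8), ("manufacturer", 28), ("product_class", 24), ("serial", 36)]

    def decode_epc(value):
        """Split a 96-bit integer into named fields, most significant field first."""
        out, remaining = {}, 96
        for name, width in FIELDS:
            remaining -= width
            out[name] = (value >> remaining) & ((1 << width) - 1)
        return out

    # A made-up tag read: the same title as a million other copies, but a unique item.
    tag = (0x35 << 88) | (12345 << 60) | (678 << 36) | 400_000_017
    print(decode_epc(tag))
    # Every copy of product_class 678 from manufacturer 12345 carries its own serial,
    # so a reader learns not just *what* you're carrying but *which* copy -- and a
    # point-of-sale record ties that serial back to where and when it was bought.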
(And if the panopticon police don't like the books you're reading or the DVDs you're watching, maybe they can use your tag fingerprint to order up a new you?)
Availability: now-5 years.
Trusted Computing doesn't mean computers you can trust: it means computers that intellectual property corporations can trust. Microsoft's Palladium software (due in a future Windows release [2004: due in Windows Longhorn, renamed to NGSCB]) and Intel's TCPA architecture are both components of a trusted computing platform. The purpose of trusted computing is to enforce Digital Rights Management -- that is, to allow information providers to control what you do with the information, not to protect your rights.
Disney will be able to sell you DVDs that will decrypt and run on a Palladium platform, but which you won't be able to copy. Microsoft will be able to lease you software that stops working if you forget to pay the rental. Want to cut and paste a paragraph from your physics textbook into that essay you're writing? DRM enforced by TCPA will prevent you (and snitch to the publisher's copyright lawyers). Essentially, TCPA will install a secret policeman into every microprocessor. PCs stop being general purpose machines and turn into Windows on the panopticon state. It's not about mere legal copyright protection; as Professor Lawrence Lessig points out, the rights that software and media companies want to reserve go far beyond their legal rights under copyright law.
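The mechanism is simple in outline, as this caricature of "sealed storage" shows (a toy sketch, not Microsoft's or Intel's actual design): content is encrypted, and the key is released only if a hash of the running software matches what the publisher approved.

    import hashlib

    # The publisher 'seals' content to an approved configuration. In real hardware a
    # TPM-style chip does the measuring and key release; here it's a string compare.
    APPROVED = hashlib.sha256(b"os=approved-build-1.0;player=approved").hexdigest()

    def measure_platform(os_build, player):
        """Stand-in for the hardware 'measurement' of the boot chain and software."""
        return hashlib.sha256(f"os={os_build};player={player}".encode()).hexdigest()

    def unseal(content_key, measurement):
        """Release the decryption key only if the platform measurement is approved."""
        if measurement != APPROVED:
            raise PermissionError("platform not trusted by the publisher -- no key for you")
        return content_key

    # On the blessed player this works ...
    print(unseal("super-secret-dvd-key", measure_platform("approved-build-1.0", "approved")))
    # ... on the same machine running software the publisher hasn't blessed, it doesn't.
    try:
        unseal("super-secret-dvd-key", measure_platform("approved-build-1.0", "home-brew"))
    except PermissionError as err:
        print("refused:", err)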
If the trusted computing folks get their way, to ensure control they'll need to pass legislation to outlaw alternative media. Jaron Lanier predicts that today's microphones, speakers and camcorders could become contraband; and in case this sounds outlandish and paranoid, the US Senate has seen more than one bill (most prominent among them the Consumer Broadband and Digital Television Promotion Act) that would require DRM interlocks in all analog-to-digital conversion electronics in order to prevent illicit copying.
(Presumably he wasn't thinking of aircraft instrumentation, cardiac monitors, or machine tools at the time, but under the proposed law they would need copy-prevention interlocks as well ... )
Availability: now-10 years.
Radio waves can travel through one another without interacting. Radio 'interference' happens when radio transceivers use dumb encoding schemes that don't let multiple channels share the same wavelength: interference is a side-effect of poor design, not a fundamental limit on wireless communications.
With fast microprocessors it's possible to decode any radio-frequency signal on the fly in software, by performing Fourier analysis on the raw signal rather than by using hard-wired circuitry. Software radios can be reconfigured on the fly to use new encoding schemes or frequencies. Some such encoding schemes work to avoid interference; so-called cognitive radio transceivers take account of other transmitters in the neighbourhood and negotiate with them to allocate each system a free frequency. (The 802.11 wireless networking protocols are one early example of this in action.) Software radio doesn't sound like a tool of the panopticon society until you put it together with celldar and TCPA. Cellphones and computers are on a collision course. If the PC becomes a phone, and every computer comes with a built-in secret policeman _and_ can be configured in software, the panopticon's power becomes enormous: remote interrogation of RFID dust in your vicinity will let the authorities know who you're associating with, reconfiguration of phones into celldar receivers will let them see what you're doing, and plain old-fashioned bugging will let them listen in. If they can be bothered.
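To make the "decode it in software" point concrete, here's a toy spectrum scan in plain numpy (nothing like a production software-radio stack): synthesise two transmissions plus noise, take a Fourier transform, and report which channels are occupied -- which is also the first thing a cognitive radio does before picking a free frequency.

    import numpy as np

    SAMPLE_RATE = 1_000_000            # 1 million (simulated) raw RF samples per second
    N = 2 ** 16

    t = np.arange(N) / SAMPLE_RATE
    rng = np.random.default_rng(1)
    # Two narrowband transmitters, at 120 kHz and 310 kHz, buried in noise.
    samples = (np.sin(2 * np.pi * 120_000 * t)
               + 0.7 * np.sin(2 * np.pi * 310_000 * t)
               + 0.3 * rng.standard_normal(N))

    # 'Decoding in software': a Fourier transform turns raw samples into a spectrum.
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(N, d=1 / SAMPLE_RATE)

    # Carve the band into 50 kHz channels and measure the energy in each.
    width = 50_000
    n_channels = int(freqs[-1] // width)
    power = [float(np.sum(spectrum[(freqs >= i * width) & (freqs < (i + 1) * width)] ** 2))
             for i in range(n_channels)]

    busy = [i for i, p in enumerate(power) if p > 10 * np.median(power)]
    free = [i for i in range(n_channels) if i not in busy]
    print("busy channels (x 50 kHz):", busy)
    print("a cognitive radio would transmit in, say, channel", free[0])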
(Invest in tinfoil hat manufacturers; it's the future of headgear!)
Availability: now-5 years.
Microtechnology, unlike nanotechnology, is here today. By building motors, gears, pumps, and instruments onto silicon wafers using the same lithographic techniques that are used for making microcircuitry, engineers are making it possible to build extremely small -- and cheap -- analytical laboratories. Devices under development include gas chromatography analysers, mass spectrometers, flow cytometers, and a portable DNA analyser small enough to fit in a briefcase. The panopticon society is lavish with its technologies: what today would occupy a Police department's forensic lab will tomorrow fit into a box the size of a palmtop computer.
(And they won't have to send that urine sample to a lab in order to work out that you were in the same room as somebody who smoked a joint two weeks ago.)
Availability: -5 years to +10 years
Total Information Awareness. Department of Homeland Security. NSA. ECHELON. This article was emailed to Whole Earth Review's staff; by including these keywords it almost certainly caught the attention of ECHELON, the data mining operation run by the NSA and its associated intelligence agencies. ECHELON has monitored all internet, telephone, fax, telex, and radio traffic for years, hoovering up the data. But analysing electronic intelligence is like trying to drink water from a firehose; the problem is identifying relevant information, because for every Al Qaida operative discussing the next bomb plot, a million internet denizens are speculating and gossiping about the same topic. And if the infoglut seems bad now, wait until your every walk down the high street generates megabytes of tracking data. The Department of Homeland Security is just one of the most obvious agencies trying to tackle the information surplus generated by the embryonic panopticon society. The techniques they propose to use entail linking up access to a variety of public and private databases, from credit rating agencies and the INS to library lending records, ISP email and web server logs, and anything else they can get their hands on. The idea is to spot terrorists and wrongdoers pre-emptively by detecting patterns of suspicious behaviour.
The trouble is, data mining by cross-linking databases can generate false inferences. Imagine your HMO with access to your web browsing records. Your sister asks you to find her some books about living with AIDS, to pass on to a friend; you go look on Amazon.com, researching the topic, and all the HMO knows is that you're looking for help on living with AIDS. And how does the Department of Homeland Security know whether I'm planning a terrorist act ... or doing my research before writing a novel about a terrorist incident? To make matters worse, many databases contain corrupt information, either by accident or malice. The more combinations of possibly corrupt data you scan, the more errors creep into your analysis. But to combat these problems, the Information Awareness Office is proposing to develop new analytical techniques that track connections between people -- where they shop, how they travel, who they know -- in the hope that if they throw enough data at the problem the errors will go away.
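The arithmetic behind that objection is worth spelling out. Suppose -- all figures invented for illustration -- a profiling system that is 99% accurate, scanning a population of sixty million that contains a hundred genuine threats:

    # Base-rate arithmetic for a hypothetical profiling system (all figures invented).
    population = 60_000_000          # roughly the UK
    real_threats = 100               # actual wrongdoers hidden in that population
    hit_rate = 0.99                  # chance the system flags a real threat
    false_positive_rate = 0.01       # chance it flags an innocent person anyway

    true_alarms = real_threats * hit_rate
    false_alarms = (population - real_threats) * false_positive_rate

    print(f"genuine suspects flagged: {true_alarms:,.0f}")
    print(f"innocent people flagged:  {false_alarms:,.0f}")
    print(f"chance that a flagged person is actually a threat: "
          f"{true_alarms / (true_alarms + false_alarms):.4%}")

Fewer than two in every ten thousand people flagged are guilty of anything, and the system still misses one threat in a hundred -- which is why the proposed cure is always more data.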
(Guess they think they need the panopticon surveillance system, then. After all, if data mining never worked in the past, obviously you can make it work by throwing more data at it ...)
The pressure to adopt these technologies springs from our existing political discourse as we struggle to confront ill-defined threats. We live in a dangerous world: widespread use of high technology means that individuals can take actions that are disruptive out of all proportion to their numbers. Human nature being what it is, we want to be safe: the promise of a high-tech surveillance "fix" that will identify terrorists or malefactors before they hurt us is a great lure.
But acts of mass terror exist at one end of a scale that begins with the parking ticket, the taping of a CD for personal use in a Walkman, a possibly-defamatory statement about a colleague sent in private email to a friend, a mistakenly omitted cash receipt when compiling the annual tax return ... the list is endless, and to a police authority with absolute knowledge and a robotic compulsion to Enforce The Law, we would all, ultimately, be found guilty of something.
This brings up a first major point: legislators do not pass laws in the expectation that everybody who violates them will automatically be caught and punished. Rather, they often pass new laws in order to send a message -- to their voters (that they're doing something about their concerns) and to the criminals (that if caught they will be dealt with harshly). There is a well-known presumption that criminals are acting rationally (in the economic sense) and their behaviour is influenced by the perceived reward for a successful crime, and both the risk and severity of punishment. This theory is implicitly taken into account by legislators when they draft legislation, because in our current state of affairs most crimes go undetected and unreported. A panopticon singularity would completely invalidate these assumptions.
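The economists' toy model of deterrence makes the point: a rational offender weighs the gain against the expected cost, which is the probability of being caught multiplied by the penalty. With some invented numbers:

    def worth_committing(gain, p_caught, penalty):
        """Toy rational-offender model: offend only if the gain beats the expected cost."""
        return gain > p_caught * penalty

    gain, penalty = 500, 5_000   # invented: a 500 gain against a 5,000-equivalent penalty

    # Under today's patchy enforcement most offences go undetected, so legislators
    # compensate by setting penalties well above the harm done ...
    print("at a 5% detection rate, crime pays:", worth_committing(gain, 0.05, penalty))
    # ... but after a panopticon singularity detection is near-certain, and the same
    # penalty scale -- applied to everyone, for everything -- becomes wildly harsher.
    print("at a 99% detection rate, crime pays:", worth_committing(gain, 0.99, penalty))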
Furthermore: many old laws are retained despite widespread unpopularity, because a vocal minority support them. An estimated 30 percent of the British population have smoked cannabis, currently an offense carrying a maximum penalty of 6 months' imprisonment (despite rumours of its decriminalization), and an absolute majority of the under-50s support decriminalization. Yet advocating a "soft on drugs" line was perceived as political suicide until very recently, because roughly 25% of the population were strongly opposed.
Some old laws, which may not match current social norms, are retained because it is easier to ignore them than to repeal them. In Massachusetts, the crime of fornication -- any sex act with someone you're not married to -- carries a 3 month prison sentence. Many towns, states, and countries have archaic laws still on the books that dictate what people must wear, how they must behave, and things they must do -- laws which have fallen into disuse, and which are inappropriate to enforce. (There's one town in Texas where since the 19th century it has been illegal for women to wear patent leather shoes, lest a male see something unmentionable reflected in them; and in London, until 1998 all taxis were required to carry a bale of hay in case their horse needed a quick bite to eat. Diesel and petrol powered cabs included.)
These laws, and others like them, highlight the fact that with a few exceptions (mostly major felonies) our legal systems were not designed with universal enforcement in mind. But universal enforcement is exactly what we'll get if these surveillance technologies come together to produce a panopticon singularity.
A second important side-effect of panopticon surveillance is the chilling effect it exerts on otherwise lawful activities. If you believe your activities on the net are being monitored for signs of terrorist intent, would you dare do the research to write that thriller? Nobody (with any common sense) cracks a joke in the waiting line at airport security -- we're all afraid of attracting the unwelcome attention of people in uniform with no sense of humour whatsoever. Now imagine the straitjacket policing of aviation security extended into every aspect of daily life, with unblinking and remorseless surveillance of everything you do and say. Worse: imagine that the enforcers are machines, tireless and efficient and incapable of turning a blind eye.
Surveillance need not even stop at our skin; the ability to monitor our speech and track our biological signs (for example: pulse, pupillary dilation, or possibly hormone and neurotransmitter levels) may lead to attempts to monitor thoughts as well as deeds. What starts with attempts to identify paedophile predators before they strike may end with discrimination against people believed to be at risk of "addictive behaviour" -- howsoever that might be defined -- or of harbouring anti-social attitudes.
We are all criminals, if you dig far enough: we've broken the speed limit, forgotten to file official papers in time, made false statements (often because we misremembered some fact), failed to pay for services, and so on. These are minor offenses -- relatively few of us are deliberate criminals. But even if we aren't active felons we are all potential criminals, and a case can be -- and is being -- made for keeping us all under surveillance, all the time.
A Panopticon Singularity is the logical outcome if the burgeoning technologies of the singularity are funneled into automating law enforcement. Previous police states were limited by manpower, but the panopticon singularity substitutes technology, and ultimately replaces human conscience with a brilliant but merciless prosthesis.
If a panopticon singularity emerges, you'd be well advised to stay away from Massachusetts if you and your partner aren't married. Don't think about smoking a joint unless you want to see the inside of one of the labour camps where over 50% of the population sooner or later go. Don't jaywalk, chew gum in public, smoke, exceed the speed limit, stand in front of fire exit routes, or wear clothing that violates the city dress code (passed on the nod in 1892, and never repealed because everybody knew nobody would enforce it and it would take up valuable legislative time). You won't be able to watch those old DVDs of 'Friends' you copied during the naughty oughties because if you stick them in your player it'll call the copyright police on you. You'd better not spend too much time at the bar, or your insurance premiums will rocket and your boss might ask you to undergo therapy. You might be able to read a library book or play a round of a computer game, but your computer will be counting the words you read and monitoring your pulse so that it can bill you for the excitement it has delivered.
And don't think you can escape by going and living in a log cabin in the middle of nowhere. It is in the nature of every police state that the most heinous offense of all is attempting to escape from it. And after all, if you're innocent, why are you trying to hide?