After years of high-profile failures, breathless presentations from futurologist grifters at junket conferences, and the odd success story from an early adopter, ‘smart cities’ are ready for prime time. Recent advances in AI, alongside continually falling prices for computing power, are making a case for remote monitoring and automation too compelling for city managers to ignore. The coming years will see the number of monitoring devices grow a thousandfold or more while our capacity to process this data to generate meaningful information grows apace.
Cameras and other recording devices will appear throughout public spaces. Their data streams will be analysed to monitor traffic, detect crimes, track endangered birds, and almost anything else councils might be interested in. These data will be used to inform strategy, schedule maintenance, dynamically control traffic signals to manage flow, and ultimately be interwoven into every aspect of public sector delivery.
These changes will create new capabilities for the public service, potentially enabling enormous growth in public and private sector value. As these systems continue to be designed and deployed, we need to start thinking about how we can deliver better services, protect privacy, and avoid a collapse in human-centric design. Finally, the power of the state will require new constraints and counterbalancing to avoid creating the conditions for a surveillance state beyond what even Orwell could dream of.
Proprietary Platforms
Determining how city data is collected and stored (and who collects and owns it) is the first obstacle to navigate. City managers, unfamiliar with the details of data engineering, naturally lean towards ‘off the shelf’ solutions that don’t require complex integration by an overstretched in-house IT team. Companies are more than happy to oblige, providing a solution hosted on their (laboriously accessed) proprietary platform and thereby creating a monopoly for themselves over any future extensions of the network. Some contracts even leave the data in the ownership of the contractor (with licensed access), essentially subsidising the collection and sale of public data.
The situation here echoes that of the enterprise software trade. Given the extremely high costs of developing a new package, suppliers are incentivised to overpromise in order to secure the initial contract and then underdeliver and/or charge exorbitant rates for additional features. You’ve very little leverage to demand value for money once you’re locked into using a particular platform.
Ironically, it’s competition from Microsoft and Amazon, the kings of enterprise software and platforms, that is likely to break this dynamic. Both companies are aggressively moving into the space, seeking to be the cloud provider of choice for the public sector. While security companies looking to get into the game want to lock you into having to buy their expensive hardware, cloud providers are seeking to commoditise their complements by providing platforms that make hardware interchangeable and therefore comparatively cheap.
Using a cloud platform provides considerable advantages. Multiple data sets can be generated from a single device (e.g. image recognition software running in the cloud can detect an arbitrary number of objects from a single video stream). Having these datasets on a single platform also allows them to be combined in unique ways to generate novel insights (e.g. combining weather, economic, and pedestrian data to learn about spending patterns in a commercial district).
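The value of co-locating datasets on one platform can be made concrete with a toy example. The sketch below joins two invented feeds (hourly pedestrian counts and rainfall for the same district) on a shared key; every dataset name, column, and number is hypothetical, and the "insight" is deliberately naive:

```python
# Hypothetical sketch: combining two public datasets hosted on the same
# platform. All names, columns, and values are invented for illustration.
import pandas as pd

# Hourly pedestrian counts from an imagined street sensor.
pedestrians = pd.DataFrame({
    "hour": [9, 10, 11, 12],
    "count": [120, 340, 410, 580],
})

# Hourly weather observations for the same district.
weather = pd.DataFrame({
    "hour": [9, 10, 11, 12],
    "rain_mm": [0.0, 0.0, 2.5, 4.0],
})

# The join is only this trivial because both feeds share a platform
# and a common key (here, the hour).
combined = pedestrians.merge(weather, on="hour")

# A naive "insight": average foot traffic in wet vs dry hours.
wet = combined[combined["rain_mm"] > 0]["count"].mean()
dry = combined[combined["rain_mm"] == 0]["count"].mean()
print(f"dry-hour average: {dry:.0f}, wet-hour average: {wet:.0f}")
```

The point is less the arithmetic than the join itself: when datasets live on incompatible proprietary platforms, even this one-line merge becomes an integration project.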
Data stored on the cloud can also be made openly accessible (within privacy constraints) to allow public access. One can imagine individuals using these public data streams to inform their daily lives, dodging traffic on their commute or picking a quiet playground to take their kids to. New businesses will fill the economic niches these data sets reveal, finding new ways to add value to people’s lives. A new wave of scientific research will be enabled, yielding discoveries across and between fields like public health, sociology, economics, and ecology.
Jurisprudence and the Surveillance State
Any sound that Winston made, above the level of a very low whisper, would be picked up… so long as he remained within the field of vision… he could be seen as well as heard. There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork.
George Orwell, ‘1984’.
Like this, but with AI. You will be watched at every given moment.
Since the 2013 Snowden revelations, we’ve known that the ‘Five Eyes’ state intelligence network is monitoring the vast majority of online communication. Phones and other personal computing devices are similarly compromised. Top government officials are known to institute practices like placing their phones inside a soundproof box before entering important meeting rooms.
The status quo sits in an uncomfortable equilibrium. The state has granted itself enormous power, but revelations of these powers have spurred public reactions that forced limitations through the likes of FISA, the Privacy Act, etc. Despite these limitations, illegal spying on domestic citizens is not uncommon. However, the inability to use this information in court has thus far limited the impact of these practices on individual liberty.
As I talked about last week, states are also constrained by the legibility of the information they collect. Mountains of data may be collected, but there is only so much that keyword and other relatively simple algorithms can do to parse and sort it. A human is still needed to analyse the results, propose action, etc. Thus, surveillance remains focused on key targets due to human resource limitations and apparently still struggles to take advantage of opportunities in a timely fashion even there.
Western States have a vast automated system [with] very powerful capabilities. [However] they are very badly aimed. You could improve the value of them by 10x or 100x in various ways. The prioritisation process is appalling. Where these capabilities are aimed is incredibly bureaucratic [and] risk averse. Amazingly valuable material in all sorts of ways and very powerful capabilities which you could aim far better than they are aimed, but they don’t connect to a wider system for interpreting the world… A huge amount of the information that comes out of these things is printed out, old school, then incinerated. It’s not really used, it’s not really learned from.
Dominic Cummings, “Vote Leave, Brexit, Covid, and No. 10 with Boris”. Speaking with Steve Hsu on the Manifold Podcast.
This uncomfortable equilibrium is soon to become highly unstable. Technological improvements like Large Language Models (e.g. ChatGPT) will increasingly provide the automatic interpretation required to make the data legible. Meanwhile, the collection of public datasets, increasingly available and indexed on the cloud, will make existing protections around the collection and use of private data increasingly obsolete.
The Watergate surveillance scandal represented such a departure from accepted norms that a president was forced to resign, its name became synonymous with corruption, and sweeping new legal restrictions on the state were enacted.
Unfortunately, when it comes to smart cities, I can see no clear line between what we might consider acceptable and unacceptable use of this technology. After all, we already accept the automated issuance of speeding tickets and use CCTV footage as evidence in court. Why should other public datasets be any different? If a crime committed on private property can be seen and heard by a sensor in a public space, is it not still a crime?
Sensors will soon become so pervasive that virtually all activity will leave a digital fingerprint on public servers. These data sets will be analysed by AI in real time because it will become too cheap not to. It will become virtually impossible for a crime to escape detection entirely. What then? Do we dispatch automated enforcement drones to rigidly enforce the letter of the law? Do we actually want every minor infringement, act of teenage rebellion, and political protest to be punished? How do we manage the subjective? Perhaps discretion will be advised instead, with a dossier of compromising material built on each citizen and then selectively applied. Not exactly a formula for a free and democratic society.
The problem is that the law itself sits in an uneasy equilibrium, where illegibility and law enforcement’s limited resources and discretion compensate for the rigidity of the text itself. Older laws often fall out of sync with the wishes of the collective, but decades can pass before they are eventually repealed. Many laws are expected to be variably enforced depending on who you ask and the contextual circumstances of the crime. These might be designed to discourage certain behaviours or provide recourse for the state if the individual is unwilling to keep their head down and play by certain unspoken rules. Cannabis possession laws, anti-sodomy laws, immigration offences, and even vehicle parking norms fall into this category and will need reform.
For better or worse, we are building a panopticon of total surveillance. Unfortunately, we cannot expect some flashpoint political scandal to provide the impetus for reform; each step towards tyranny (either of Big Brother, or the law itself) will be perfectly legal. It’s only in appreciating the whole development that we might grapple with this fact and reform our system to cope by limiting the application of this power and reinforcing the human and contextual elements of justice.
I fear for people unlucky enough to live in a country without a strong rule of law, where political dissidents will find it increasingly difficult to evade the eyes of the secret police. I worry too for people living in illiberal societies, where the coming empowerment of our puritanical impulses will leave increasingly little room for freedom of expression.
Celebrating the Human Elements of Design
As public institutions improve their digital map of their respective territories, the use of personal knowledge in decision-making will diminish in importance. The parks officer might today choose to install a particular set of playground equipment based on their subjective perceptions of that model’s value to local children. In future, we might expect such decisions to be based on each choice’s ‘joy index’ (e.g. some composite of audio volume and pedestrian count data). While we can hope this will lead to more objective decision-making in general, we cannot discount the subjective so easily. Reality contains too many critical details that are impossible to fully capture in a hard-data model. And what of our other values? We probably want our playgrounds to help build strength, provide challenge, and perhaps even incorporate quieter contemplative spaces.
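The ‘joy index’ above is invented, but a composite metric of this kind might amount to little more than a weighted sum of normalised sensor readings. The sketch below makes that explicit; every reading and weight is hypothetical:

```python
# Hypothetical "joy index": a weighted combination of normalised sensor
# readings. All data and weights here are invented for illustration.

def normalise(values):
    """Scale a list of readings onto the 0-1 range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Imagined daily averages for three candidate playground designs.
audio_volume = [55.0, 62.0, 48.0]   # dB near the equipment
visitor_count = [180, 260, 140]     # visitors per day

audio_n = normalise(audio_volume)
visits_n = normalise(visitor_count)

# Arbitrary choice: noise and visits each contribute half the score.
joy_index = [0.5 * a + 0.5 * v for a, v in zip(audio_n, visits_n)]

best = max(range(len(joy_index)), key=joy_index.__getitem__)
print(f"design {best} scores highest: {joy_index[best]:.2f}")
```

Note what the metric does: the loudest, busiest design wins by construction, while a quieter contemplative space scores zero. The values a composite like this optimises for are baked in by whoever picks the inputs and weights.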
It’s the perennial problem of ‘seeing like a state’. All else remaining the same, we should expect greater optimisation of that which can be measured and less optimisation of that which cannot. Compensating for this tendency will remain one of the primary tasks of policymakers.
Thankfully, these new tools and capabilities also contain the seeds of liberation. Open public data sets will allow a more well-informed critique of public services. Imagine co-designing a new facility using generative AI, where members of the public can see their ideas rendered live. Predictive models can give people unprecedentedly accurate information about future local weather and traffic conditions. LLMs can be trained on public documents to let members of the public query and parse public policy as never before. Permit applications can be made predominantly self-service, shortening timeframes and reducing risk. Gone will be the days of critical details being effectively buried on page 458 of the proposal. Governance can transition from something done behind great stone walls to something done once again in the open agora, for all interested citizens to engage in.
Another great thought-piece. What concerns me most about this is that the average person will have little if any input into determining what is acceptable or not. This will instead be led by corporations and championed by technovangelists in public sector roles who lack knowledge about impact outside their interests. Basically, geeks will set the standards for everyone else, without thought for how it may impact society beyond economic benefit and convenience.

You can see why the nutters concerned about 15-minute cities are getting as paranoid as they are. They can see the panopticon being constructed around them and are starting to draw conclusions that are in reality part-truths, but nonetheless concerning enough that they start to argue against outcomes which are actually beneficial on a range of levels (lower emissions, walkability, better mental health outcomes, more affordable etc). The futurologist grifters have a lot of explaining to do. Public policy people need to grow their understanding and help bring the public along for the ride, and ideally help them set some of the boundaries before 1984 beds itself in.

Even in a rule-of-law society like Aotearoa, it could be very easy to slip into a state where surveillance is accepted as a necessary evil to protect us. E.g. if the perpetrator and victim communities of the Christchurch attack had been flipped, I could imagine a lot more kiwis being accepting of increased surveillance. I hope something like this doesn't become the flashpoint you're alluding to. Kiwis were so willing to chastise returnees home during Covid-19 by labelling quarantined people as "other". This mass acceptance of othering came about very quickly.
Thanks for this article!
It's well-balanced and fact-based. There's plenty of information in here to get me thinking and talking with others.
A super-important topic!
"...when it comes to smart cities, I can see no clear line between what we might consider acceptable and unacceptable use of this technology..."
DEFINITELY, society must continue the discussion. And your thoughts on it make so much sense.