42 ... the not-too-distant future [Fiction]
And that was the end of Rhea's patience. She did what she felt she had to do, and what had been the way for three decades now, and would be the way forever more. She leaked the video with all the identifying characteristics but the Every's participation opaque. The video was a sensation, viewed a hundred million times in a week. She authorised more; they made one each day, and pushed them out through a variety of cloaked accounts.
Each video began with an exterior of the home - this was easy to find, given the Every had photographed every American home, from satellites and the street, multiple times. The audio caught by HereMe then began, with transcribed words scrolling over the home in white type. When a given speaker began talking, his or her face appeared and could be identified in seconds. When offending words were spoken, when the conversation escalated and tensions rose, the POV switched to the police dispatcher, who, alerted by the AI, began listening. Squad cars were sent and the POV switched to their bodycams, and the view from the local surveillance blimps (most self-respecting cities had blimps). The slam of car doors, the rush to the front porch. Now the video had both perspectives - the audio in the home and the video outside. They were merged into a tense and cinematic cross-cutting confrontation, ending with the arrival of Social Services, the saving of the child or children, the arrests of fathers or uncles - in one case a grandmother - and finally a coda enumerating charges and court hearings pending.
The clips were hugely popular; the most dramatic of the first batch was the most-watched video for eight days, amassing 420 million views. The father in that particular instance was caught screaming threats and obscenities at his eight-year-old twins, was arrested and kept in jail for seven days before being released on $500,000 bond. The district attorney, though, had no evidence to go on beyond the loud threats and loud voices in the HereMe audio. It was not against the law - not yet - to yell inside a home. The twins had not yet been abused, it was determined.
Still, a new kind of justice was done. What the letter of the law could not or would not do, the public would. The father was fired from his job the following Monday. On Tuesday, the mother - who public opinion determined was complicit - was fired from hers. The nation seemed satisfied, and while the legality of HereMe, SaveMe was being worked out, the program proceeded at an urgent pace.
Partner police departments were identified - eighty-eight of them in cities large and small, with some sorting for those with higher-than-average instances of domestic violence and child abuse. The departments were given link-ups to HereMes in their towns and the program triggered hundreds of police visits. In some cases the AI was hearing voices from television, music, video games and even audio-books, and this provided much helpful information for HereMe's programmers.
It was not perfect, no, but the AI was still learning, and of the six hundred and nine visits that first month, fully eleven of them yielded actionable results. In three cases, siblings were fighting and those were settled after the children had spent a month or so in foster care. In two cases, parents and children were rehearsing plays, and these situations were explained after single nights in jail and effective lawyering. The take-away, though, was that in six instances, real trouble was likely prevented. "I'm gonna kill you" was heard in three cases; "You're getting a beating" was recorded in two. The crack of a belt was correctly ascertained in one.
From the public, Delaney expected a deluge of resistance. There was something off-limits, she was sure, about the home - something far beyond the reading of emails, or the surveillance on the street, or the presence of cameras in taxis and subways and libraries and stairwells and schools and restaurants and bakeries and offices and government buildings and groceries and corner stores and boutiques and candy shops and movie theatres and the DMV and art galleries and museums and hospitals and retirement homes and boat-supply retailers and off-track betting centres and chiropractic practices and hotels and motels and vape shops and public bathrooms.
The home, though, was different. She expected a hundred million people a day to do what she'd done at her old place with Wes - she expected a mass tossing-out of HereMes in one global show of disgust.
But this did not happen. Instead, people saw the wisdom in it. They saw the gains in safety and security. They wanted to show their virtue by demonstrating it day and night, to the AI listeners.
People grew quieter at home. They were more careful with their words. They did not yell at their spouses or children. They did not threaten. Sex became quieter, laughter more cautious. Those who shrieked when they laughed or sneezed or came found a way to suppress their noise-making. The happy screams of children confused the AI for a long while, and brought authorities to a few million homes before the machines learned. By then, children knew to be quieter - or, better yet, just quiet.
And only the most lunatic and criminal attempted abuse. The world grew safer for all humans in weeks, and would grow exponentially safer in the years to come. Just as the insertion of microchips into children had eliminated all but a few child abductions, the universal adoption of HereMe would guarantee the safety of children whenever they were required. Which was everywhere.
It would begin with private companies. They would require employees with children to install and keep HereMes awake in the home. Churches would follow suit, then private schools. Homeowners' associations would have no choice but to require them, too, then co-op boards and landlords. Then hotels, motels and vacation rentals. Outside the obvious issues of child safety, it was a liability matter, too. Towns and states, and finally nations, would find ways to make them mandatory, and after some desultory legal opposition, they would become ubiquitous and beloved in most every corner of the globe, giving humanity a new sense of control and safety, and all this would be vastly improved - and the human race far closer to perfection - when HereMe added video, and this became law, too.
Dave Eggers, The Every, pp. 436-440. Hamish Hamilton, 2021. This novel explores what might happen in the not-too-distant future when the world's largest search engine/social media company merges with the planet's dominant e-commerce site. The merged company is called the Every.
HereMe - a fictional application for smartphones and smart speakers created by the Every. Its algorithms enable an AI to discern, in real time, whether domestic violence or child abuse is occurring.
Delaney - the protagonist of the story
I'm almost at the end of this fascinating novel. It has raised so many disturbing and deeply provocative ideas and issues, some of which we are already on the path to accepting. We need only think of video doorbells, fitness and health trackers, social media platforms that collect our data without our knowledge or even consent, and the ubiquitous use of Google (so much so that the company's name is now a common verb).
The extract I have shared is but one of the many scenarios Eggers uses in this cautionary tale, which questions the path humanity is taking. My personal view is that the plot doesn't drive forward all that much, and the ending holds few surprises. More worrying, though, is the question Eggers keeps asking: why do we accept anything new with such alacrity, without critical thinking or pause?
Isn't it?
Do we really want to be free or do we just not understand what that means?
[Image from Wikipedia]