Paying to surveil: the new neighbourhood watch

Ruth Coustick-Deal
Dec 14, 2019
Photo by Jeff Fielitz on Unsplash

When tools like Alexa, Google Home and Siri, the helpful listening presences in our homes, were introduced, one of my biggest concerns was that records of our personal conversations would be passed along to the police and government.

After all, if the database exists, authorities will ask for access to it. This has been true of everything from your browsing history to your DNA. Amazon’s claim that “we only listen when we hear you tell us to” is not only illogical, it has also already been proven false.

However, the rapid growth of interpersonal surveillance devices has thrown a new twist into these power dynamics.

People are willingly spending money on devices to placate their feelings of fear or mistrust of their community, their family, and so on. But these devices only further damage our trust in each other. The tech market feeds on that mistrust, and encourages it, to sell us more of the same.

Middle-class people are becoming the “managers of everyday surveillance”, as Luke Stark and Karen Levy put it in their essay The Surveillant Consumer; we are all nodes in this network of surveillance.

Introducing Ring
Amazon’s Ring is a “digital neighbourhood watch”: a doorbell camera installed on your doorstep, pointing out at your community, with optional additional cameras to go alongside it. The companion app, ‘Neighbors’, allows you to share the footage you collect with other people in your neighbourhood, and, of course, with the police. Professor Chris Gilliard is doing amazing work collating and documenting examples of how, each day, the line between cop and company fades away.

Both institutions benefit from working together to create communities of mistrust. Amazon sells more products, and the police gain more power. Here’s how they do it:

  • In the US, Ring has formed partnerships with more than 225 law enforcement agencies, according to Vice reporting.
  • Police have offered people free or discounted Ring doorbells.
  • Police can ask for access to the footage without a warrant. Of course you can say no, but they’ll know you chose not to share. And Ring coaches police on how to convince people to hand over their camera footage.
  • Ring sent the police pre-written, approved social media messages, and guidance on what they should say to encourage people to use its products. The Guardian reported a spokesperson saying,
    “Hopefully, the department can get a ton of people to download the Neighbors App from your specific link!”
  • Ring deliberately encourages fear, and these partnerships help achieve it. Again, from the Guardian’s coverage,
    “I just wanted to reach out and say great job with the response you made with neighbors commenting about crime going up etc … That’s an exact comment residents need to see coming directly from the department to put things into perspective.”

Ring encourages people to alert the police whenever they see anyone they view as suspicious, even someone simply passing by their house, and so to be more distrusting of their communities, and of difference within them. On Nextdoor, another neighbourhood watch app, Black people just walking by have been reported to the police.

Even where it is not about pointing suspicion, these cameras normalise filming other people in your community without consent, like delivery or postal workers, whose rights are disregarded. At Halloween, I saw people sharing cute videos of their neighbours trick-or-treating, recorded in that fish-eye viewpoint that marks footage from personal security cameras. The weirdness of posting videos of other people’s kids did not seem to be considered.

What we’ll pay for

The tech press is complicit in selling mistrust

Ring is by no means the only product, or the only way, in which people are no longer just the victims of surveillance, but acting it out themselves.

Amazon knows it can succeed in pushing this product because there is already a breakdown of trust, and exploitable power dynamics, between parents and children, between teachers and children, and between adults and their elderly parents; technology to exploit those dynamics is already on the market.

When it comes to watching children, there is an abundance of apps and tools. In the Reddit forum r/insaneparents, teenagers share stories about their parents using stalkerware to keep tabs on them.

Examples range from parents texting to ask why location sharing is turned off, to using secret cameras to check that homework is being done. Helveticide on Twitter pointed out that the subreddit has grown into one of the top 500. Back in 2016, Pew Research found that, of US parents, “16% use monitoring tools on their teen’s cellphone to track their location”.

There are stories of US summer camps installing facial recognition cameras, and of the children themselves asking camp staff to take photos of them and put them on Facebook so their parents can be assured they are having fun.

Brilliant reporting on school surveillance by Lois Beckett revealed how Gaggle, an app used in schools across the US to surveil children, described its software as “preparing kids to be successful adults” because they will be watched when they are at work too.

Some of the most benevolent-sounding technologies are surveillance tools targeted at middle-aged adults to watch over their elderly parents. “Watch them to check if they fall” sounds thoughtful. In fact, these surveillance tools are often presented as allowing elderly people to retain independence: if you are being watched, you can stay in your own home rather than be moved into A Home. But there is another element to the advertising, a patronising implication that those we care for should not be able to exist unseen.

Marketing mistrust
Using technology to spy on your partner has become commonplace in cases of intimate partner violence; Women’s Aid prominently provide security advice on their site about how to cover your digital tracks. The companies that market this kind of stalkerware, such as Flexispy, encourage abuse on their websites with claims like “women are more likely to cheat” or “you deserve to know what your partner is doing.”

Cynthia Khoo, Kate Robertson, and Ron Deibert at Citizen Lab wrote Installing Fear: A Canadian Legal and Policy Analysis of Using, Developing, and Selling Smartphone Spyware and Stalkerware Applications, an incredible effort to log all the routes to prosecuting those who would use these tools in the context of intimate partner violence in Canada. Yet, as they note in the paper, actions that are illegal when targeted at a spouse are often acceptable when used against dependents and children.

So the same tool that claims a “moral right” to put a keylogger on your partner’s phone has a page for parents, and a page for employers. They offer the same “fear and entitlement combo”: the people you know could be lying to you, you are doing nothing wrong by watching them, and you deserve to watch them.

It’s a package that normalises the denial of consent and antagonistic power dynamics in personal relationships. As each of these products becomes normalised, does stalkerware still stand a chance of being understood as a horrific thing to use?

Paying to surveil

I used to think that freedom from surveillance was a sign of wealth, so seeing huge numbers of people choose to spend that wealth on tools to use against their communities and families has been disturbing.

However, this is not to claim that we are all having some kind of equal experience.

Surveillance is still used by police and institutions of power predominantly against poorer communities. For example, in The Poverty of Privacy Rights, Khiara M. Bridges describes how poor mothers are subject to investigations by the state, and have an experience of surveillance during their pregnancy that is undeniably class- and race-based. Virginia Eubanks’s book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor covers this area extensively as well, and I recommend both of these books on this topic.

There is, however, a feedback loop between this middle-class surveillance of loved ones and neighbours, and the institutional surveillance by police and governments that is targeted at poor communities.

The willing and enthusiastic engagement in these personal models of surveillance creates ever bigger datasets. These are then abused by both corporations and governments. We are back at the start: where a database exists, the government and the police will want to use it. Luke Stark and Karen Levy describe how the “consumer driven forces of surveillance provide technical and social scaffolding for institutional power”. And that scaffolding is getting bigger than ever.

Photo by Dan Gold on Unsplash

What does this do to the watcher?

Luke Stark and Karen Levy argue that the idea of watching as a form of care and responsibility has taken hold, and in a gendered way too. If you are not “watching the kids”, you are being irresponsible. So having the latest app to make that watching more efficient? That’s just good motherhood.

The middle classes are conned into being “middle managers of surveillance”, playing a perfect role in the capitalist system. The marketing encourages customers to spend money to build that system, without suspecting that they are hurting themselves in the process.

Before reading Stark and Levy’s piece I had not considered the question “what does this do to the watcher?”, because I automatically think of the watcher as an abuser, not a victim. But this damages everyone: relationships, communities, and education. As it becomes normal, doing something different gets harder, but feeling an intense pressure to watch all the time is not healthy.

Where next
Through selling mistrust, the ideology of surveillance has become part of everyday interactions for families, for schools and for communities.

However, it doesn’t have to be like this. As Rose Eveleth points out, Google Glass’s early retirement showed us that lines can be drawn. The public said no to being filmed all the time by strangers, and this was a fantastic reclaiming of norms. As new stories break about how Amazon employees listen to pieces of home conversations on Alexa, suspicion is rightly placed on the companies selling the products, not on the people we love.

While products like Ring and Gaggle chip away at the foundations of trust, we need to fight back by establishing that children are also people, that consent is always necessary, and that surveillance is never (for) your friend.

For thoughtful conversations on topics like this, check out the podcast I co-host on technology and feminism, The Intersection of Things.
