Can a computer predict if your partner’s a cheater?
A sophisticated learning algorithm is pretty good at predicting infidelity. And that should be concerning to all of us.
It’s been exactly twenty years since Steven Spielberg’s futuristic action film Minority Report came out in theaters. In the dystopian thriller, members of the “precrime” police force can predict when someone will commit murder and arrest them before they do. The plot twists when a member of the precrime division, played by Tom Cruise, sees that he is going to kill a man he’s never met, and a battle between determinism and free will unfolds.
Though we are still far from the precision of the predictions Spielberg portrayed, increasingly complex algorithms are being used every day to predict human behavior. How you vote, what you buy, which credit card you’ll choose — behind-the-scenes machines track and predict your daily decision-making. And these analyses feed into companies’ marketing campaigns, driving what you’re exposed to in the future.
Harnessing digital data in daily life
In essence, your own tailored world is being created for you by clever statisticians, and the sources of this data are ever expanding. We have phones that track our physical locations and every move we make online. Networks of shopper loyalty programs and credit cards track what we buy. And the applications and technologies we often opt into, from health apps to Apple Watches, even track the workings of our bodies, including our activity levels and heart rates.
In a way that’s slightly reminiscent of Minority Report, this digital trace data is increasingly making its way into the judicial system. Season one of the popular Serial podcast showcased both the utility and downfalls of digital data when Adnan Syed’s cell phone tower pings were first crucial to the state’s case against him in his conviction and then criticized for their unreliability on appeal. But cell phone tower pings are just a small piece of our exceptionally deep web of digital traces.
Data on heart rate and activity from an Apple Watch were used as evidence of time of death in a recent murder trial. And Fitbit heart rate data established the presence of a suspect in another death case, leading to his being charged with murder. Considering the weight given to tangible physical evidence in the courtroom, this trend of using digital traces in litigation is likely to grow.
Outside the courtroom, researchers are also interested in predicting other types of human behavior, and they now have increasingly complex statistical programs in their arsenal to do so.
An algorithm for cheaters?
In a recent study involving researchers from the US, the UK, and Switzerland, the goal was simple: predict who was unfaithful. Building on previous work that used less sophisticated logistic analyses, these researchers found that their random forest machine learning algorithm could predict infidelity better than chance, and they also pinpointed some common characteristics of those who cheat.
Regarding the characteristics of cheaters, their findings were not too surprising. Sexual desire — both partnered and solitary (e.g., masturbation) — was one of the strongest predictors of cheating. Moreover, in line with past research, relationship variables predicted cheating better than individual characteristics did. Those who were more sexually satisfied, more in love, and more satisfied with their relationship overall were less likely to cheat. That said, a surprising number of people who were very satisfied in their relationships cheated anyway. This means that if relationship satisfaction were entered into a cheating model, it might produce quite a bit of error. In other words, the algorithms have shortcomings.
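The study’s actual model and data aren’t public, but the core idea of a random forest — many simple decision rules, each trained on a random resample of the data, then voting — can be sketched in plain Python. Everything below is invented for illustration: the “desire” and “satisfaction” scores, the synthetic ground truth, and the stump-based forest are all toy assumptions, not the researchers’ method.

```python
import random

random.seed(7)

# Purely synthetic data: each person has a sexual-desire score and a
# relationship-satisfaction score (0-10), and the hidden (noisy) rule is
# that cheating tracks desire minus satisfaction.
def make_person():
    desire = random.uniform(0, 10)
    satisfaction = random.uniform(0, 10)
    label = 1 if desire - satisfaction + random.gauss(0, 2) > 0 else 0
    return (desire, satisfaction), label

data = [make_person() for _ in range(2000)]
train, test = data[:1500], data[1500:]

# One "tree", reduced to a decision stump: pick the feature, threshold, and
# direction with the best accuracy on its training sample.
def fit_stump(sample):
    best = None
    for feat in (0, 1):
        for thresh in range(1, 10):
            for above in (0, 1):  # predict `above` when feature > thresh
                acc = sum(
                    (above if x[feat] > thresh else 1 - above) == y
                    for x, y in sample
                ) / len(sample)
                if best is None or acc > best[0]:
                    best = (acc, feat, thresh, above)
    return best[1:]

def predict_stump(stump, x):
    feat, thresh, above = stump
    return above if x[feat] > thresh else 1 - above

# The "forest": 25 stumps, each fit to a bootstrap resample; majority vote.
forest = [
    fit_stump([random.choice(train) for _ in range(len(train))])
    for _ in range(25)
]

def predict(x):
    votes = sum(predict_stump(s, x) for s in forest)
    return 1 if votes * 2 > len(forest) else 0

accuracy = sum(predict(x) == y for x, y in test) / len(test)
base_rate = sum(y for _, y in test) / len(test)
print(f"forest accuracy: {accuracy:.2f} (cheating base rate: {base_rate:.2f})")
```

On this toy data the ensemble beats chance comfortably — which is all the published result claims, and is exactly why the error discussed above matters: “better than chance” still leaves plenty of misclassified partners.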
Yet it’s the promise of these types of analyses that is both fascinating and frightening.
Consider this latest cheating algorithm. Perhaps it isn’t very accurate right now, but it’s a machine learning algorithm, which means it gets updated as it’s exposed to more data. So accuracy might improve significantly if we feed more data into the algorithm. We might improve accuracy even further if we add some temporal self-report measures, like relationship history, and digital traces, like social media behavior. The possibilities are seemingly endless.
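Why would more data help? The intuition can be shown with a deliberately simple, seeded toy (again, nothing here reflects the real study’s pipeline): a learner that estimates a single cutoff score from noisy examples gets a better estimate, and thus better held-out accuracy, as its sample grows.

```python
import random

random.seed(42)

# Hypothetical setup: the "true" rule is that people with a risk score above
# 5.0 cheat, but 20% of observed labels are flipped (noise).
def make_example():
    score = random.uniform(0, 10)
    label = score > 5.0
    if random.random() < 0.2:          # 20% label noise
        label = not label
    return score, label

def fit_threshold(sample):
    # Pick the cutoff (0.0, 0.1, ..., 10.0) with the best training accuracy.
    candidates = [t / 10 for t in range(0, 101)]
    return max(candidates, key=lambda t: sum((s > t) == y for s, y in sample))

def accuracy(threshold, sample):
    return sum((s > threshold) == y for s, y in sample) / len(sample)

test_set = [make_example() for _ in range(2000)]
for n in (20, 200, 20000):
    model = fit_threshold([make_example() for _ in range(n)])
    print(f"trained on {n:>6} people: threshold {model:4.1f}, "
          f"test accuracy {accuracy(model, test_set):.3f}")
```

With 20 examples the estimated cutoff wanders; with 20,000 it converges on the true 5.0 and accuracy approaches the ceiling the label noise allows. Real machine learning models behave the same way in spirit, which is why feeding an algorithm more (and richer) data tends to sharpen its predictions.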
And for behaviors such as infidelity, an algorithm with good predictive validity might be in high demand. Millions of people cheat. Estimates of the prevalence of infidelity vary wildly, with anywhere from 20% to 80% of adults reportedly engaging in infidelity in their lifetimes. And it often has devastating effects on a couple. Trust is broken. Someone is usually deceived. And the vows of fidelity that signify commitment are often so strained that conflict and relationship dissolution are common outcomes.
So if you could avoid all that by using a simple algorithm to calculate the risk, would you? And if you found out that your partner had, say, a 90% chance of cheating, what would you do? Would you even want to know in the first place?
The future of algorithms is upon us
You may have to make this decision sooner than you think. These inventions are already here. And like many of our technological innovations, the science that enables us to create and apply these algorithms has emerged before the science that measures their effects on humanity. This is unfortunate, but necessarily so: you can’t measure the effects of a technology before it’s invented. This is where visionaries like Spielberg come into play. There are enough examples of dystopian futures of humans and tech in mainstream media for us to start thinking seriously about digital ethics.
The court systems are already addressing some of these issues. Last fall, a federal judge ruled that only a human, and not an AI machine, can be the owner of a US patent. And state laws are being created to address issues related to self-driving cars. Prominent scholars, like Nathalie Rébé and Ryan Calo, are also weighing in on AI and robotics policy and ethics. They suggest that we need to make active decisions about how much AI and robotics will be integrated into modern society and what laws and regulations need to be shaped around this integration.
At first blush, the use of complex algorithms to predict cheating might seem harmless as compared to a robot owning a patent or self-driving cars that can make their own decisions. But for those of us who saw Minority Report, these “harmless” algorithms might be the scariest inventions of all.