The saying goes that our eyes are the window to the soul. Perhaps over time they’ll serve a less romantic purpose, as windows to making money.
Researchers at Carnegie Mellon University in Pittsburgh, one of the leading institutions for artificial-intelligence research, have embarked on a study using facial-recognition algorithms to track the expressions of traders. Their goal: finding correlations between mood swings and market swings. If the traders look enthusiastic, it might be time to buy. Are there more furrowed brows than usual? Could be time to sell. A provisional US patent application for the system was filed on Sept. 13, 2022.
“The market is driven by human emotions,” says Mario Savvides, the project’s lead scientist. “What came to us is, can we abstract things like expression or movements as early indications of volatility? Everyone is getting excited, or everyone is shrugging their shoulders or scratching their head or leaning forward… Did everyone have a reaction within a five-second time frame?”
The main phase of the study will take place over 12 months beginning in the third quarter of 2023 and will involve about 70 traders at investment firms, most of them in the US. They’ll all have cameras mounted on their computers to record their faces and gestures throughout the day, according to Savvides. The cameras will be linked to software from Oosto, an Israeli company formerly known as AnyVision Interactive Technologies Ltd., which hopes to develop an alert system for trends in traders’ faces, or a volatility index it can sell to investment firms.
Oosto, which makes facial-recognition scanners for airports and workplaces, declined to name the firms in the study but said those companies would get early access to any new tool spun out of the research. Footage of each trader will stay on their own computer or at their firm’s premises; only numerical data representing their expressions and gestures will be uploaded to the researchers.
The system maps a person’s face as 68 distinct points that frequently change position, according to Savvides, who co-authored a study on facial “landmarks” in 2017.
His system will also track a trader’s gaze to see if they’re talking to a colleague or looking at their screen, and note if their peers are doing the same thing. “We have a whole toolbox of searching algorithms that we’ll be testing to see if they correlate to a market signal,” said Savvides. “We are searching for needles in a haystack.”
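To make the mechanics concrete, here is a minimal sketch of the general idea in Python; it is not the researchers’ actual pipeline, which has not been published. It leans on dlib’s open-source 68-point landmark model as a stand-in for whatever software Oosto builds, turns frame-to-frame landmark movement into a crude per-trader “reaction” score, and then checks whether the group’s averaged reactions track a volatility series. The model file name, the movement metric and the rolling-correlation step are all illustrative assumptions.

```python
# A sketch of the general idea only; the study's actual algorithms are not public.
import dlib                # open-source face detector and 68-point landmark predictor
import numpy as np
import pandas as pd

detector = dlib.get_frontal_face_detector()
# Pre-trained 68-landmark model distributed with dlib's examples (assumed to be on disk).
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmarks(frame_rgb):
    """Return a (68, 2) array of landmark coordinates for the first face found, else None."""
    faces = detector(frame_rgb, 1)
    if not faces:
        return None
    shape = predictor(frame_rgb, faces[0])
    return np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)], dtype=float)

def reaction_scores(frames_rgb):
    """Mean landmark displacement between consecutive frames -- a crude 'reaction' signal."""
    prev, scores = None, []
    for frame in frames_rgb:          # frames as RGB numpy arrays
        pts = landmarks(frame)
        if pts is not None and prev is not None:
            scores.append(np.linalg.norm(pts - prev, axis=1).mean())
        if pts is not None:
            prev = pts
    return pd.Series(scores)

def rolling_link(group_reaction: pd.Series, volatility: pd.Series, window: int = 300) -> pd.Series:
    """Hypothetical downstream step: rolling correlation between the traders' averaged
    reaction signal and a realized-volatility series sharing the same time index."""
    return group_reaction.rolling(window).corr(volatility)
```

In this framing, the five-second synchrony Savvides describes would show up as a simultaneous spike across traders’ reaction series just before or after a move in the volatility measure.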
Advertisers already use facial analysis to study how exciting an ad is, while retailers use it to see how bored customers are and hiring managers use it to determine, rather creepily, whether a job candidate is enthusiastic enough.
The stock market study at first glance seems more dystopian. Trading algorithms have for years tried to harness information from the weather, social media or satellites, but there’s something a little demeaning about the traders themselves being exploited for data. The researchers are also arguably putting traders into a never-ending feedback loop in which their actions and decisions become derivative and their notoriously lemming-like behavior is amplified. If you thought the market was already driven by a herd mentality, this will probably make it worse, though that’s also how the market works.
“Everyone on the street talks,” says one trader in London (not part of the study) who said they’d find such alerts about their peers’ sentiment useful. “The whole part of doing what we do is discuss ideas and share information… Non-verbal communication is massive.” Years back, trading floors were loud places where people would often talk on three or four phone lines at the same time; now many communicate over chat rooms and talking is minimal.
But the study also points to another uncomfortable phenomenon: Facial recognition is here to stay and its more controversial cousin, facial analysis, might be as well. For all the concern that has bubbled up around facial recognition, including over the mistakes it can make as a surveillance tool, tens of millions of us still use it unhesitatingly to unlock our phones.
Facial analysis, like the kind being used by Carnegie Mellon, opens a bigger can of worms. Last summer, Microsoft Corp. vowed to eliminate its facial-analysis tools, which estimated a person’s gender, age and emotional state, admitting that the system could be unreliable and invasive.(1) That might not matter much to traders, who are eager to lap up whatever data they can for an edge. But this study, if successful, could embolden research into analyzing faces for other purposes, like assessing someone’s emotional state during a work meeting.
“If you’re doing a business deal over Zoom, can you have an AI read the face to tell if someone is calling your bluff, or being a hard negotiator?” asks Savvides. “It’s possible. Why not?”
Zoom Video Communications Inc. introduced a feature last year that tracks sentiment in a recorded work meeting. Called Zoom IQ, the software, which is targeted at sales professionals, gives meeting participants a score between 0 and 100, with anything over 50 indicating greater engagement in the conversation. The system doesn’t use facial analysis; instead it tracks conversational cues such as how long a speaker waits to respond, and offers its score at the end of the meeting.
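Zoom hasn’t said publicly how the number is computed, but the shape of the output described above, a 0-to-100 score with 50 as the engagement threshold, is easy to illustrate with a toy formula built from two of the cues mentioned here: how much a participant talks and how long they wait before responding. The weights and the ten-second cap below are arbitrary assumptions, not Zoom IQ’s method.

```python
def engagement_score(talk_seconds: float, meeting_seconds: float,
                     avg_response_delay_s: float, max_delay_s: float = 10.0) -> float:
    """Toy 0-100 engagement score; not Zoom IQ's actual (unpublished) formula."""
    talk_ratio = min(talk_seconds / meeting_seconds, 1.0)                     # share of the meeting spent talking
    promptness = 1.0 - min(avg_response_delay_s, max_delay_s) / max_delay_s   # quicker replies score higher
    # Equal weighting is an arbitrary assumption; a result above 50 would read as "engaged".
    return round(100 * (0.5 * talk_ratio + 0.5 * promptness), 1)

# Example: talks for half the meeting, replies within ~2 seconds on average -> 65.0
print(engagement_score(talk_seconds=900, meeting_seconds=1800, avg_response_delay_s=2.0))
```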
More than two dozen rights groups have called on Zoom to stop working on the feature, arguing that sentiment analysis is underpinned by pseudoscience and is “inherently biased.” A spokesperson for Zoom said the company still sells the software, and that it “turns customer interactions into meaningful insights.”
You can argue that Carnegie Mellon’s researchers shouldn’t care what their facial-analysis tool tells them about traders’ emotions; they just need to spot the patterns that point to correlations and feed those numbers into their searching algorithms. But the downside of turning emotions into a number is just that: It risks devaluing one of the most fundamental features of being human. It might be better if it doesn’t catch on.