New AI system analyses the way you speak to predict relationship outcomes
The system proved better than experts, delivering nearly 80% accuracy in a study involving 134 couples.
Scientists have built an artificial intelligence system that predicts how long a relationship will last from voice recordings, according to a study published in the scientific journal PLOS ONE.
The machine learning algorithm, developed by researchers at the University of Southern California, uses the tone and intensity of your voice – features such as pitch, variation in pitch and intonation – to predict the eventual outcome of your relationship.
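As an illustration only, the sketch below shows how pitch and intensity statistics of the kind the article mentions might be computed from a recording using the librosa audio library; the function name and the specific summary statistics are assumptions for this example, not the researchers' own code.

```python
# Minimal sketch (not the study's actual pipeline): summarise pitch and
# intensity for one voice recording using librosa.
import numpy as np
import librosa

def extract_vocal_features(path):
    """Return simple pitch/intensity summary statistics for one recording."""
    y, sr = librosa.load(path, sr=16000)

    # Fundamental frequency (pitch) estimated frame by frame; unvoiced
    # frames come back as NaN, hence the nan-aware statistics below.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
    )

    # Root-mean-square energy as a rough proxy for vocal intensity.
    rms = librosa.feature.rms(y=y)[0]

    return np.array([
        np.nanmean(f0),        # average pitch
        np.nanstd(f0),         # variation in pitch
        np.mean(rms),          # average intensity
        np.std(rms),           # variation in intensity
        np.mean(voiced_flag),  # proportion of frames with voiced speech
    ])
```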
Though the system considers neither the words spoken nor the speakers' body language, it outperformed human raters in a recent study based on voice recordings of 134 couples in distressed relationships, reaching an accuracy of nearly 80%.
First, the researchers used therapy session recordings of these couples to extract their vocal features. They then fed these features into the machine learning system, along with information on how long each relationship lasted. The algorithm was trained to learn how the vocal features related to those outcomes, taking into account who spoke when, for how long and how their voices sounded.
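To make that training step concrete, here is a hedged sketch of how per-couple feature vectors and outcome labels could be fed to an off-the-shelf classifier with scikit-learn. The logistic-regression model, the file names and the cross-validation setup are illustrative assumptions, not the method used in the paper.

```python
# Illustrative sketch only: fit a simple classifier on per-couple vocal
# features and known relationship outcomes, then estimate its accuracy.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one row of vocal-feature statistics per couple (e.g. from the earlier sketch).
# y: 1 if the relationship lasted, 0 otherwise. Both files are hypothetical.
X = np.load("couple_features.npy")   # shape (n_couples, n_features)
y = np.load("couple_outcomes.npy")   # shape (n_couples,)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Cross-validated accuracy: each fold trains on some couples and is
# evaluated on couples the model has never seen.
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"Mean cross-validated accuracy: {scores.mean():.1%}")
```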
By examining the patterns encoded in the way the couples spoke, the algorithm predicted the outcomes of their relationships with relatively high accuracy. In fact, it did better than experts who assessed not only vocal tone but also the words the partners used: the AI was correct in 79.3% of cases, against 75.6% for the experts.
Beyond giving couples an early warning that their tone of voice may be one reason a relationship is struggling, the finding suggests that the way we speak carries more information about our underlying feelings than we realise.