A new patent that Spotify received earlier this month reveals that the streaming service plans to recommend music by listening to users’ voices.
Spotify, the world’s largest online music service, has patented a technology that lets it make much more personalized music recommendations. According to the patent, filed in 2018 and approved on January 12, Spotify will be able to make recommendations based on a user’s emotional state, gender, age, or accent.
Spotify’s technology is based on analyzing the audio picked up by the microphone of the device running the application. The company, which says it will infer users’ moods and even their surroundings (alone, with a group of friends, or at a party) from these sounds, notes that “it is common practice for media streaming applications to provide features that provide personalized recommendations to users.”
Spotify will analyze users’ emotions by listening to their voice
Spotify considers the current method it uses to provide users with personal recommendations unsatisfactory, because users have to supply a lot of information themselves, such as their age, gender, and favorite artists. The new technology will be able to tell whether a user is happy, angry, sad, or neutral by analyzing their tone of voice, the level of stress in the voice, and the rhythm of their speech.
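To make the idea concrete, a classifier of this kind would extract acoustic features from a recording and map them onto the four moods the patent names. The sketch below is an invented illustration only: the feature choices, function names, and thresholds are placeholders for this article and do not come from Spotify’s patent.

```python
import math

def extract_features(samples, sample_rate):
    """Compute three toy acoustic features from a mono waveform in [-1, 1].

    - energy: RMS amplitude, a rough proxy for loudness/vocal stress
    - zcr: zero-crossing rate, a rough proxy for pitch (tone of voice)
    - rhythm: variance of 100 ms frame energies, a rough proxy for
      how dynamic the speech rhythm is
    """
    n = len(samples)
    energy = math.sqrt(sum(s * s for s in samples) / n)
    zcr = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    ) / (n - 1)
    frame = sample_rate // 10  # 100 ms frames
    frame_energies = [
        math.sqrt(sum(s * s for s in samples[i:i + frame]) / frame)
        for i in range(0, n - frame, frame)
    ]
    mean_e = sum(frame_energies) / len(frame_energies)
    rhythm = sum((e - mean_e) ** 2 for e in frame_energies) / len(frame_energies)
    return energy, zcr, rhythm

def classify_mood(energy, zcr, rhythm):
    """Map the toy features to the four moods named in the patent.

    The thresholds are arbitrary placeholders, not values from Spotify.
    """
    if energy > 0.5 and zcr > 0.05:
        return "angry"    # loud and high-pitched
    if energy > 0.5:
        return "happy"    # loud but lower-pitched
    if energy < 0.1 and rhythm < 0.001:
        return "sad"      # quiet and flat
    return "neutral"

# Example: a loud, high-frequency synthetic tone classifies as "angry"
loud = [0.8 * math.sin(2 * math.pi * 400 * t / 8000) for t in range(8000)]
print(classify_mood(*extract_features(loud, 8000)))
```

A production system would of course use a trained model over far richer features (spectral, prosodic, lexical) rather than hand-set thresholds, but the pipeline shape, features in and a mood label out, is the same.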
Spotify states that metadata such as emotion, gender, age, and accent are only examples, and that many other characterizations and classifications could be used. For example, analysis of the songs a user currently plays and the musical tastes of their friends would also be integrated into the new technology. While the technology may draw more users to Spotify, it also has the potential to stir ethical controversy, and Spotify’s research team is aware of this.
In a caution attached to the patent, Spotify’s R&D team states that the technology should not be put into practice without first working through these ethical questions. Acknowledging that people’s digital footprints are extremely personal and sensitive, the team says that potential abuse must also be taken into account, and that it rejects research and applications that violate ethical values or that are not transparent about users’ privacy.