Smart speaker reads emotions to play music
Work & Lifestyle
Moodbox is an emotionally intelligent speaker system that uses artificial intelligence to learn which music suits its users best.
We have already seen headphones that stimulate nerves to let users feel their music. Now, a new speaker system claims to read users' moods to learn what tunes they want to hear.
Moodbox uses its artificial intelligence system to work out what music users most want to hear. Users interact with the speaker by voice, and can keep a diary of their moods using its smartphone app. By listening to what users say and to their tone of voice, the system infers what music to play. Then, depending on the situation or time of day, Moodbox adjusts its choices, reacting, for example, to how the user feels first thing in the morning or when they arrive home from work.
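To make the idea concrete, a rough sketch of this kind of context-aware selection might look like the Python below. All names, playlists and rules here are illustrative assumptions, not Ivo's actual system or API.

```python
from datetime import datetime

# Illustrative mood-and-context playlist picker; Moodbox's real logic is proprietary.
PLAYLISTS = {
    ("tired", "morning"): ["gentle acoustic", "soft piano"],
    ("happy", "evening"): ["upbeat pop", "funk"],
    ("stressed", "evening"): ["ambient", "downtempo"],
}

def time_of_day(now: datetime) -> str:
    """Bucket the clock time into a coarse context label."""
    return "morning" if now.hour < 12 else "evening"

def pick_tracks(mood: str, now: datetime) -> list[str]:
    """Choose a playlist from the diary-logged mood and the current time of day."""
    return PLAYLISTS.get((mood, time_of_day(now)), ["user favourites"])

print(pick_tracks("tired", datetime(2016, 4, 8, 7, 30)))  # e.g. gentle morning picks
```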
Hong Kong-based developer Ivo says Moodbox runs on an emotional intelligence program known as Emi. Emi learns different moods and adjusts music and lighting accordingly. The program uses AI to analyze the user's voice and musical preferences, then picks out fitting tracks from millions of songs and lyrics. The speakers are wireless and fully voice-controlled, and several Moodboxes can be set up around a house and linked together to play music in sync.
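As a hedged sketch only, an Emi-style layer could be imagined as something like the following: infer a mood from simple voice features, then set music and lighting together. The features, thresholds and scene names are hypothetical stand-ins, not how Emi actually works.

```python
# Hypothetical Emi-style layer: infer a mood from simple voice features,
# then pair a music style with a lighting scene. Not Ivo's implementation.
from dataclasses import dataclass

@dataclass
class VoiceFeatures:
    pitch_hz: float      # average speaking pitch
    energy: float        # loudness, 0.0 to 1.0
    speech_rate: float   # words per second

def infer_mood(v: VoiceFeatures) -> str:
    """Very rough rule-based stand-in for the AI voice analysis described above."""
    if v.energy > 0.7 and v.speech_rate > 3.0:
        return "excited"
    if v.energy < 0.3 and v.pitch_hz < 150:
        return "tired"
    return "calm"

SCENES = {
    "excited": {"music": "high-tempo", "lighting": "bright warm"},
    "tired":   {"music": "slow ambient", "lighting": "dim soft"},
    "calm":    {"music": "user favourites", "lighting": "neutral"},
}

mood = infer_mood(VoiceFeatures(pitch_hz=130.0, energy=0.2, speech_rate=1.8))
print(mood, SCENES[mood])  # -> tired {'music': 'slow ambient', 'lighting': 'dim soft'}
```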
Moodbox has already surpassed its crowdfunding goal on Indiegogo, raising more than USD 50,000 from nearly 300 backers.
How else can artificial intelligence be used to improve the way we listen to and discover new music?
8th April 2016
Email: info@ivo.hk
Website: www.mymoodbox.com