Being a smart home is no longer enough. At CES 2018, we are seeing truly smart products powered by Artificial Intelligence (AI). Forget simple voice commands or setting up routines with smart home appliances, devices or lighting. AI communicates with the cloud, detects and learns your patterns, then automates your home based on your habits. It knows which household member is speaking to it and adjusts to each person’s preferences. Like an efficient assistant (think Radar on M*A*S*H), it can predict what you want and need even before you ask for it, making your environment comfortable and automating tasks without you telling it what to do.
ThinQ AI by LG
LG has committed to AI in a big way this year, announcing that all of its appliances can be controlled by its ThinQ AI. Using LG’s DeepThinQ 1.0 technology, devices can recognize your voice and, through video, sensor and human body detection, learn and refine tasks and settings through analysis over time. Say “Good morning,” or lock a door as you leave, and it can complete a number of tasks, like turning off lights, running the robotic vacuum while you are out of the house and turning on the air purifier before you return.
LG washers learn how you like to wash certain types of clothing and automatically apply those settings. Your washer would then talk to the dryer, which has learned how you like to dry each type of load. Say “wash sports clothing” and your running shorts won’t accidentally shrink in the dryer. The washer and dryer also communicate with the cloud to make adjustments to the wash and dry cycle based on changes in the weather and air quality, conditions most of us wouldn’t even consider when starting our laundry. And all of the commands can be spoken to the LG CLOi Hub Bot (shown at the top).
With AI, LG air conditioners know who is in the room and can adjust the temperature based on their preferences. You’ll always be comfortable and you never have to think about it.
The refrigerator is one of the main hubs for ThinQ. The touch panel in the door can show what you have in the refrigerator, put items on a shopping list and show expiration dates for foods in the fridge. Ask the refrigerator what is expiring soon to use up food before it goes bad. Need ideas? The fridge can call up recipes based on the food you have on hand and your individual past food preferences.
In the car, ThinQ on your phone tells you what you need at the grocery store so you won’t forget to buy eggs on your way home. It will also be able to read the expression on your face. While you’re driving, it will detect when you are getting drowsy and alert you to wake up. Eventually, ThinQ will be able to play music and adjust the temperature in the car based on your preferences.
Bixby Controls Samsung’s Smart Home
Samsung has committed to its SmartThings smart home family of products to make controlling its appliances and devices even easier using AI via Bixby voice control. Bixby recognizes each person in your household and gears its response to an individual’s account or preferences. You’ll be able to control Samsung’s appliances and SmartThings connected devices using a new SmartThings phone app (available this spring) or through the Family Hub refrigerator or Samsung TV. Simply speak your command to whatever you are closest to—phone, fridge, or TV—and you can control your smart home.
The SmartThings app will also be able to set up a new Samsung TV, download streaming apps and sign in automatically. No more typing on the onscreen keyboard or navigating to activation websites. The Samsung cloud will have all of your information, so you simply need to accept the login to Hulu or Spotify on the SmartThings app and the service will be downloaded and activated on the TV. Like the Family Hub refrigerator, the TV can display who is at the front door, bring up the picture from the baby monitor, be instructed to start the washer or dryer, or display recipe options that you might want to make for dinner.
Viaroom Home Learns Your Routines
Viaroom Home is a self-learning home controller that is paired with a cloud service to sync all of your smart devices. When you first set it up, it studies your smart home habits for 48 hours, then anticipates what you will do next (don’t you hate being so predictable?). It creates a precise map of each room, paying attention to lighting, heating and appliance control. No professional installation is required. If your family is out of the house every day by 8 am, Viaroom might mimic your habits and turn off the lights, lower the temperature on the thermostat and lock the front door.
After the initial setup period, Viaroom monitors for patterns and routines but is able to distinguish a “one-time only” occurrence, like turning on a lamp, from a daily habit. There’s no need to worry about hackers learning your habits or controlling your home, as Viaroom has a specially designed firewall to protect your smart home.
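Viaroom hasn’t published how its learning works, but conceptually, telling a daily habit apart from a one-off event can be as simple as counting how often an action recurs at the same time of day. A minimal, hypothetical sketch of that idea (all names and the threshold are invented for illustration):

```python
from collections import defaultdict

# Hypothetical rule: an action must recur at the same hour at least
# this many times during the observation period to count as a habit.
HABIT_THRESHOLD = 3

def learn_habits(event_log):
    """Return the set of (hour, action) pairs seen often enough to be
    treated as routines. One-time events never reach the threshold.

    event_log: list of (hour, action) tuples collected while observing
    the household, e.g. (8, "lights_off").
    """
    counts = defaultdict(int)
    for hour, action in event_log:
        counts[(hour, action)] += 1
    return {key for key, n in counts.items() if n >= HABIT_THRESHOLD}

# Three mornings of "leave by 8 am" behavior plus one lamp toggle:
log = [
    (8, "lights_off"), (8, "thermostat_down"), (8, "door_lock"),
    (8, "lights_off"), (8, "thermostat_down"), (8, "door_lock"),
    (8, "lights_off"), (8, "thermostat_down"), (8, "door_lock"),
    (22, "lamp_on"),  # one-time event, filtered out
]
habits = learn_habits(log)
```

With this log, the 8 am actions qualify as habits while the single lamp toggle does not, which is the distinction the Viaroom is described as making.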
So far, Viaroom is compatible with Philips Hue and Fibaro lighting, Qubino shutters, Aeotec smart plugs, Yale door locks and Foscam security cameras. Along with AI, Viaroom can be controlled by Google Assistant and Amazon Echo.
ViaRoom Home will be available for pre-order on January 30, 2018 for $119.
Adding Emotion to AI Assistants
AI doesn’t have to be cold-hearted and impersonal. We are starting to see companies bring an emotional element to our interactions with robots and digital devices. For instance, LG has given its CLOi robot a cheerful personality, and Sony is adding emotions and a unique personality to each of its next-generation Aibo robotic dogs. On a larger scale, EmoShape wants to give digital assistants, robots and computers a personality.
With EmoShape’s Emotion Processing Unit (EPU), personal assistants, games, and avatars can have twelve primary emotions—excite, confident, happy, trust, desire, fear, surprise, inattention, sad, regret, disgust and anger—as well as pain, pleasure, frustration and satisfaction. Very soon, the emotions involved in playing a game will not only be felt by the players, but also by the game itself. This will have a serious impact on the entire strategy of a game.
What’s more, the EmoShape EPU is not only capable of understanding and showing emotion, but the chip is also able to control the facial expressions and body language of a robot or an avatar on a computer screen. It brings to mind the movie Her, in which a man falls in love and has a relationship with his computer.
Should AI have an emotional component?
While AI is great, adding emotion to our smart devices may not turn out to be a good idea. I don’t want to hear flak from my refrigerator when I pull out the ice cream, or have my voice-controlled speaker revolt like HAL in 2001: A Space Odyssey. It will be interesting to see how devices with artificial intelligence react when they get emotional. What we do know is that AI is here and will be part of smart homes starting this year.
[images: LG, Samsung, Viaroom]