Integrate ChatGPT into Siri to make your Apple voice assistant 100x smarter

We’ve all been there: you ask Siri a question, and it responds with the ever-frustrating “Sorry, I didn’t understand that.” It could be an accent or dialect problem, the fact that Siri isn’t trained on the vast volume of data that Google’s AI is, or just that Apple dropped the ball on Siri. Apple launched the voice AI as an app almost 13 years ago, yet even after more than a decade, Siri still feels noticeably dumb and unhelpful. Google’s voice AI is overwhelmingly the more popular choice nowadays, although there’s a new kid on the block that’s eating Google’s lunch, at least in the search department.

Unveiled less than a year ago, ChatGPT from OpenAI took the world by storm with its incredible natural language processing capabilities, hitting a million users in just 5 days and 100 million in just two months (faster growth than social media giants like Facebook, Google, and even Snapchat saw). ChatGPT’s intelligent, human-like responses make it the perfect AI chatbot: it understands natural sentences much better than most other AI tools, and it’s far more likely to respond with a helpful answer than an apology. Developer Mate Marschalko saw this as a brilliant opportunity to integrate ChatGPT’s intelligence with Siri, turning it into a much more helpful voice AI. With a little bit of hackery (which took him only about an hour), Marschalko combined Siri’s voice features with ChatGPT’s NLP intelligence using Apple’s Shortcuts feature. The result? A much better voice AI that fetches better search results, offers more meaningful conversations, and even lets you control your smart home in a much more ‘human-friendly’ way… almost rivaling Tony Stark’s JARVIS in terms of usability. The best part? You can do it too!

Marschalko lays out his entire procedure in a Medium blog post that I definitely recommend checking out if you want to build your own ‘SiriGPT’ too; his approach requires absolutely no coding experience. “I asked GPT-3 to pretend to be the smart brain of my house, carefully explained what it can access around the house and how to respond to my requests,” he said. “I explained all this in plain English with no programme code involved.”
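
To give a flavor of that approach, here’s a heavily abridged, hypothetical sketch of what such a plain-English preamble might look like. The wording and the device list below are my own assumptions, not Marschalko’s actual prompt (his full version is in the Medium post):

```python
# Hypothetical, heavily abridged "smart brain" preamble in plain English.
# The device list and phrasing are illustrative assumptions;
# Marschalko's real prompt is far longer and more detailed.
HOUSE_PROMPT = """You are the smart brain of my house.
You can control the office light, the outdoor lights, the underfloor
heating, the ventilation unit, and the thermostat in each room.
When I ask for something, work out which device I mean and what
to do with it, then respond in plain, natural language."""
```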

The video above demonstrates exactly how Marschalko’s ‘SiriGPT’ works. His home is filled with dozens of lights, thermostats, underfloor heating, a ventilation unit, cameras, and a lot more, making it the perfect testing ground for practically every use case. Marschalko starts by splitting requests into four distinct types, labeled Command, Query, Answer, and Clarify, each with its own process that GPT-3 follows to determine what needs to be done.
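
To illustrate how a classification step like this could work, here’s a minimal sketch that sends a request to OpenAI’s GPT-3 completions API and asks for a small JSON reply tagged with one of the four types. The prompt wording, the JSON schema, and the model choice are all assumptions for illustration; Marschalko’s actual setup lives entirely inside a Shortcut:

```python
# Minimal sketch: classify a spoken request into one of the four
# request types using OpenAI's (legacy) GPT-3 completions API.
# The prompt and JSON schema here are illustrative assumptions.
import json
import openai  # pip install openai

openai.api_key = "YOUR_API_KEY"  # from your OpenAI account

PROMPT = """You are the smart brain of my house.
Classify my request as one of: Command, Query, Answer, Clarify.
Reply only with JSON, e.g. {"type": "Command", "device": "office light", "action": "on"}.

Request: %s
JSON:"""

def classify(request: str) -> dict:
    response = openai.Completion.create(
        model="text-davinci-003",  # the GPT-3 family the article refers to
        prompt=PROMPT % request,
        max_tokens=100,
        temperature=0,             # deterministic output for classification
    )
    return json.loads(response.choices[0].text.strip())

print(classify("Notice that I'm recording this video in the dark. Can you do something about that?"))
# A plausible reply: {"type": "Command", "device": "office light", "action": "on"}
```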

Marschalko’s AI is significantly better at processing indirectly worded commands.

Where the magic really unfolds is in how even indirect requests from Marschalko are understood and translated into meaningful actions by the assistant. While Siri and other AI assistants only respond to direct requests like “turn the light on” or “open the garage door”, GPT-3 allows for more nuanced conversations. In one example, Marschalko says, “Notice that I’m recording this video in the dark, in the office. Can you do something about that?” and the assistant promptly turns on the light, responding with an AI-generated reply instead of a template one. In another, he says, “My wife is on the way driving home, and will be here in 15 minutes. Switch lights on for her outside just before she parks up,” to which the assistant responds, “The lights should be turned on by the time your guest arrives!” That demonstrates two powerful things… A. the ability to grasp a concept as complex as ‘switching a specific light on after a delay of several minutes’, and B. responding in a natural manner that conveys it understood exactly what you wanted done.

Marschalko hooked all this into a shortcut called Okay Smart Home; to invoke it, all he has to do is activate Siri, say the shortcut’s name, and then begin talking to his assistant. The four request types let Marschalko cover practically every scenario: the Command request controls smart home appliances, the Query request asks for the status of an appliance (like the temperature of a room or the oven), the Answer request covers more chat-centric queries like asking the AI for recommendations, suggestions, or general information from across the web, and the Clarify request lets the AI ask you to repeat or rephrase your question if it couldn’t match any of the three previous types.
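
In code terms, those four types boil down to a simple dispatch step once the model has replied. Here’s a rough sketch under the same assumptions as above; the handler logic and the read_state helper are hypothetical, and how a Command actually reaches your devices depends entirely on your own HomeKit or hub setup:

```python
# Hypothetical dispatcher for the four request types. How a Command
# actually reaches your devices (HomeKit, webhooks, a hub API) is
# entirely up to your own smart home setup.

def read_state(device: str) -> str:
    """Placeholder: query your smart home hub for a device's state."""
    return "unknown"

def handle(reply: dict) -> str:
    kind = reply.get("type")
    if kind == "Command":
        # Forward {"device": ..., "action": ...} to your smart home hub here.
        return f"Done: {reply['device']} set to {reply['action']}."
    if kind == "Query":
        # Read back the current state of an appliance or sensor.
        return f"{reply['device']} is currently {read_state(reply['device'])}."
    if kind == "Answer":
        # Chat-style requests: just speak the model's own answer text.
        return reply.get("text", "")
    # Clarify: none of the other three types matched.
    return "Sorry, could you repeat or rephrase that?"
```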

Although this GPT-powered assistant absolutely runs circles around the visibly dumber Siri, it doesn’t come for free: you have to set up an OpenAI account and pay for API usage, which is billed per token. “Using the API will cost around $0.014 per request, so you could perform over 70 requests for $1,” Marschalko says. “Bear in mind that this is considered expensive because our request is very long, so with shorter ones you will pay proportionally less.”
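
That per-request figure is consistent with GPT-3’s davinci pricing at the time (roughly $0.02 per 1,000 tokens, an assumption on my part about which rate applies here), under which $0.014 corresponds to a request of about 700 tokens. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope cost check, assuming davinci pricing of
# roughly $0.02 per 1,000 tokens (OpenAI's published rate at the time).
PRICE_PER_1K_TOKENS = 0.02
tokens_per_request = 700  # a long prompt, as Marschalko notes

cost = tokens_per_request / 1000 * PRICE_PER_1K_TOKENS
print(f"${cost:.3f} per request")              # $0.014 per request
print(f"~{1 / cost:.0f} requests per dollar")  # ~71 requests per dollar
```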

The entire process is laid out in Marschalko’s Medium blog post if you want to learn how to build your own assistant with all its distinct features. And if you’ve got an OpenAI account and want to use the exact assistant Marschalko built in the video above, his Okay Smart Home shortcut is available to download and use with your own API key.

Kohler’s new Alexa-enabled bath-fitting showers you with water as well as music

Kohler seems to have taken a massive liking to bathroom singers with its new showerhead. With a halo-shaped design, the Kohler Moxie Showerhead lets you fit a wireless speaker into its negative cavity, giving you a luxurious Kohler-worthy shower with handpicked (or rather voice-picked) tunes to accompany you as you bathe.

Now the Moxie isn’t a new product. Kohler released the quirky showerhead-plus-speaker combination as early as 2012, but the new update (to be showcased at CES 2020 next week) allows the Moxie to communicate with Amazon’s Alexa voice AI, letting you ask it to play songs (or karaoke tracks), brief you on the news, or order you some more shampoo. The Moxie speaker is detachable and docks right into the torus-shaped showerhead using magnets. It’s also completely waterproof, with an IPX7 ingress rating, and is tuned specifically to work seamlessly over the sound of gushing shower water. The speaker offers a playback time of 6-7 hours (enough to cover a week’s worth of long baths), and along with the showerhead, should be available at the end of this year, costing anywhere from $99 to $229 depending on the options.

Designer: Kohler

Zungle’s Viper 2.0 is redemption for bone-conduction headphones

I’ve always been a proponent of new technology, but if you’ve read my previous pieces on bone-conduction headphones, you’ll know that I’m a skeptic. The technology has a long way to go before it can replace the AirPods in your ears. The earphones I’ve tried before made great promises but failed to deliver, with expensive price tags and audio that clearly didn’t match up to the hype. Bone-conduction earphones are messy, tinny (with a sound heavily skewed toward higher frequencies), and often don’t even align with the bones in front of our ears, because they’re designed like regular headphones when they should be designed completely differently from the ground up.

That’s where Zungle sparked my interest. Adding bone-conducting headphones to eyewear seemed like an innovative strategy, because on paper, it made sense. Headphones come undone and slip out of place, but spectacles barely budge from their position. Spectacles are also a much more covert way to listen to your music without having everyone know, and besides, the wayfarer styling looks rather cool. People with prescription glasses can easily get their powered lenses fitted into Zungle’s bone-conducting musical spectacles.

With its cool-boy wayfarer styling, the Viper 2.0 from Zungle is a complete looker. As far as aesthetics go, there’s little to complain about: reliable build quality, a mercury-mirror lens coating, and an impressively lightweight design. The bone-conducting earpieces rest rather reliably against your sideburns, delivering audio through your temple bones and letting you hear your music as well as the ambient sounds around you. Given the way the earpieces are integrated into the spectacles, they A. seldom slip out of place, and B. don’t need a manual to teach you how to wear them (a problem most newbies face with bone-conducting earphones, oftentimes placing the earpieces INSIDE their ears instead of in front of them). The audio quality is remarkably better than that of other bone-conduction earphones I’ve tried, which can only be a good thing, although the low-end frequencies are still weak, both because of the technology’s constraints and because you’re also hearing plenty of ambient noise around you.

While, like I said earlier, bone conduction has a long way to go before it replaces those AirPods people wear, Zungle’s Viper 2.0 is capable of functionally matching up to them. Right near the hinge, you’ve got controls that let you toggle playback as well as volume, but the Viper 2.0’s pièce de résistance is its Voice A.I. button, which triggers Siri or Google Now right in your spectacles, letting you use voice search from your sunglasses (#SiriInYourSunglasses) while an in-built microphone picks up your voice commands, seamlessly letting you talk to your phone’s native AI the way you would with smart wireless earbuds. In-built Bluetooth 5.0 helps the sunglasses connect and communicate swiftly with your phone, with practically no lag or dropped connections.

The Viper 2.0 comes with a proprietary charger that fits onto the ends of the sunglasses (using rather classy contact points rather than the plebeian MicroUSB solution) and boasts a battery life of 4 hours. A probably under-appreciated detail is that the charging accessory can attach to your spectacles rather comfortably even while you’re wearing them, sitting around the back of your head, obscured from view.

Aside from letting you surreptitiously listen to music while traveling or at the beach (the Viper 2.0 is sweat-resistant), the Viper, with its Voice AI trigger, quite easily replaces the need to wear your AirPods (or Android earbuds) and your sunglasses separately. The audio quality is well suited to mid and high frequencies, working rather well with human voices (simply perfect for podcasts and audiobooks), although one must solemnly swear never to walk into an exam wearing these! The Viper 2.0 also makes a great case for navigation, making it perfect to wear while riding a two-wheeler and having audio directions from your maps app narrated to you. The obvious pro is that not only can you hear cars and other vehicles around you, you also never have to look away from the road and down at a mobile display for guidance… and you can even turn the Zungle Viper 2.0 into a makeshift boombox by simply placing its bone-conducting modules against materials like boxes or containers, letting them work like a rudimentary echo chamber. Let me know when your truly wireless earbuds (or your sunglasses) are capable of being this fashionable, functional, or multi-purpose!

Designer: Zungle

Click Here to Buy Now (YD Readers get a $10 discount using the Coupon Code: 10off)
