As you may recall, just a few weeks ago at the beginning of June, I was honored to host the 3-day virtual RT-Thread Global Technology Conference 2023.

Just to remind ourselves, RT-Thread is an open-source, vendor-neutral, community-based real-time operating system (RTOS). Featuring a very small resource footprint, high reliability, and high scalability, RT-Thread can be used in sensing nodes, wireless connectivity devices, and a wide variety of resource-constrained and high-performance applications. With its rich middle-tier components and strong hardware and software ecosystem, RT-Thread can also be considered an IoT platform in its own right, providing almost every key component required for IoT devices, such as network protocols, file systems, and low-power management.

To describe the agenda as a cornucopia of awesome sessions would be to understate the case. One concept that stuck in my mind was when the presenter said, “AI is the new UI” (that is, artificial intelligence is the new user interface). As I mentioned in one of my recent The End of the Beginning of the End of Civilization as We Know It columns, we’re already starting to see this at the browser level. As a result, in the not-so-distant future when browsers are fully equipped with generative AI capabilities (like ChatGPT), searches will become more like conversations.

Of course, it helps to define what we mean by “User Interface.” When they hear this term, a lot of people think of things like keyboards, mice, and display screens. While these are certainly examples of UIs, a more encompassing definition would be “the space where interactions between humans and machines occur.”

When you come to think about it, this covers a whole lot of ground. In my recent column AI-Augmented Earbuds That Read Your Mind, for example, I introduced the guys and gals at a company called GreenWaves whose mission it is to design, develop, and market extreme AI+DSP+MCU processors for energy-constrained devices (in this context, an energy-constrained device may be something tiny, like an earbud that must keep running for a day or more on a single charge).

One of the projects they are working on is earbuds for use in hearing aids. In addition to acting as regular TWS earbuds, these will have sensors that can pick up your brainwaves. Imagine someone who is hard of hearing at a cocktail party where multiple people are talking at the same time. These earbuds will have DSP processing power sufficient to disassemble the sound space into individual voices. Furthermore, the earbuds will be able to monitor the user’s brainwave signals, employ AI to determine which of these voices is of particular interest, and then selectively boost that voice whilst fading back any extraneous sounds, including the other speakers. Pretty amazing, eh (especially when you wrap your brain around the fact that this falls under the UI umbrella)?
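To make the final step of that pipeline a little more concrete, here’s a minimal sketch of the attention-weighted remixing stage. Everything here is an assumption for illustration (the function name, the gain values, and the idea that an upstream EEG decoder hands us per-voice attention scores); it is not GreenWaves’ actual implementation, and the real separation and brainwave-decoding stages are far more involved.

```python
import numpy as np

def remix_voices(voices, attention, boost_db=6.0, cut_db=-12.0):
    """Boost the attended voice and fade the rest (illustrative only).

    voices:    array of shape (n_voices, n_samples), as separated by the DSP
    attention: array of shape (n_voices,) of scores from a hypothetical
               brainwave decoder; higher means "more attended"
    """
    # Give the most-attended voice a boost and everything else a cut...
    gains_db = np.where(attention == attention.max(), boost_db, cut_db)
    gains = 10.0 ** (gains_db / 20.0)  # ...converting dB to linear amplitude
    # Re-mix the separated voices into a single output signal
    return (gains[:, None] * voices).sum(axis=0)

# Toy example: three random "voices"; the listener attends to voice 1
rng = np.random.default_rng(0)
voices = rng.standard_normal((3, 1000))
mix = remix_voices(voices, attention=np.array([0.1, 0.8, 0.2]))
```

In a real hearing-aid pipeline the gains would also be smoothed over time so the listener doesn’t hear abrupt jumps when their attention shifts.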

As another example that’s a bit closer to home, earlier today (as I pen these words) I posted my Say Howdy to High Fidelity Software-Defined Sensing for the 21st Century column. Today’s capacitive touchscreens are slow (they scan each row and column in turn), noisy, require high voltages (a 32” touchscreen requires ~20V), and involve a lot of wibbly-wobbly analog “stuff.” By contrast, my “Say Howdy” column introduces the chaps and chapesses at a company called SigmaSense, whose technology can scan every point on a 32” touchscreen simultaneously at 300Hz using only 0.8V signals. This technology can also sense things happening throughout a 3D volumetric space in front of the display, making it ideal for use with AI-powered gesture identification and recognition.
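A quick back-of-envelope calculation shows why scanning everything simultaneously is such a big deal. The numbers below (row count and per-measurement time) are illustrative assumptions on my part, not SigmaSense specifications; the point is only that a sequential scanner’s frame rate shrinks with the number of rows, while a concurrent scheme pays for just one measurement window per frame.

```python
# Back-of-envelope: sequential row-by-row scanning vs. concurrent sensing.
# ROWS and T_ROW_S are illustrative assumptions, not vendor specs.

ROWS = 64        # assumed number of electrode rows on a large touchscreen
T_ROW_S = 100e-6 # assumed time to excite and measure one row (100 us)

# Sequential scan: rows are measured one after another, so a full frame
# takes ROWS * T_ROW_S seconds.
sequential_frame_hz = 1.0 / (ROWS * T_ROW_S)

# Concurrent sensing: every row is driven at once (each with its own
# distinguishable signal), so a whole frame fits in one window.
concurrent_frame_hz = 1.0 / T_ROW_S

print(f"sequential: {sequential_frame_hz:.0f} frames/s")  # ~156 frames/s
print(f"concurrent: {concurrent_frame_hz:.0f} frames/s")  # ~10000 frames/s
```

With these made-up numbers the concurrent approach is faster by exactly the row count (64x), which is why it can afford to run at high frame rates with low-voltage signals and still have headroom for averaging away noise.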

All I can say is that we truly do live in exciting times. Also, I think things are going to get much more exciting in the not-so-distant future. I only hope that they stay “exciting” and don’t transition into “terrifying” (see also my columns on The Artificial Intelligence Apocalypse).

But enough about me. Let’s talk about you. What do you think about all of this? As always, I welcome your sagacious comments and insightful questions.