The official moniker of this year’s Sensors Expo is Sensors Innovation Fall Week. This free 3-day virtual extravaganza will feature dozens of speakers from across the sensors industry helping attendees to define, discuss, and understand design challenges and sensor solutions. Each day will feature networking, keynotes, tech talks, breakout sessions, panels, and more — topic areas include the IoT, Wireless Communications, Industrial Applications, Smart Sensors, MEMS, and Next Generation Innovations. If you have a few minutes to spare, you should register now before all of the good virtual seats are taken.
I’m really excited about this. Why? Well, if you settle down, I’ll tell you. The reason I’m squirming around in my seat is that I just heard from the folks who are organizing Sensors Innovation Fall Week. It seems the little rascals are holding a Spectacular Sensors Smackdown that is open to anyone who has a creative and cunning gadget they wish to flaunt.
What we are talking about here is anything that showcases sensors and switches in action, from highly practical devices to fun and frivolous creations, such as animatronic graveyard displays for Halloween or mechanisms that use motion detection to dispense pet treats, for example.
There are two steps required for you to bask in your 15 minutes of Andy Warhol fame and claim the “Grand Prize,” whatever that turns out to be. The first step is to enter, which involves you filling out a submission form to tell the organizers a little about your gadget — how it works and how sensors and/or switches are used in the design.
Participants for the actual competition, which will be aired online, will be selected from these submissions based on the novelty and engineering ingenuity of their designs. Five candidates will be asked to create 5-minute demonstration videos showcasing their gadgets and gizmos in action.
These videos will be streamed as part of the Spectacular Sensors Smackdown session — which will take place from 3:50 to 4:20 p.m. Eastern Standard Time (EST) on Wednesday 18 November 2020 — at which time the virtual attendees will vote for their favorite.
Yes, of course I’m entering myself (well, not myself, you understand, but one of my cunning creations). How could anyone expect me to resist something called a “Spectacular Sensors Smackdown”?
I have the perfect project in mind. Do you recall my 12×12 ping pong ball array, where each pixel (ping pong ball) contains a tricolor WS2812B LED? As we see in this video, my most recent experiment with this little scamp was to simulate multicolored virtual drips dropping on the array, where each drip is accompanied by a splash effect.
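In case you want to play along at home, driving this sort of array is conceptually simple. The following is just a flavor-giving sketch; it assumes the FastLED library, a serpentine-wired 12×12 matrix of WS2812B pixels, and a data connection on pin 6 (your wiring will almost certainly differ):

```
#include <FastLED.h>

// Assumptions: 12x12 WS2812B matrix, serpentine (zigzag) wiring, data on pin 6
#define WIDTH     12
#define HEIGHT    12
#define NUM_LEDS  (WIDTH * HEIGHT)
#define DATA_PIN  6

CRGB leds[NUM_LEDS];

// Map (x, y) coordinates to an index in the LED string,
// accounting for the serpentine wiring of alternate rows
uint16_t xyToIndex(uint8_t x, uint8_t y) {
  if (y % 2 == 0) {
    return (y * WIDTH) + x;               // even rows run left-to-right
  } else {
    return (y * WIDTH) + (WIDTH - 1 - x); // odd rows run right-to-left
  }
}

void setup() {
  FastLED.addLeds<WS2812B, DATA_PIN, GRB>(leds, NUM_LEDS);
  FastLED.setBrightness(64);
}

void loop() {
  FastLED.clear();
  leds[xyToIndex(5, 5)] = CRGB::Blue;     // light a single "pixel"
  FastLED.show();
}
```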
Did you ever have one of those wooden labyrinth boards when you were a kid? I just found this one on Amazon. You have a wooden box with a maze sitting on top, and on the sides of the box are two knobs you can use to tilt the maze left-right and front-back. The idea is to use the knobs to guide a marble or ball bearing through the maze without it falling through one of the holes.
Well, that’s what I’m planning on doing with my 12×12 ping pong ball array. In this case, my “ball” will be one of the pixels in the array. Holding the array horizontally will cause the “ball” to stay where it is. Tilting the array will cause the “ball” to “roll” until it hits one of the walls or gets trapped in one of the corners where it will stay until the array is tilted in the other direction.

Fortunately, I just happen to have a 9DOF Fusion Breakout Board (BOB) from Adafruit in my treasure chest of parts (I knew it would come in handy for something one day). This bodacious beauty features a BNO055 MEMS sensor from Bosch. In turn, this device boasts a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer.
Even better, the BNO055 also includes a 32-bit ARM Cortex M0+ that performs a bunch of mind-bogglingly complex sensor fusion algorithms for you and presents you with motion and orientation data in a form you can use without your brains leaking out of your ears.
But wait, there’s more because the folks at Adafruit also provide all of the libraries and example code sufficient to enable even a doofus to get things up and running. Truly, my cup runneth over.
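Just to give you a feel for how easy the folks at Adafruit make things, something along the lines of their own example code is all it takes to pull orientation angles out of the device (the I2C hookup is standard, but which angle ends up meaning “roll” versus “pitch” will depend on how the board is mounted on the array):

```
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);  // 55 = arbitrary sensor ID

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected... check the wiring!");
    while (1);
  }
  bno.setExtCrystalUse(true);
}

void loop() {
  sensors_event_t event;
  bno.getEvent(&event);                 // fused orientation in degrees
  Serial.print("Heading: "); Serial.print(event.orientation.x);
  Serial.print("  Roll: ");  Serial.print(event.orientation.y);
  Serial.print("  Pitch: "); Serial.println(event.orientation.z);
  delay(100);
}
```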
I will still have some work to do, of course. Initially, I’ll be happy just to get the ball responding to left-right and front-back tilts in the array with it “rolling” at a constant speed. The next step will be to add some “physics” such that the “ball” appears to experience inertia affecting its acceleration and deceleration, and such that its speed is a function of the amount of tilt.
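For what it’s worth, my first stab at the “physics” will probably look something like the following sketch, in which the tilt angles coming back from the BNO055 drive the acceleration, the velocity provides the inertia, and the walls of the array bring things to a halt (the function and constant names here are mine purely for illustration, and the constants will need tuning):

```
// Rough physics sketch: tilt drives acceleration, velocity carries inertia,
// and the walls of the 12x12 array stop the ball.
const float GRID_MAX = 11.0;   // highest valid x/y coordinate
const float ACCEL_K  = 0.002;  // acceleration per degree of tilt (tune to taste)
const float FRICTION = 0.98;   // velocity damping per update

float ballX = 5.5, ballY = 5.5; // position in "pixel" units
float velX  = 0.0, velY  = 0.0; // velocity in pixels per update

void updateBall(float tiltRoll, float tiltPitch) {
  // More tilt means more acceleration, so speed becomes a function of tilt
  velX += ACCEL_K * tiltRoll;
  velY += ACCEL_K * tiltPitch;

  // A little friction so the ball doesn't accelerate forever
  velX *= FRICTION;
  velY *= FRICTION;

  ballX += velX;
  ballY += velY;

  // Hitting a wall stops the ball in that direction
  if (ballX < 0)        { ballX = 0;        velX = 0; }
  if (ballX > GRID_MAX) { ballX = GRID_MAX; velX = 0; }
  if (ballY < 0)        { ballY = 0;        velY = 0; }
  if (ballY > GRID_MAX) { ballY = GRID_MAX; velY = 0; }
}
```

Rounding ballX and ballY to the nearest integer gives me the pixel to light; for the initial constant-speed version, I can simply replace the acceleration term with a fixed step in the direction of the tilt.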
Next, I can start to add more features, such as colored “food” pixels that randomly appear and disappear. Causing the “ball” pixel to run into a “food” pixel would increase one’s score, where different colors could represent different values. Also, we could add red “hole” pixels; running into one of these would cause the “ball” pixel to disappear. These “hole” pixels could be stationary, or — like the “food” pixels — they could randomly appear and disappear. Alternatively, we could start the game without any “hole” pixels, and then gradually add them to increase the difficulty of the game.
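Again, just to sketch the idea (these names and values are purely illustrative), the “food” and “hole” features boil down to spawning pixels at random locations and checking for collisions each time the “ball” moves:

```
// Game-feature sketch: "food" and "hole" pixels on the 12x12 array
struct Pixel { int8_t x; int8_t y; bool active; };

Pixel food = { -1, -1, false };
Pixel hole = { -1, -1, false };
int   score    = 0;
bool  gameOver = false;

void spawnRandomPixel(Pixel &p) {
  p.x = random(0, 12);   // random column 0..11
  p.y = random(0, 12);   // random row 0..11
  p.active = true;
}

void checkCollisions(int ballCol, int ballRow) {
  if (food.active && ballCol == food.x && ballRow == food.y) {
    score += 10;         // different colors could carry different values
    food.active = false; // eaten; respawn it somewhere else later
  }
  if (hole.active && ballCol == hole.x && ballRow == hole.y) {
    gameOver = true;     // the "ball" pixel disappears
  }
}
```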
I’m so enthused by this that I’ve just dispatched the butler to fetch my winning trousers in anticipation of my forthcoming triumph. How about you? Can you be tempted to enter with a gadget or gizmo of your own, or do you wish to concede defeat now (LOL)?
Sounds like an interesting application for a motion sensor. Much more interesting than my application, which is to detect when something is moving that shouldn’t be. Not to mention that I’m not likely to have time to work on it between now and then.
You need to make some time to do the fun stuff 🙂
How about a darts-type game? You wear the accelerometer board on your wrist with a wireless transmitter, and it logs your hand movement as you throw a virtual dart. You then use that data to calculate where the “dart” would hit and light the pixel on the array accordingly. With nice colour effects, of course. You’d maybe have to calibrate it by holding your hand at the centre of the screen, which would light up to tell you when to move to the throwing position (which it would then know from the accelerometer/gyroscope data from the sensor). Mammoth programming effort though, but you wouldn’t want to do something too humdrum…
I like it — we could combine the accelerometer/gyroscope/magnetometer data with machine vision (facial recognition) so that no matter how badly I threw, I got a bull’s-eye — by comparison, no matter how well someone else threw, they always came one point under me LOL
Coincidentally, I just came across this:
https://www.embedded.com/optimizing-high-precision-tilt-angle-sensing-accelerometer-fundamentals/
Probably stuff you already know, but for me accelerometers etc. are closely akin to black magic (or at least smoke and mirrors…)
“Probably stuff you already know…” I know nothing, Colonel Hogan (I’m making this stuff up as I go along).
Not facial recognition, but recognition of darts, so it’s a matter of supplying the correct set for each person.
https://www.youtube.com/watch?v=MHTizZ_XcUM
O M G This is awesome!!!
It certainly is. I was in a darts league about 40 years ago, and I would have loved something like this. I’d still like to get one for my basement. Of course, I can’t afford it, but …
What amazes me is the mind of someone who would conceive something like that — very very clever 🙂