A library full of boilerplate maybe becomes useful?

A while back I wrote an Arduino library to handle the tedious bit you add to every piece of code you write for an ESP8266/ESP32: trying to connect to the WiFi and waiting for it to happen, or not.

This is the most basic stuff and it's in almost everything I make; frankly, it's in almost everything anybody ever makes with an ESP and Arduino. Yet you end up rehashing it, retyping it or copy & pasting it. Again and again and again.

So I put it in a library.

It feels like an act of fraud. It's not a fancy connection manager with credential storage. It just tries to connect and feeds back the result. It can optionally stick out a load of diagnostic stuff on the Serial port, which is nice but it still felt like overkill.

Now, inspired by some of Larry Bank's work, I've made it geolocate the device through the https://ipapi.co/ API, and it handles setting the timezone and the system time using NTP.

That is useful as a one-line addition to any network connected project.

A lot of the time people don't bother with their devices 'knowing' the time themselves but I'm a big fan of it and I often use it somehow in my projects.
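For anyone curious what the timezone handling involves: ipapi.co's JSON response includes a utc_offset field in ±HHMM form, which needs converting to a signed number of seconds before it can be fed to something like configTime() on the ESP. A minimal sketch of that conversion in plain C++ (the function name is my own, not the library's actual code):

```cpp
#include <string>
#include <cstdlib>

// Convert an ipapi.co style "utc_offset" string (e.g. "+0100", "-0530")
// into a signed offset in seconds, the form configTime() expects on the
// ESP8266/ESP32.
long utcOffsetToSeconds(const std::string& offset) {
    if (offset.size() < 5) return 0;  // malformed input, fall back to UTC
    int sign = (offset[0] == '-') ? -1 : 1;
    int hours = std::atoi(offset.substr(1, 2).c_str());
    int minutes = std::atoi(offset.substr(3, 2).c_str());
    return sign * (hours * 3600L + minutes * 60L);
}
```

So "+0100" becomes 3600 and "-0530" becomes -19800.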

Useful Sensors Person Sensor testing

One of the things I'd quite like to do with my Rover project is have it detect when a person is in front of it and aim a camera at them. The long term goal is that it's a kind of telepresence device, but this is also another way to avoid collision with people and I want as many checks on that as possible.

When these very cheap sensors from Useful Sensors appeared I grabbed a couple.

Their USP is that they offload all the machine-learning face detection onto a dedicated module you can simply poll for readings, and it will even learn specific faces. In practice I found them very prone to false positives, so I left them sitting unused for a month or two after an initial burst of enthusiasm.

At LARPCon I had a chat with a droid builder who'd brought along one of their in-progress projects, a part-finished B2EMO replica.

This droid has a prominent 'eye' camera, although in practice it's just a drone camera much like I've been experimenting with.

This made me think of the Person Sensor again and have another go with it. For testing I attached it to the cheap pan/tilt setup I have the drone camera in and started trying to filter out the false positives from the Person Sensor.

It became clear pretty quickly that the false positives are very 'bursty', i.e. you'll get 2-3 false detections in a row, then they disappear. With detection happening at around 5Hz, ignoring anything with fewer than four consecutive positives seems to kill pretty much all of them without adding much latency.
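The filter is nothing clever, just a consecutive-positive counter. A minimal sketch of the idea (the class name is my own, and this isn't the exact code I'm running):

```cpp
#include <cstdint>

// Debounce filter for the Person Sensor's bursty false positives:
// only report a detection after N consecutive positive readings.
// At roughly 5Hz, requiring 4 in a row adds under a second of latency.
class DetectionFilter {
public:
    explicit DetectionFilter(uint8_t required) : required_(required) {}

    // Feed one raw reading; returns true once the streak is long enough.
    bool update(bool detected) {
        streak_ = detected ? static_cast<uint8_t>(streak_ + 1) : 0;
        return streak_ >= required_;
    }

private:
    uint8_t required_;
    uint8_t streak_ = 0;
};
```

A burst of two or three positives never reaches the threshold, so it's simply ignored.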

I then set about making it track the 'best' face and move the pan/tilt setup. After a bit of faffing with preventing constant hunting due to lag I eventually got decent enough tracking so long as the lighting is favourable.
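The fix for the constant hunting was essentially a deadband around the frame centre plus a small proportional step. A rough sketch of one axis, with illustrative numbers rather than the project's actual tuning:

```cpp
// One axis of a simple proportional tracker with a deadband, the sort of
// thing that stops a laggy pan/tilt setup constantly hunting when the
// face sits near the centre of the frame. Returns the servo adjustment
// to apply this update (0 means "close enough, don't move").
int panStep(int faceX, int frameCentre, int deadband, int gainDivisor) {
    int error = faceX - frameCentre;
    if (error > -deadband && error < deadband) {
        return 0;  // inside the deadband: hold position
    }
    return error / gainDivisor;  // move a fraction of the error each update
}
```

Only chasing a fraction of the error each cycle keeps the mechanism from overshooting a target that has already moved by the time the reading arrives.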

Yes, you could do this better with a powerful SBC like a Jetson Nano, but this is a standalone $10 module doing all the work. I made a little video of it tracking in my cellar. Beware the static on the audio channel; the video comes from a USB 5.8GHz receiver and the camera has no microphone.

I think this works passably OK now so I'll be investigating this further and seeing about making a more stable pan/tilt mechanism. I may also wrap the Person Sensor functions up into an Arduino Library for the benefit of others. It's not complicated to use but abstracting stuff into a re-usable library is better.



LARPCon 2023

This weekend I helped crew the UKLTA stand at LARPCon 2023, which as the name suggests is a convention dedicated to LARP.

In the UK we don't really have a 100% definitive 'must do' LARP convention but LARPCon is the closest we have.

A lot of it is dedicated to traders, and there's a heavy 'fantasy and foam weapons' slant from both the attendees and the stands, which made our representation of sci-fi/modern/light mil-sim games using Lasertag weapons something unusual.

Overall though it was a very positive thing to do. I ended up spending a chunk of time talking to people with similar interests and had taken a few of my interactive props along. This was a bit of a last minute decision but I'm glad I did.

Old things like ORAC and my Enigma machine were cooed over; they make great 'showboat' props because they fill a small table and are immediately recognisable. So they did the job of helping get people over to talk to us.

I ran into a bunch of people interested in making interactive props who asked for my contact details and I'm hoping they get in touch after the event.

Hoverboard motor powered rover: Part 6

Things seem to have slowed down with work on my rover as I keep disappearing on 'side quests' and one of those was to make a better controller for this and any other projects that need a remote controller.

So I cooked up a 3D printed design built around an ESP32, with some 3-axis sticks and as many buttons as I could manage. It's partly inspired by the 3D printed controllers James Bruton and Keegan Neave have made.

With a screen I happened to have fitted, and the encrypted two-device ESP-Now library I'm working on (another side quest) being bidirectional, it should be able to show lots of information about the rover while it's in use: battery voltage, location and so on.

This might be of use to others (although the design's quite specific to that screen) so I've made it available on GitHub and christened it Sticky.

LD2410 radar sensors

As I worry about safety around my rover I've been looking for a way to detect people that doesn't rely on anything complicated like AI image analysis and can be done with a simple cheap sensor.

By chance, Andreas Spiess did a brief review of the new Hi-Link LD2410 24GHz radar sensors, which are designed for presence detection.

They look promising and aren't expensive, so I bought a few along with the dedicated UART breakout board made for them.

My little bit of testing suggests they're really quite good at detecting people within a few metres, whether moving or not, and people are rarely completely still. At close range they can tell if you're breathing.

While you can just connect one of the pins to a GPIO to signal presence, they use a serial protocol for configuration and more detailed readings. However, as they're comparatively new devices, there wasn't a library for this.

Between reading the work somebody has started on integrating them into ESPHome and the slightly confusing manufacturer's datasheet, I've cooked up an Arduino library and submitted it to the Library Manager. I hope people find it useful.
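For anyone wanting to roll their own, the gist of the serial protocol as I read the datasheet: data frames are delimited by F4 F3 F2 F1 ... F8 F7 F6 F5, while command/ACK frames use FD FC FB FA ... 04 03 02 01. A minimal sketch of finding a data-frame header in a receive buffer (my own helper, not taken from the library):

```cpp
#include <cstdint>
#include <cstddef>

// Scan a receive buffer for the start of an LD2410 data frame. Per the
// datasheet (as I read it), data frames begin F4 F3 F2 F1 and end
// F8 F7 F6 F5; command/ACK frames use FD FC FB FA ... 04 03 02 01.
// Returns the index of the first data-frame header, or -1 if none found.
int findDataFrame(const uint8_t* buf, size_t len) {
    static const uint8_t header[4] = {0xF4, 0xF3, 0xF2, 0xF1};
    for (size_t i = 0; i + 4 <= len; i++) {
        if (buf[i] == header[0] && buf[i + 1] == header[1] &&
            buf[i + 2] == header[2] && buf[i + 3] == header[3]) {
            return static_cast<int>(i);
        }
    }
    return -1;
}
```

Resynchronising on the header like this matters because the UART stream can start mid-frame when you first open the port.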

Now I need to make some breakouts for them (they use 1.27mm headers) and attach them to my rover.


Hoverboard motor powered rover: Part 5

I've slowly been chipping away at adding sensors to the rover and it's at a point where I really need to dig deep into the Robot Operating System (ROS) and start tying it all together.

The Wyse 5060 thin client has been screwed to an old VESA TV bracket, which works OK but tends to come loose under vibration. Given I primarily want to make the rover drive around outdoors this urgently needs fixing.

The Kinect has been cut down, using information from this blog post so it can be screwed directly to the top deck and this works really well.

I threw a bit of effort at software, and the sonar rangefinders are now an I2C peripheral of the Wemos D1 Mini that talks to the remote control and sends commands to the motor controllers. I also found an old MPU6050 accelerometer and a nice tilt-compensated compass to give reliable orientation information, and put these on the I2C bus.

The 5060's original SSD was too small to be practical at only 8GB. I happened to have a larger one in an unused machine and bought a cheap adaptor to make it fit physically, so the onboard computer now runs a full copy of Ubuntu and it's possible to remote-desktop into it and test the Kinect 'live'. A couple of USB WiFi dongles, while not ideal, connect it to my home network indoors and provide a 'hotspot' for working on it elsewhere.

With the D1 Mini connected over USB to the 5060 it should be possible to use rosserial to pass all the I2C-connected sensor information, plus detail from the motor controllers' sensors, back to ROS while accepting 'twist' commands to move. I do vaguely wonder if I should move to an ESP32 though: the D1 Mini is rather busy with synchronous tasks that could do with being done on a second core, and more importantly, hardware UARTs to talk to the motor controllers would be far better than the software serial library I'm using.

I've also plugged a USB GPS dongle I had in my box of project bits into a spare port.

Which means I have close to a full 'sensor suite' for a basic rover. The Kinect won't work outdoors and the GPS won't work indoors, so I still need to come up with something to cover these complementary gaps and do a load of work on sensor fusion, but learning about this topic is the whole point of this project.

I see LIDAR and custom made wheel tick sensors built into the hoverboard motors in my future. The former will give some outdoor obstacle location for mapping and the latter help with dead reckoning indoors. Dead reckoning is not going to work on loose surfaces, but will do OK on nice indoor flat floors and be backed up by the Kinect.

As this very much got me to a solid milestone on the rover, I've christened it 'Crufty', as it's made from 'maker cruft', and I intend to write up/open source large chunks of it on GitHub.

It was also an opportunity for a presentation at my Raspberry Pi meetup despite the total lack of Raspberry Pis in the build.



Making ESPUI a captive portal on ESP8266/ESP32

There's a really nice UI library for making basic interactive web applications using the Arduino IDE on ESP microcontrollers, ESPUI. This is a real shortcut to usability for very basic "push a button on a web page and something happens" projects, as it works asynchronously and does all the behind-the-scenes dirty work for you.

It does have one dull limitation though: it doesn't work as a captive portal by default. You can bodge a fix with the following steps.

First, find this line in ESPUI.cpp:

server->onNotFound([](AsyncWebServerRequest* request) { request->send(404); });

Then comment it out and add this code:

server->onNotFound([](AsyncWebServerRequest* request) { request->redirect("/"); });

This means any time a request for an unexpected URL hits the Web Server on the ESP it will redirect to the root of the server. You can change this to some other URL by changing the "/".

Now this still won't get all requests to hit the ESP; you need to add the following bits to the rest of the sketch. This assumes your ESP is acting as an AP on the usual IP address 192.168.4.1, but you can see how it's easily changed.

<near the top of the sketch>
#include <DNSServer.h>
IPAddress apIP(192, 168, 4, 1); // the ESP's AP address
DNSServer dnsServer;

void setup()
{
    <rest of your setup stuff>
    dnsServer.setErrorReplyCode(DNSReplyCode::NoError); // answer every query without an error
    dnsServer.start(53, "*", apIP); // resolve all domain names to the AP's address
}

void loop()
{
    <rest of your loop stuff>
    dnsServer.processNextRequest(); // service pending DNS queries
}

I really should submit a more complete solution as a pull request and new example to the library.

Now, when most devices connect to the ESP they'll take you straight to the page with a "sign in" prompt and so on, and it doesn't stop you browsing to the page manually.

Quick cyberdeck build: Part 9

This is an after action report on the Cyberdeck as the LARP I put it together for was this week. It worked excellently with a couple of minor irritations.

In advance I had worried about the battery life being poor, as I'd never really done a run-down test and it was fitted with a cheap eBay aftermarket battery. Also, hanging the keypad, trackball and illuminated buttons off USB OTG used a little power all the time, despite my using a Power Profiler to reduce that as much as I could.

In the end it was just fine to be used heavily all day. There was a lot of in-game chat.

I had wondered about usability as modern smartphones and their apps are designed around a touch interface with really great predictive typing. 

Going back to a trackball, mouse buttons and keyboard is very much a retrograde step. Discord (which the LARP used for in game communications) did suck a bit with the non-touch interface but was usable and the case fitted nicely in both hands for two thumb typing. The screen could still be touched but this is hard near the edges due to the lip on the case.

The biggest aggro was the paint. I hate painting 3D printed objects, and despite trying not to spray the paint on too thick and giving it plenty of time to dry, the red parts were still soft as I was travelling to the game.

At various points the case, especially the back, kept sticking to other things like the tool belt I carried the Cyberdeck in. You can see this quite clearly in the second photo, where it's messed up the finish. If I ever open this up again to add more features, like a LoRa radio, I'll end up reprinting the case and transferring the internals to a new one, which will mean ungluing some of the wiring.