Hackspace access control

I'm a trustee at East Essex Hackspace, which is currently discussing buying a fibre laser, and that has rekindled the desire for 'toolbots' to control access to things. I originally looked at this over a year ago but got busy with other things, shelved it, and then the need suddenly seemed less urgent.

One of our challenges is that when we opened we used some Wiegand RFID readers for our door entry system and fed the UIDs they reported into our membership database. Later on we found out that these UIDs weren't the same as the UIDs other readers reported. After some fiddling around I worked out a simple XOR and byte reshuffle that fixed this, so we were back in business, until I built something using this mapping and found it wasn't right for some of the cards we'd issued.

Then the access control project got shelved.

Having dug out my access control mockup again, I armed myself with a pile of cards that I know don't map IDs the way I thought and quickly realised that if a card has a 7-byte UID the byte-order reshuffle is different from the one for a 4-byte UID. So we're potentially back in business.
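
To make that concrete, the fix amounts to something like the sketch below, though the XOR value and byte orders here are placeholders for illustration rather than the ones our readers actually need:

    // Illustrative only: the real XOR constant and byte orders are specific to our readers.
    const uint8_t XOR_MASK = 0x5A;  // placeholder value

    // Map a UID as reported by the Wiegand reader back to the card's real UID.
    void unscrambleUid(const uint8_t *reported, uint8_t length, uint8_t *real) {
      // Hypothetical reshuffles: 4-byte and 7-byte cards need different byte orders.
      const uint8_t order4[4] = {3, 2, 1, 0};
      const uint8_t order7[7] = {6, 5, 4, 3, 2, 1, 0};
      const uint8_t *order = (length == 7) ? order7 : order4;
      for (uint8_t i = 0; i < length; i++) {
        real[i] = reported[order[i]] ^ XOR_MASK;
      }
    }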

For the hardware I've taken a bit of an executive decision on the basis that whoever does the work gets to say how it's done, otherwise endless bikeshedding occurs. The hardware will be based on an Olimex ESP32-EVB which is an ESP32 dev board with Ethernet, onboard relays and a bunch of hardware we probably don't need. They're reliably available and properly certified.

To add the RFID reader I've designed a little 'shield' that plugs into the UEXT connector on the EVB, which I'll solder some cheap MFRC522 boards to.

It also breaks out some GPIO and has LEDs and somewhere to connect a sounder. I'm not even sure if we'll use a sounder, but I just wanted to get the board ordered.
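
For reading cards the well-known Arduino MFRC522 library should do the job over SPI; a minimal sketch along these lines (the pin numbers are placeholders for whatever the shield ends up routing through the UEXT connector):

    #include <SPI.h>
    #include <MFRC522.h>

    // Placeholder pins: use whatever the UEXT shield routes chip select and reset to.
    const uint8_t PIN_SS = 17;
    const uint8_t PIN_RST = 16;

    MFRC522 reader(PIN_SS, PIN_RST);

    void setup() {
      Serial.begin(115200);
      SPI.begin();
      reader.PCD_Init();
    }

    void loop() {
      if (!reader.PICC_IsNewCardPresent() || !reader.PICC_ReadCardSerial()) {
        return;
      }
      // Print the UID (4 or 7 bytes depending on the card) for checking against the database.
      for (byte i = 0; i < reader.uid.size; i++) {
        Serial.printf("%02X", reader.uid.uidByte[i]);
      }
      Serial.println();
      reader.PICC_HaltA();
    }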

Work on the software has been ongoing for ages, but I've started poking at it again, and one of the other trustees has already done a bunch of work in our membership database, so I'm hoping this project can get done without too much pain.

Each tool will have a different way of limiting its use, but my expectation is that most will already have an interlock or emergency stop that can be connected to one of the onboard relays on the EVB. I wouldn't expect these devices to control the main power feed to the tool unless it's a very small one.
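
The relay side of this is about as simple as it gets. A sketch, assuming the EVB's first relay is on GPIO32 (worth checking against the Olimex schematic for your board revision) and that some other code has already decided whether the presented card is authorised:

    // Assumed relay pin for the Olimex ESP32-EVB; verify against the schematic.
    const uint8_t PIN_RELAY1 = 32;

    void setup() {
      pinMode(PIN_RELAY1, OUTPUT);
      digitalWrite(PIN_RELAY1, LOW);  // interlock open, tool disabled
    }

    // Close the interlock/e-stop loop via the relay only while an authorised card is present.
    void enableTool(bool authorised) {
      digitalWrite(PIN_RELAY1, authorised ? HIGH : LOW);
    }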

Hoverboard comms library

I'm back making silly moving things using hoverboard hub motors and re-flashed hoverboard controllers.

For months on end I've been vaguely putting off working on the big 6-wheel platform I started, because I needed to write a library for the comms protocol so I wasn't committing the sin of copy & pasting great swathes of code to run the three separate controllers.
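
For anyone curious what the protocol amounts to, here's a sketch of the sort of frame the library wraps up, assuming the serial protocol used by the widely shared field-oriented-control reflash firmware (0xABCD start word, steer and speed, XOR checksum); the firmware on your controllers may well differ:

    #include <Arduino.h>

    // Command frame for the common reflashed hoverboard firmware's UART protocol.
    // This layout is an assumption; check the firmware you actually flashed.
    struct __attribute__((packed)) HoverboardCommand {
      uint16_t start;     // always 0xABCD
      int16_t  steer;
      int16_t  speed;
      uint16_t checksum;  // start ^ steer ^ speed
    };

    void sendCommand(HardwareSerial &port, int16_t steer, int16_t speed) {
      HoverboardCommand cmd;
      cmd.start = 0xABCD;
      cmd.steer = steer;
      cmd.speed = speed;
      cmd.checksum = cmd.start ^ (uint16_t)cmd.steer ^ (uint16_t)cmd.speed;
      port.write(reinterpret_cast<const uint8_t *>(&cmd), sizeof(cmd));
    }

With something like that wrapped in a library, driving three controllers becomes three serial ports and three calls rather than three copies of the same code.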

Now I've finally sat down and got round to it so the excuses for not progressing the big rover project are evaporating. I only started thinking about this two years ago.

MU|TH|UR setup script

I have now (mostly) documented/automated how to create a text-to-speech bot as recently used at our High Frontier LARP.

Instructions are here.

retroTerm release 0.1.6

At our recent LARP, High Frontier: The Drake Objective, I used retroTerm, my ANSI/VT terminal GUI widget library, to create a prop where the players interacted with an offscreen extraterrestrial intelligence through a purposefully retro computer interface.

This is the whole reason the library got written in the first place.

Work started on retroTerm in maybe 2019. Then the game it was scheduled to be used in got postponed due to the pandemic and despite the extra time that afforded me I didn't manage to deliver anything usable for the game when it finally happened.

It was a terrible case of overpromise and underdeliver, i.e. I delivered nothing usable. The work on the retroTerm library itself was fine, but the messaging system it was to provide the user interface for just wasn't stable and we abandoned it at the last minute.

When a short 'midqual' LARP event came up last year I rewrote all the messaging code and produced a version of the thing with a cut-down retroTerm user interface, which saw a tiny amount of use. There were still some frustrating problems though, centred on it needing an accurate timestamp for all the messages and me getting caught out by GPS just not working for this. I hadn't realised that the building it would be used in had a metal roof, and the dubious signal left it with spurious date & time values, some in 2080. That is perhaps a failing of the GPS library I used to process the NMEA sentences, but regardless it was not good and caused the prop to misbehave.

As a result, when The Drake Objective was announced I made some new hardware that included a hardware real-time clock and did a ton of work on making damn sure the setup would have the correct time. It couldn't just connect to WiFi and get it from the Internet, as it would be used in a building in a field with no services and a very dubious 4G signal we weren't sure would be usable.
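
The approach boils down to sanity-checking the GPS time before trusting it and only then setting the RTC. A sketch of that idea, assuming TinyGPSPlus for the NMEA parsing and a DS3231 driven by RTClib (my actual parts and libraries may differ):

    #include <TinyGPSPlus.h>
    #include <RTClib.h>

    TinyGPSPlus gps;
    RTC_DS3231 rtc;  // assumes rtc.begin() has already been called in setup()

    // Only set the RTC from GPS if the fix reports a plausible date; never trust it blindly.
    void maybeSetClockFromGps(Stream &gpsPort) {
      while (gpsPort.available()) {
        gps.encode(gpsPort.read());
      }
      if (!gps.date.isValid() || !gps.time.isValid()) {
        return;
      }
      uint16_t year = gps.date.year();
      if (year < 2023 || year > 2050) {
        return;  // reject nonsense like 2080 caused by a poor signal
      }
      rtc.adjust(DateTime(year, gps.date.month(), gps.date.day(),
                          gps.time.hour(), gps.time.minute(), gps.time.second()));
    }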

The new hardware worked around the issues well and the whole prop behaved pretty well during a weekend-long game, albeit in its still quite cut-down form.

Part of putting this thing together saw me make some improvements to retroTerm.

I found and squashed a memory leak[1] and have just improved the way it handles 'list box' widgets so I feel it's in a pretty solid spot. Today I fiddled around testing it on an ESP32 over Classic Bluetooth Serial rather than a wired connection and this worked trivially easily.

This is something I can imagine use cases for so I'm glad I finally tried it out. Often ESP32 projects get built with little captive web portals to configure/control them but this gets 'big' quickly. Being able to run a wireless retroTerm interface for this is something I'm going to try in an upcoming project.
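
Getting the wireless serial link up is straightforward; the sketch below is just the Classic Bluetooth side, with the retroTerm attachment left as a comment since the point is that BluetoothSerial behaves like any other Stream:

    #include <BluetoothSerial.h>

    BluetoothSerial SerialBT;

    void setup() {
      // Advertise as a Classic Bluetooth serial port; pair and connect with any terminal app.
      SerialBT.begin("retroTerm-test");
      // Hand SerialBT to retroTerm wherever you would normally pass Serial.
    }

    void loop() {
      // Widget handling goes here, exactly as it would over a wired connection.
    }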

Anyway this was all a big ramble to say retroTerm 0.1.6 is out and seems to be working well.

Next on my list is making that multi-user peer-to-peer messaging system I originally promised back in 2019 happen. Even if we never actually use it I want to have done it, and to that end I've already been writing great swathes of code in preparation, which I've been sticking in some new libraries. When they're more finished I'll share them.

[1] Which only showed up if constantly updating something like a clock widget


You now have fifteen minutes to reach minimum safe distance

We have just run a sci-fi LARP where we wanted to have the classic trope of a computer with a sad voice telling the players bad news like MU|TH|UR in Aliens, GERTY from Moon, HAL in 2001 etc. etc. It's a very common part of the aesthetic of a certain kind of sci-fi.

Our game was set in the Aliens universe so we wanted to emulate the female voice from MU|TH|UR. None of the GMs are voice actors (or female) so I made a text-to-speech device for this.

I got quite far down the rabbit hole of power saving and solar charging on a Raspberry Pi 3A+ then somebody offered to bring a massive lead-acid battery to the game for me so I abandoned worrying about all that because brute force won out. I will come back to it though as I want to make a nice portable outdoor Pi to provide 'infrastructure' at future events. Having infrastructure I can deploy easily in a field with no power has been a constant thing I've been nibbling at the edges of for years mostly with ESP32s and mesh networking.

I put the Raspberry Pi 3A+ into a waterproof box I had and powered it off that big battery all weekend. I even forgot to shut it down on Saturday night and it just kept going.

The Raspberry Pi communicated with the Internet over 4G using a dongle I had kicking around, which conveniently has a tiny NAT router in it and presents as a USB Ethernet adapter, so it 'just works' with zero configuration on the Pi, or pretty much anything else. I'll probably try to find another dongle the same; it's handy for these reasons.

I then made a 'bot' in the popular messaging app Telegram using their example Python script and used the Microsoft Azure AI voice engine to generate a very convincing voice. This was not a very sophisticated script: really just Telegram's example bot script, their recipe for restricting access to certain Telegram IDs, and Microsoft's example text-to-speech script all smushed together with my rudimentary understanding of Python, as I'm normally an Arduino/ESP32 C++ person.

None of this needs much processing power: an original Pi could probably do it, except that the Microsoft library expects a 64-bit OS, so the Raspberry Pi 3A+ was a good fit. It's the lowliest, most power-efficient 64-bit Pi that also has an analogue audio out, and I had one in my stash of dormant SBCs.

The end result was that we could type a message on our phones into a Telegram group chat and it would then ring out across the game, using walkie-talkies to broadcast it over a large area. There was a little bit of impedance/level matching needed for the line out from the Pi to go into the headset input of the walkie-talkie. I used voice activation in lieu of push-to-talk, but that too worked great once I added a preamble 'bong' and a little gap before the talking. I may go back and investigate triggering the push-to-talk through the headset connector using one of the Pi GPIOs, as it's more predictable and the headset has the feature: I was just bashing the thing together quickly for the game.

"You now have fifteen minutes to reach minimum safe distance"

If players wanted to talk to MU|TH|UR they would just talk on the same walkie-talkie channel and we were listening. We had some quite long player conversations back and forth with MU|TH|UR like this and it worked brilliantly. We'll be using this setup again.

m2mDirect v0.1.1 release

A long time ago I created m2mMesh, an Arduino library for a self-organising mesh network on ESP8266/ESP32 that lets you do machine-to-machine messaging.

This has been quite useful, but it's sometimes overly heavyweight, so I've written a simpler thing for direct links between two devices. Also, with just two devices you can use the built-in encryption features of ESP-NOW. Lack of encryption was always a concern with m2mMesh.
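
For reference, this is roughly what the underlying encrypted pairing looks like on an ESP32 using the Espressif ESP-NOW API directly (not m2mDirect's own interface, and the MAC address and keys below are placeholders):

    #include <WiFi.h>
    #include <esp_now.h>

    // Placeholder peer MAC and keys: the PMK is global, the LMK is per-peer.
    uint8_t peerMac[6] = {0x24, 0x6F, 0x28, 0x00, 0x00, 0x01};
    uint8_t pmk[16]    = {0};
    uint8_t lmk[16]    = {0};

    void setup() {
      WiFi.mode(WIFI_STA);
      esp_now_init();
      esp_now_set_pmk(pmk);

      esp_now_peer_info_t peer = {};
      memcpy(peer.peer_addr, peerMac, 6);
      memcpy(peer.lmk, lmk, 16);
      peer.channel = 0;     // follow the current WiFi channel
      peer.encrypt = true;  // only possible once the peer is explicitly added like this
      esp_now_add_peer(&peer);

      const char msg[] = "hello";
      esp_now_send(peerMac, reinterpret_cast<const uint8_t *>(msg), sizeof(msg));
    }

    void loop() {}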

This new library is m2mDirect, which I intend to use for control/telemetry of the various rovers I've been working on.

It's not in the Arduino Library Manager yet as it needs more work, in particular ESP8266 support and channel negotiation, but it is now available on GitHub.

Obscure Arduino tips #5

Just today I spotted an excellent 'ease of use' feature in the Arduino IDE that SparkFun had used in one of their examples. Maybe consider adding it to any code you are going to share.

There is a pseudo-URL you can add as a comment in code that will link to the Arduino Library Manager and search for a library.

For example

#include "SparkFun_TMF882X_Library.h" //http://librarymanager/All#SparkFun_Qwiic_TMF882X

This way if somebody has your code they can click on the link and be taken straight to an install option within the Arduino IDE.

This is a really nice little usability feature I had no idea existed.

I'll gloss over the fact that SparkFun originally typo-ed the link in the sketch in question. :-)

Trackable Lasertag sensor

As a partner piece to the PDT Tracker I needed to make some 'wearables' to go with it.

My original plan had been to take some of the PCBs made for the PDT Tracker and jury-rig them into that wearable.

I did this in May, but in the end the software wasn't ready for the event I needed them at, and even had it been, the game overran and we didn't get to the point where they were necessary.

So I've had a couple of months to come up with a wearable beacon, and I decided to go for an MVP of my Lasertag 'holy grail' idea: a Lasertag sensor that is remotely trackable and sends status updates.

I consider it a 'minimum viable product' because not only is the software a first attempt, it relies on external modules for some of its features.

All the PCB has on it is an ESP32-C3 WROOM module, RFM95W LoRa module, LDO voltage regulator, a few passive components and solder headers for various things.

The GPS will always be a bought-in module, but relying on an external USB breakout, LiPo charger and sounder makes it bulkier than it might otherwise have been. I also put all the components on one side for easy soldering, which gives it a fairly large 'footprint'.
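
Functionally the firmware doesn't need to do much more than read the GPS and periodically push a small status packet out over LoRa. A sketch of that loop, assuming the sandeepmistry LoRa library for the RFM95W and TinyGPSPlus for the GPS (the frequency, pins and packet layout here are placeholders, not the final format):

    #include <LoRa.h>
    #include <TinyGPSPlus.h>

    // Placeholder packet layout; the real format is still evolving.
    struct __attribute__((packed)) SensorStatus {
      uint16_t playerId;
      uint8_t  hitPoints;
      float    latitude;
      float    longitude;
    };

    TinyGPSPlus gps;

    void setup() {
      Serial1.begin(9600);     // GPS module on a spare UART
      LoRa.setPins(10, 9, 2);  // placeholder CS/reset/DIO0 pins
      LoRa.begin(868E6);
    }

    void loop() {
      while (Serial1.available()) {
        gps.encode(Serial1.read());
      }
      static uint32_t lastSend = 0;
      if (gps.location.isValid() && millis() - lastSend > 5000) {
        lastSend = millis();
        SensorStatus status = {1, 100, (float)gps.location.lat(), (float)gps.location.lng()};
        LoRa.beginPacket();
        LoRa.write(reinterpret_cast<const uint8_t *>(&status), sizeof(status));
        LoRa.endPacket();
      }
    }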

Initial desk-based testing with the board you can see above showed everything worked, so I designed a quite utilitarian 3D-printed enclosure and assembled four headband-style sensors to test.

Convention in UKLTA is that sensors are worn on the head, and this prototype ended up slightly bulkier than the commonly used sensors, but with a LiPo inside rather than 3x AAA batteries it was no heavier.

Getting everything inside the case was fairly easy, but construction of the headband, which includes four IR sensors and four 3mm LEDs, was quite tiresome. I'd opted to make little 3D-printed enclosures for these and that really exacerbated the struggle of getting the wiring done tidily.

In the end this all worked, so I'm going to improve on the software between now and our next event in May 2024. I might do a second revision of the PCB with more functionality on board, but without a clear requirement for the system yet, the four prototypes I have work and can be used to further test the concept.

One of the things that didn't get tested is haptic feedback, mostly because I was rushing and didn't have a nice compact vibration motor to use. It's planned though, as one of our members is hard of hearing, and I really like the haptic feedback in the Laserwar equipment I use at another LARP.

So far this post has been all about the process and the componentry but here's the concept I'm trying to realise.

  • Every participant in a LARP is location tracked: players, crew and 'monsters'
  • The 'health' of every participant in the LARP is tracked

Obviously this only makes sense in an outdoor LARP with an element of tactical combat, but that's what I play.

The purpose of this is essentially to allow us to replicate the sort of thing you see in action sci-fi media, where there is one or both of the following two things.

  • A 'tracker' that shows the relative location of friends/foes to players, e.g. the Alien/Aliens motion tracker (the range/bearing sketch after this list shows the maths)
  • A 'squad status' system for squad leaders, e.g. the 'command desk' in the APC in Aliens that shows the 'health' of the squad.
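
For the tracker, the display maths is just the range and bearing between your own fix and another participant's. A quick sketch using an equirectangular approximation, which is plenty accurate at game-site scales:

    #include <math.h>

    // Approximate range (metres) and bearing (degrees from north) from fix 1 to fix 2.
    void rangeAndBearing(double lat1, double lon1, double lat2, double lon2,
                         double &rangeM, double &bearingDeg) {
      const double R = 6371000.0;  // Earth radius in metres
      double meanLat = radians((lat1 + lat2) / 2.0);
      double x = radians(lon2 - lon1) * cos(meanLat);  // east offset
      double y = radians(lat2 - lat1);                 // north offset
      rangeM = sqrt(x * x + y * y) * R;
      bearingDeg = fmod(degrees(atan2(x, y)) + 360.0, 360.0);
    }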

This is all very mil-sim industrial sci-fi stuff, but that's what I want from my games. In principle it should also be possible to link the sensor to my prototype Lasertag weapon board using Bluetooth and show when a participant is firing, out of ammo, etc.

I would also love to partner it with my HelmetCam prototype for the full Colonial Marines experience but the challenges of WiFi outdoors may make that impractical.

These ideas are taking a long time to come to fruition but I am slowly edging towards them. I bought the GPS modules used in this back in 2020.