I went to a store the other week and found a cool skeleton in their Halloween section. I immediately thought: that’s got enough room in it for some hardware. Images of a spooky Raspberry Pi/Arduino-powered, Revenant-inspired cyber-skeleton came to my mind through a UAC-like portal.
I thought it would be awesome to have a skeleton that could move its mouth, light up an LED eyeball and even take some pictures (with some ML on top) – a cool, geeky, DOOM-inspired Halloween prop.
Revenants in DOOM are basically cybernetically enhanced demon skeletons – you can find out more about these awesome things from here.
So depending on your preference, put on some music from Bobby Prince, Aubrey Hodges, Mick Gordon or Andrew Hulshult and David Levy and let’s dive into The Revenant!
Make a Nerf Blaster that can only be fired by users with an authorised fingerprint
Be able to get sensor data on the user and environment
Be able to remotely administer the system
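The fingerprint gate in the first goal can be sketched in a few lines. This is a hypothetical Python outline, not the actual build’s code – `AUTHORISED_IDS` and the firing logic are placeholders for whatever fingerprint sensor module and motor driver the blaster actually uses:

```python
# Hypothetical sketch: gate the blaster's firing circuit on a fingerprint match.
# AUTHORISED_IDS stands in for the template slots enrolled on the real sensor.

AUTHORISED_IDS = {1, 4, 7}  # placeholder enrolled template IDs

def can_fire(fingerprint_id):
    """Return True only if the scanned template is on the authorised list."""
    return fingerprint_id in AUTHORISED_IDS

def trigger_pressed(fingerprint_id):
    """Decide what happens when the trigger is pulled."""
    if can_fire(fingerprint_id):
        return "FIRE"    # e.g. drive the flywheel motor via a GPIO pin
    return "LOCKED"      # unauthorised user: keep the circuit open
```

The key design point is that the authorisation check sits between the trigger and the motor, so an unregistered user pulling the trigger does nothing.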
I’ve had the idea for some time now to make a Metal Gear Solid 4 inspired project based on the SOP System from the game. So when Digitspace asked me if I had any project ideas, I told them about it and they were happy to sponsor the project and give me the parts – thanks to them for enabling me to make this project!
So, in a nutshell, the SOP System in the game controls all military hardware and ensures that users can’t operate equipment unless they are registered on the system.
For further explanation, check out the in-game cut-scene with extra details:
So with the project plan and parts from Digitspace I constructed the hardware around a Nerf Blaster and got to work!
I’ve always been fascinated by the idea of moving one’s mind into a robot to live forever as a machine, so I thought: what better time to plant the seeds of this than with a nice bit of Raspberry Pi work using TensorFlow and a Muse brainwave-reading headset – starting off simple and working my way up:
Read brainwave data.
Use TensorFlow to train a model on brainwave data to determine a relaxed or a non-relaxed mindset.
Use information gathered from this as a jumping off point to see what else can be read from the mind.
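The steps above can be sketched roughly like this. The real project trains a TensorFlow model on the Muse data; as a stand-in, this minimal sketch assumes the headset stream has already been reduced to per-band power values and uses a simple alpha/beta ratio threshold (relaxed minds tend to show relatively more alpha activity) in place of the trained model:

```python
# Minimal stand-in for the relaxed/non-relaxed classifier.
# Assumes each sample is a pair of (alpha_power, beta_power) band values
# already extracted from the Muse EEG stream.

def alpha_beta_ratio(alpha_power, beta_power):
    """Relative alpha activity: higher generally means a calmer state."""
    return alpha_power / beta_power

def classify(sample, threshold=1.0):
    """Label one (alpha, beta) band-power sample.

    In the actual project this decision is made by a TensorFlow model
    trained on labelled recordings; a fixed threshold stands in here.
    """
    return "relaxed" if alpha_beta_ratio(*sample) > threshold else "non-relaxed"
```

Swapping the threshold for a trained model only changes the body of `classify` – the read-features-then-label pipeline stays the same.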
A couple of years ago I made an Arduino lightsaber that could have custom colours applied to it with a colour sensor. It was a messy build and the colours weren’t great, and since then I’ve been thinking of ways I could improve it; with the release of Episode IX just recently, I thought it was the perfect time to make another one!
This is essentially a miniaturised version of the type of heat sink linked above; with the significantly more powerful Raspberry Pi 4 the heat output is increased quite a bit, so it is now necessary to use such a device to really keep the temps down.
So let’s take a look at this part in terms of assembly, aesthetics and most importantly – temperature.
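For the temperature comparison, the Pi’s firmware can report the SoC temperature via `vcgencmd measure_temp`. A small Python sketch for logging readings before and after fitting the heat sink might look like this:

```python
import subprocess

def parse_temp(raw):
    """vcgencmd returns a line like "temp=48.3'C"; pull out the float."""
    return float(raw.strip().split("=")[1].rstrip("'C"))

def read_soc_temp():
    """Query the Pi's firmware for the current SoC temperature in Celsius."""
    out = subprocess.check_output(["vcgencmd", "measure_temp"], text=True)
    return parse_temp(out)
```

Calling `read_soc_temp()` periodically under load (e.g. while running `stress`) gives a simple before/after picture of how well the case keeps the temps down.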
If you haven’t seen Avengers Endgame or Spider-Man Far From Home – You don’t want any part of this!
Final warning – possible spoilers ahoy!
Uneasy lies the head that wears the crown.
So in Far From Home, Peter gets the torch passed on to him from Stark via some super AR glasses that allow him to see real-time data and call in all sorts of super Stark Industries gear.
As soon as I saw these in the film I knew I needed to do a project on them so as soon as possible I got my hands on the glasses from eBay and got to work on plans and parts.
It’s storing energy from the earth’s core!
Make aesthetically accurate EDITH Glasses from Far From Home/IW.
Make it display to the user’s eye and have speech recognition capability.
I thought I should give this its own specific post – this is the chatbot that I’ve used in my Raspbinator and Nvidinator projects. The GitHub repo linked below will be updated over time as I make improvements to it.
I’ve decided on the name Chatbot 8 – before I used GitHub I kept it on my Google Drive and increased the number with each iteration; the first one I was happy to use in the Raspbinator was iteration 8, and the name has kind of stuck.
To make a bot that can respond to human input, learn and return more organic responses over time.
To be able to be trained from large text files such as scripts for movies and transcripts of conversations.
Have it able to be integrated easily into other projects.
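The goals above can be illustrated with a toy retrieval bot. To be clear, this is a stand-in sketch, not the actual Chatbot 8 matching logic: it learns input→response pairs from a transcript (each line treated as the reply to the previous one) and answers new input with the reply to the closest remembered message:

```python
import difflib

class TinyChatbot:
    """Toy retrieval bot: remembers which replies followed which messages,
    then answers new input with the reply to the closest remembered message.
    (A hypothetical stand-in, not Chatbot 8's actual algorithm.)"""

    def __init__(self):
        self.memory = {}  # message -> the reply that followed it

    def train(self, lines):
        """Learn from a transcript: each line replies to the one before it."""
        for prompt, reply in zip(lines, lines[1:]):
            self.memory[prompt.lower()] = reply

    def respond(self, message):
        """Answer with the stored reply to the closest-matching known message."""
        matches = difflib.get_close_matches(message.lower(), self.memory,
                                            n=1, cutoff=0.0)
        return self.memory[matches[0]] if matches else "..."
```

Because training just consumes a list of lines, feeding it a movie script or a conversation transcript works the same way – which is the point of the second goal. Dropping it into another project is one import and two method calls, which covers the third.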
In January 2018 I finished a project I had been working on for quite some time – The Raspbinator. I was very happy with it and it got some good attention, but there were some bugs and limitations, and I already had ideas for the next one.
Early in 2019 the Nvidia Jetson Nano was released, with great capability for running machine learning workloads; I thought this would be perfect for the next version of my project and a great opportunity to get into ML/neural nets.