After watching a recent YouTube video about MicroMouse, I keep thinking about building my own and using it as a way to explore and learn about different kinds of code than what I typically write for my job.
Is anyone else interested in this?
Update: People are interested, and we’re making a little community here
This thread ended up mostly focused on controlling the Pimoroni Trilobot, but feel free to tag future threads/questions with toy_robot and I’ll try to link to resources from this first post:
Those were the best 25 minutes I’ve spent in a while. I just got started with embedded development, working through “Build a Binary Clock with Elixir and Nerves” two days ago. I’d love to participate in this, though I’ll have a lot of learning to do.
I would also be interested! How do you think we should collaborate on this?
I don’t imagine we’ll have enough people all in one place to do face-to-face meetups, but I was thinking we could start finding kits or parts to build our own “mice” (something that can fit a Raspberry Pi Zero 2?) and share our progress as we go?
Maybe some folks from the Nerves Meetup would be interested as well? @amclain
I started working on a CAD model in Onshape based on some parts I can order from The Pi Hut
Could be fun - this kit might be useful:
Using ESP32 hardware could also be possible with GitHub - atomvm/AtomVM: Tiny Erlang VM
I haven’t looked at the AtomVM project recently. It’s really cool that they’ve added support for the RPi Pico. Maybe we should give it a try!
This sounds good! Timewise I wouldn’t be able to make face-to-face meetups, but sharing progress as we go is what I am looking for.
Same for me. I’m living in London and can’t attend the evening meetups in the US. So posting progress online will work well for me. Let’s just start here in this thread for anyone interested and see if it gains momentum.
I think I’ll start with that Trilobot Base Kit posted above, but use a Pi Zero 2 (I have one on hand).
It looks like this sensor might be an interesting swap for the ultrasonic sensor that comes with the Trilobot: SparkFun Qwiic dToF Imager - TMF8821
There is already an Elixir library written to interface with this sensor, and it gives back measurements from 1 cm to 5 m in a 4x4 grid: GitHub - pkinney/tmf882x
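For anyone wiring the TMF8821 up over Qwiic/I2C under Nerves, a quick sanity check before reaching for the tmf882x library is to confirm the sensor actually shows up on the bus. Here’s a minimal sketch using Circuits.I2C; the bus name `"i2c-1"` and the `0x41` address are assumptions on my part — check the datasheet and your own `detect_devices` output.

```elixir
# Sanity-check that the time-of-flight sensor is visible on the I2C bus.
# Assumes Circuits.I2C is in your deps; the bus name and 0x41 address
# are assumptions — verify against the TMF8821 datasheet.
{:ok, bus} = Circuits.I2C.open("i2c-1")

sensor_address = 0x41

if sensor_address in Circuits.I2C.detect_devices(bus) do
  IO.puts("TMF8821 found at 0x#{Integer.to_string(sensor_address, 16)}")
else
  IO.puts("Sensor not detected - check wiring and address")
end
```

Running this from an IEx session on the Pi is a nice low-stakes first test of the wiring before any real driver code.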
If you can send me the elevator pitch and where people should go to join the project, I’m happy to share it. If you’re able to record a short video, that would be more impactful than text.
I started a github repo where I’ll keep my code for the project GitHub - mmmries/trilobot: An Elixir Nerves project for controlling a Pimoroni Trilobot
I went ahead and ordered the Trilobot kit and the TMF8821 board, which can do time-of-flight distance measurements in an 8x8 grid across ~45°. I’m hoping that will give me a lot more data about what’s in front of the bot.
My plan is to start by getting the Trilobot assembled, getting Nerves running on the Raspberry Pi, and reverse-engineering the Python library well enough to control the motors and built-in distance sensor.
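In case it helps anyone doing the same reverse-engineering: the Python library drives each DC motor through a pair of H-bridge pins. A rough Elixir sketch of that idea with Circuits.GPIO might look like the below — note the pin numbers are placeholders, not the Trilobot’s real pins (those live in the Python source), and real speed control would need PWM rather than plain on/off writes.

```elixir
# Crude forward/stop control for one DC motor behind an H-bridge,
# using Circuits.GPIO. Pin numbers below are placeholders only -
# read the real ones out of the Trilobot Python library.
defmodule Trilobot.Motor do
  alias Circuits.GPIO

  def open(forward_pin, backward_pin) do
    {:ok, fwd} = GPIO.open(forward_pin, :output)
    {:ok, bwd} = GPIO.open(backward_pin, :output)
    {fwd, bwd}
  end

  def forward({fwd, bwd}) do
    GPIO.write(bwd, 0)
    GPIO.write(fwd, 1)
  end

  def stop({fwd, bwd}) do
    GPIO.write(fwd, 0)
    GPIO.write(bwd, 0)
  end
end

# Example (placeholder pins):
# motor = Trilobot.Motor.open(8, 11)
# Trilobot.Motor.forward(motor)
# Trilobot.Motor.stop(motor)
```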
I also started a CAD model of the Trilobot. I don’t know if this will actually be useful to me or anyone else, but I’m trying to grow my CAD skills.
I’ve also ordered the Trilobot and the board you mentioned yesterday. Hope the delivery won’t take too long!
Mine arrived quite quickly! I’m in the UK so it was a short shipping distance, but hopefully yours arrives soon too
Looks cool! And it looks quite big … at least bigger than in the images.
Got my shipment notification as well
Using that sonic sensor is not that interesting; you can do that with an MCU. Maybe upgrade to a webcam and do some image processing?
I’m planning to start with the ultrasonic sensor just to get some basic things working, and then add on or switch to the TMF8821, which does time-of-flight distance measurements in an 8x8 grid.
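On the ultrasonic starting point: an HC-SR04-style sensor reports distance as an echo pulse width, so the only math is converting the round-trip time using the speed of sound (~343 m/s at room temperature). A tiny helper for that conversion, assuming you’ve already timed the pulse:

```elixir
defmodule Trilobot.Ultrasonic do
  # Speed of sound at ~20 degrees C, in centimetres per microsecond.
  @speed_of_sound_cm_per_us 0.0343

  @doc """
  Convert an echo pulse width (in microseconds) to a distance in cm.
  The pulse covers the round trip out and back, so we divide by two.
  """
  def distance_cm(pulse_width_us) do
    pulse_width_us * @speed_of_sound_cm_per_us / 2
  end
end

# Trilobot.Ultrasonic.distance_cm(580) is roughly 10 cm
```

The fiddly part is the timing itself (raising the trigger pin and measuring the echo pin’s high time), which is jitter-sensitive from user space, so treat readings as noisy.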
I would love to get some kind of SLAM working at some point, but I’m new to this area, so I’m trying to take baby steps as I go.
It would be interesting to do the BEAM version of micromouse which would be something like swarm racing.
Five networked cars that have to solve the maze on the first run. Something like that.
I love that idea. Kind of like parallel SLAM optimization where the sensors and processing of the map are done in parallel?