The National Sealife Centre wanted to try something a bit different and meet a tight budget: touch screen interpretation that was cheap, very quick to update, easy to replace and looked distinctive.
I came up with some laser-cut sea life and a budget touchscreen; each laser-cut model would be covered in a vinyl print for extra effect.
The touch screens are managed by a central Linux-based system that can be administered online quickly and easily with little IT knowledge, making a somewhat budget yet effective solution.
I decided to have a go at doing something fun with my Kickstarter Rapiro this evening, taking inspiration from Tony DiCola on the Adafruit blog (http://learn.adafruit.com/raspberry-pi-face-recognition-treasure-box?view=all).
So here are the steps I took and the script I put together for my Rapiro facial recognition 3am hack.
1. Install OpenCV (download opencv-2.4.7.tar.gz from opencv.org first)
sudo apt-get update
sudo apt-get install build-essential cmake pkg-config python-dev libgtk2.0-dev libgtk2.0 zlib1g-dev libpng-dev libjpeg-dev libtiff-dev libjasper-dev libavcodec-dev swig
tar zxvf opencv-2.4.7.tar.gz
cd opencv-2.4.7
cmake -DCMAKE_BUILD_TYPE=RELEASE -DCMAKE_INSTALL_PREFIX=/usr/local -DBUILD_PERF_TESTS=OFF -DBUILD_opencv_gpu=OFF -DBUILD_opencv_ocl=OFF .
make
note: the make command takes about 5 hours!
sudo make install
2. Install Python Dependencies
sudo apt-get install python-pip
sudo pip install picamera
sudo pip install rpio
sudo apt-get install python-serial
3. Download Tony’s base project here from GitHub. Note that I deleted all the negative training images, or the final script takes an age to run.
4. Take pictures of yourself – type c (and press Enter) to capture about 10–12 images in different orientations and with different expressions.
sudo python capture-positives.py
5. Build the training file using the train.py script from Tony’s project
sudo python train.py
6. Add my rapiro.py file to the folder
7. Try the script out
sudo python rapiro.py
8. If it works, add it to a bash script that runs on startup (make sure you are in the pi home folder)
sudo nano rapiroscript
type this line:
sudo python rapiro.py
(save and exit: Ctrl+X, Y, Enter)
then make the script executable:
chmod +x rapiroscript
sudo nano .bashrc
Scroll down to the bottom and add the line:
./rapiroscript
(save and exit: Ctrl+X, Y, Enter)
9. Finally auto login
sudo nano /etc/inittab
1:2345:respawn:/sbin/getty 115200 tty1
comment out like so:
#1:2345:respawn:/sbin/getty 115200 tty1
then add below this line:
1:2345:respawn:/bin/login -f pi tty1 /dev/tty1 2>&1
10. You can add commands, or play around with some different ones, in the rapiro.py script
- #M1 – robot will move forward
- #M2 – robot will move backward
- #M3 – robot will turn right
- #M4 – robot will turn left
- #M5 – robot will raise and wave its left hand; LED will flash green
- #M6 – robot will lower its left hand; LED will turn yellow
- #M7 – robot will move both arms and contract its hands; LED will turn blue
- #M8 – robot will wave goodbye with its left arm; LED will turn red
- #M9 – robot will raise its right arm and move its waist; LED will turn blue
- #M0 – robot will go to initial position
- #PS00A000T010#PS00A180T010 – Head, full movement from side to side
- #PS01A000T010#PS01A180T010 – Waist
- #PS02A000T010#PS02A180T010 – r Shoulder
- #PS03A000T010#PS03A180T010 – r Arm
- #PS04A000T010#PS04A180T010 – r Hand
- #PS05A000T010#PS05A180T010 – l Shoulder
- #PS06A000T010#PS06A180T010 – l Arm
- #PS07A000T010#PS07A180T010 – l Hand
- #PS08A000T010#PS08A180T010 – r Foot yaw
- #PS09A000T010#PS09A180T010 – r Foot pitch
- #PS10A000T010#PS10A180T010 – l Foot yaw
- #PS11A000T010#PS11A180T010 – l Foot pitch
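The servo commands above follow a regular pattern: #PS<servo id>A<angle>T<time>. A small helper makes them easy to generate from a script (the function name is my own, not part of the Rapiro firmware):

```python
def servo_cmd(servo_id, angle, time_units):
    """Build a Rapiro servo command string, e.g. servo_cmd(0, 180, 10) -> "#PS00A180T010".

    servo_id: 0-11 (0 = head, 1 = waist, 2-7 = arms/hands, 8-11 = feet)
    angle: target angle in degrees, 0-180
    time_units: movement time in the units the firmware expects
    """
    if not (0 <= servo_id <= 11 and 0 <= angle <= 180):
        raise ValueError("servo_id must be 0-11 and angle 0-180")
    return "#PS{:02d}A{:03d}T{:03d}".format(servo_id, angle, time_units)

# Sweep the head (servo 0) from side to side, as in the first entry above:
sweep = servo_cmd(0, 0, 10) + servo_cmd(0, 180, 10)
print(sweep)  # #PS00A000T010#PS00A180T010
```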
or more here http://forum.rapiro.com/thread/21/
Minicom is very useful for testing commands over the serial link:
sudo apt-get install minicom
echo "#M6" | sudo minicom -b 57600 -o -D /dev/ttyAMA0
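The same commands can be sent from Python with pyserial (the python-serial package installed above); the port and baud rate match the minicom example, and the send_command helper name is my own:

```python
def send_command(cmd, port="/dev/ttyAMA0", baud=57600):
    """Send a single Rapiro command string over the serial link."""
    import serial  # from the python-serial package installed earlier
    conn = serial.Serial(port, baud, timeout=1)
    try:
        conn.write(cmd.encode("ascii"))  # e.g. "#M6" lowers the left hand
    finally:
        conn.close()

# send_command("#M6")  # uncomment on the Pi with the Rapiro connected
```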
And for viewing the PiCam on a Mac (after installing mplayer, e.g. via brew):
nc -l 5001 | mplayer -fps 31 -cache 1024 -   (run this on your Mac first)
raspivid -t 999999 -o - | nc 192.168.1.9 5001   (run this on your Pi, substituting your Mac's IP)
I’m currently prototyping a website for recording coral spawnings as an open online journal, where museums, universities and aquarium hobbyists can log when their corals breed and under what conditions, so that we can try to analyse trends.
Data will initially be plotted on Google Maps and in an annual journal with a rather nice interface. We have some fab long-term plans for the website.
This follows on from a concept from over two years ago (https://garygfletcher.wordpress.com/2012/12/01/reef-conservation-uk-using-citizen-science-to-journal-coral-spawning-in-hobbyist-and-public-aquaria/), so it’s nice to get to this stage, and it should be a very valuable resource.
Back in September we hosted our very own hackathon at the Horniman Museum and Gardens. The task was to build a temperature and lighting control system called LEACS (Laboratory Environmental Aquarium Control System) that would replicate the sunlight, moonlight and temperature of Fiji, in the hope that those environmental conditions would induce the spawning of corals in captivity. The aim is a greater understanding of the biology, working towards the conservation of these threatened coral environments.
It’s basically an ocean simulator for aquariums.
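As a rough illustration of the idea (my own sketch, not the actual LEACS code), a controller like this maps the time of day in Fiji onto a light intensity using a simple half-sine curve, with the sunrise and sunset hours as assumed parameters:

```python
import math

def daylight_intensity(hour, sunrise=6.0, sunset=18.0):
    """Very rough daylight curve: 0 at night, peaking at solar noon.

    hour: local time as a float (0-24).
    Returns a dimming level in the range 0.0-1.0.
    """
    if hour <= sunrise or hour >= sunset:
        return 0.0
    # Half-sine between sunrise and sunset
    return math.sin(math.pi * (hour - sunrise) / (sunset - sunrise))

print(daylight_intensity(12.0))  # 1.0 at solar noon
print(daylight_intensity(0.0))   # 0.0 at midnight
```

A real system would also track seasonal day length and lunar phase, but the principle is the same: compute the target level for "now" in Fiji and drive the lights to match.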
Well, since the hackathon we’ve learnt a lot, and as with all systems it has grown, developed and matured. That’s led us to LEACS V2 and to adopting the GertDuino board as part of the solution.
We’ll post some details of the hardware shortly. We’ve also been working on a website, currently in beta, where we hope to share project updates and the current environmental data we are using.
So please have a look at www.LEACS.net and bookmark it to stay up to date.
In addition, we’ve developed an iPhone app that we hope to expand, so you can follow project updates and check on environmental data when you’re bored! You can download it here.
You may have seen this post about some micro electronics I set up; well, here is the whole finished deal, and it looks fab!
In just over a week I put together a push-button screen information system. A microprocessor plugged into a computer drove a suite of relays to turn lights on and off and display the relevant product details on screen. Alongside that was a customised, branded iPad application for showing a product movie (with automatic reset to the beginning, plus pause and restart options), and finally a really cool touch screen kiosk application with a suite of buttons on the left that made some cool ripples on touch (a pond effect), along with some nice touch screen brochures and website links for people to browse. This was super fun and came together really nicely.
You’ll no doubt find this on the web via the numerous Hackspaces involved in this project, but here is my copy – the report shows the amazing technology and collaboration the project brought together, with some amazing results, as summarised in the document.
- Long term, deep water, satellite connected ocean monitoring system (raspberrypi.org)
- Blog post on the Chagos Trust website, from my expedition trip (garygfletcher.wordpress.com)
I was asked if it was feasible to automatically size aquatic animals using stereo vision. After doing some feasibility work, it quickly became an out-of-hours homework project. Stereo sizing has usually been done using disparity maps and blobs; this is a slightly different approach and it’s very exciting, with some amazing work taking place between me and a colleague in Canada – thanks for your help, Doug!
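For context on how stereo sizing works in principle (a textbook sketch with made-up numbers, not our actual approach or code): with a calibrated stereo pair, the distance to a point follows from its disparity, and a real-world length then follows from that distance and the length in pixels:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic stereo triangulation: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def real_length(pixel_length, depth_m, focal_px):
    """Convert an on-image length (pixels) at a known depth into metres."""
    return pixel_length * depth_m / focal_px

# Hypothetical numbers: 700 px focal length, 10 cm baseline, 35 px disparity
z = depth_from_disparity(700.0, 0.10, 35.0)   # ~2.0 m away
fish = real_length(140.0, z, 700.0)           # ~0.4 m long
print(z, fish)
```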