Development Work

Getting Jenkins and Laravel to play together for continuous integration via github

Posted on Updated on

Well, what is continuous integration? Google says this:

Continuous Integration (CI) is a development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early.

So I like Laravel: it’s an elegant PHP framework with loads of great documentation, and PHP has no shortage of resources all over the world. Essentially, it is not a new or niche language where you’ll struggle to find people or have to pay high prices for vogue specialists.

So if we are using Laravel and GitHub, it makes sense to update our production website when we commit or merge back to our master branch.

Step up Jenkins. The Jenkins website describes it like this:

The leading open source automation server, Jenkins provides hundreds of plugins to support building, deploying and automating any project.

So let’s start with the building and deploying parts of automation. I’m using Ubuntu, and you need to hit the command line (ssh).

1. Installing Jenkins

ssh -i /Users/{user}/Downloads/{your pem file}.pem   ubuntu@{server_ip}

sudo apt-get install default-jre

wget -q -O - {jenkins apt key URL} | sudo apt-key add -

sudo sh -c 'echo deb {jenkins apt repo URL} binary/ > /etc/apt/sources.list.d/jenkins.list'

sudo apt-get update

sudo apt-get install jenkins

/usr/bin/java -Djava.awt.headless=true -jar /usr/share/jenkins/jenkins.war --webroot=/var/cache/jenkins/war --httpPort=8080 --ajp13Port=-1

You should now be able to access your Jenkins install on port 8080, i.e. http://{server_ip}:8080

If you have problems, check the log (/var/log/jenkins/jenkins.log).


2. Installing Apache and PHP for Laravel

sudo apt-get install apache2

sudo apt-add-repository ppa:ondrej/php

sudo apt-get update

sudo apt install php7.1 php7.1-xml php7.1-mbstring php7.1-mysql php7.1-json php7.1-curl php7.1-cli php7.1-common php7.1-mcrypt php7.1-gd libapache2-mod-php7.1 php7.1-zip

3. Don’t forget composer

sudo apt install composer

4. Make an SSH key for GitHub

ssh-keygen -t rsa -b 4096 -C "{your email}"

eval "$(ssh-agent -s)"

ssh-add /home/ubuntu/.ssh/id_rsa

Add it to GitHub via your project’s deploy keys.

5. Tell Apache where your project is

sudo nano /etc/apache2/sites-available/000-default.conf

DocumentRoot /var/www/{project name}/public

<Directory /var/www/{project name}/public>
    Options Indexes FollowSymLinks
    AllowOverride All
    Require all granted
</Directory>

sudo systemctl restart apache2

6. Turn on mod_rewrite for your Laravel routes

sudo a2enmod rewrite && sudo service apache2 restart
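mod_rewrite matters because Laravel’s public/.htaccess sends every request that isn’t a real file or directory to index.php, the front controller. A simplified sketch of the relevant rules (Laravel ships a slightly fuller version of this file, so you shouldn’t normally need to write it yourself):

```apache
<IfModule mod_rewrite.c>
    RewriteEngine On

    # if the request doesn't match an existing directory or file,
    # hand it to Laravel's front controller
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ index.php [L]
</IfModule>
```

With mod_rewrite disabled these rules are skipped, which is why Laravel routes other than / return 404s under a default Apache install.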

7. Clone the GitHub project

cd /var/www/

sudo git clone git@github.com:{user}/{project name}.git

8. Don’t forget to create a .env file!

sudo cp /home/ubuntu/.env /var/www/{project name}/.env

9. Check your Laravel install; it should be working now 🙂 debug it if it isn’t!

10. Configure Jenkins with GitHub: install the GitHub plugin


11. Add a project and then connect it to your GitHub project


12. Now configure your build automation


Here is the script I used. You should add unit tests in here somewhere, and perhaps not freshly migrate and seed on every deployment unless you’re prototyping.

sudo rm -rf /home/ubuntu/{project_name}/
sudo -u ubuntu git clone git@github.com:{user}/{project_name}.git /home/ubuntu/{project_name}/

sudo rm -rf /var/www/{project_name}/

sudo mv /home/ubuntu/{project_name}/ /var/www/{project_name}/
sudo cp /home/ubuntu/.env /var/www/{project_name}/.env

sudo chown -R www-data:www-data /var/www/{project_name}/
sudo chmod -R 755 /var/www/{project_name}/storage

cd /var/www/{project_name}/

sudo composer install
sudo php artisan migrate:fresh --seed


note: the script runs commands as sudo, so the jenkins user will need a passwordless sudo entry


sudo nano /etc/sudoers
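As a sketch, assuming the build runs as the jenkins user, the entry implied here is a passwordless sudo rule like the one below. A blanket rule like this is convenient for prototyping but far too broad for production (scope it to the specific deploy commands instead), and sudo visudo is safer than nano because it checks the syntax before saving:

```
# allow the jenkins user to run the deploy script's sudo commands
# without being prompted for a password
jenkins ALL=(ALL) NOPASSWD: ALL
```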



13. That is it!

You should now be able to push a commit to your master branch on GitHub and see your server update automatically. Look at the build info and console log to debug.



Capturing time-lapse from mjpg-streamer or any jpg using AWS Lambda

Posted on Updated on

We have a Raspberry Pi running mjpg-streamer through OctoPrint for checking the status of a remote 3D printer. It is also a pretty unusual print, and unusually large, so I thought we should capture the process for posterity.

I really did not want to spin up a server to capture the 96-hour print (one image every 60 seconds), so I thought: why not write an AWS Lambda function (serverless!)? It would be much cheaper and easier: no command-line crons, no server to set up and SSH into, etc.

Here is how I did it:

Firstly, rather than the stream action, we need to use the snapshot action (action=snapshot).

Next we create an S3 bucket to save our images; I called mine “timelapse-camera-1”.


Following that, we move on to AWS Lambda and set up our role and function. I went into the IAM menu, then Roles, and manually attached this policy.


I then wrote the following code in Lambda, saved it and tested it.

from __future__ import print_function

import boto3
import time
import urllib

BUCKET_NAME = "timelapse-camera-1"
TMP_FILE = "/tmp/tmp.jpg"

def lambda_handler(event, context):
    try:
        # grab a single frame from the mjpg-streamer snapshot endpoint
        camera_file = urllib.URLopener()
        camera_file.retrieve("http://000.000.000/webcam/?action=snapshot", TMP_FILE)

        # timestamp the key so every capture gets a unique name
        timestr = time.strftime("%Y%m%d-%H%M%S")

        # upload the frame to the S3 bucket
        s3 = boto3.resource('s3')
        s3.meta.client.upload_file(TMP_FILE, BUCKET_NAME, timestr + '.jpg')

        return 'saved ' + timestr + '.jpg'
    except Exception as e:
        raise e

Next we need to run our function every minute by triggering Lambda from an AWS CloudWatch Events schedule rule (a rate(1 minute) schedule expression with the function as its target).

There we go, easy 🙂 just remember to turn it off when you’re done.
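One handy side effect of the %Y%m%d-%H%M%S key naming: when you later pull the frames down to stitch the time-lapse together, a plain string sort puts them in capture order, because the format runs from the largest unit (year) down to the smallest (second). A tiny sketch (the file names here are made up):

```python
# keys produced by time.strftime("%Y%m%d-%H%M%S") sort chronologically
# with an ordinary lexicographic sort, so no date parsing is needed
frames = [
    "20180525-093012.jpg",
    "20180524-221500.jpg",
    "20180525-093011.jpg",
]
ordered = sorted(frames)
print(ordered[0])  # -> 20180524-221500.jpg (the earliest capture)
```

That means you can feed the downloaded files straight to a video encoder in directory-listing order.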

Prototyping for Sealife Centre Interpretation 

Posted on Updated on

The National Sealife Centre wanted to try something a bit different and meet a tight budget, with touch-screen interpretation that was cheap and very quick to update, easy to replace, and looked different.

I came up with some laser-cut sea life and a budget touchscreen; the laser-cut model would be covered in a vinyl print for extra effect.

The touch screens are managed by a central Linux-based system which can be managed online quickly and easily with little IT knowledge, making a somewhat budget yet effective solution.

Rapiro facial recognition with OpenCV and Raspberry Pi

Posted on Updated on

I decided to have a go at doing something fun with my Kickstarter Rapiro this evening, and got some inspiration from Tony DiCola on the Adafruit blog.

So here are the steps I took and the script I put together for my Rapiro Facial Recognition 3am hack.

1. Install OpenCV

sudo apt-get update

sudo apt-get install build-essential cmake pkg-config python-dev libgtk2.0-dev libgtk2.0 zlib1g-dev libpng-dev libjpeg-dev libtiff-dev libjasper-dev libavcodec-dev swig


wget {opencv-2.4.7 source URL}

tar zxvf opencv-2.4.7.tar.gz

cd opencv-2.4.7

cmake .

make

note: the make command takes about 5 hours!

sudo make install

2. Install Python Dependencies

sudo apt-get install python-pip

sudo pip install picamera

sudo pip install rpio

sudo apt-get install python-serial

3. Download Tony’s base project from GitHub. Note that I deleted all the negative training images, or it takes an age to run the final script.

4. Take pictures of yourself: type c (and press enter) to capture about 10-12 images in different orientations and with different expressions.

sudo python

5. Build the training file


6. Add my file to the folder

7. Try the script out

sudo python

8. If it works, add it to a bash script on startup (make sure you are in the pi home folder)

cd ~

sudo nano rapiroscript

type these lines:

cd pi-facerec
sudo python

(save and exit: Ctrl+X, Y, Enter)

sudo nano .bashrc

Scroll down to the bottom and add the line:

./rapiroscript

(save and exit: Ctrl+X, Y, Enter)

9. Finally auto login

sudo nano /etc/inittab

find: 1:2345:respawn:/sbin/getty 115200 tty1

comment it out like so: #1:2345:respawn:/sbin/getty 115200 tty1

then add below this line: 1:2345:respawn:/bin/login -f pi tty1 </dev/tty1 >/dev/tty1 2>&1

10. You can add commands or play around with some different ones in the script

  • #M1 – robot will move forward
  • #M2 – robot will move backward
  • #M3 – robot will turn right
  • #M4 – robot will turn left
  • #M5 – robot will raise and wave its left hand; the LED will flash green
  • #M6 – robot will lower its left hand; the LED will turn yellow
  • #M7 – robot will move both arms and contract its hands; the LED will turn blue
  • #M8 – robot will wave goodbye with its left arm; the LED will turn red
  • #M9 – robot will raise its right arm and move its waist; the LED will turn blue
  • #M0 – robot will go to its initial position
  • #PS00A000T010#PS00A180T010 – full head movement from side to side
  • #PS01A000T010#PS01A180T010 – waist
  • #PS02A000T010#PS02A180T010 – right shoulder
  • #PS03A000T010#PS03A180T010 – right arm
  • #PS04A000T010#PS04A180T010 – right hand
  • #PS05A000T010#PS05A180T010 – left shoulder
  • #PS06A000T010#PS06A180T010 – left arm
  • #PS07A000T010#PS07A180T010 – left hand
  • #PS08A000T010#PS08A180T010 – right foot yaw
  • #PS09A000T010#PS09A180T010 – right foot pitch
  • #PS10A000T010#PS10A180T010 – left foot yaw
  • #PS11A000T010#PS11A180T010 – left foot pitch
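The servo strings in the list above all follow the same fixed pattern, #PS{id}A{angle}T{time}, so a tiny helper can generate them. A quick sketch (the servo_cmd helper is my own, not part of the Rapiro firmware, and it keeps the T parameter at the 010 used throughout the list):

```python
def servo_cmd(servo, angle, duration=10):
    """Build a Rapiro servo command string like #PS00A180T010:
    servo id, target angle in degrees, time parameter, all zero-padded."""
    return "#PS%02dA%03dT%03d" % (servo, angle, duration)

# sweep the head (servo 0) from one side to the other,
# matching the first #PS entry in the list above
sweep = servo_cmd(0, 0) + servo_cmd(0, 180)
print(sweep)  # -> #PS00A000T010#PS00A180T010
```

You can then echo the generated string over the serial port exactly as in the minicom example further down.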

or find more in the Rapiro documentation.


This is very useful for testing

sudo apt-get install minicom

echo "#M6" | sudo minicom -b 57600 -o -D /dev/ttyAMA0

And for viewing the PiCam on a Mac

Run this on your Mac (after installing brew, then running brew install mplayer):

nc -l 5001 | mplayer -fps 31 -cache 1024 -

Then run this command on your Pi, substituting your Mac’s IP:

raspivid -t 999999 -o - | nc {mac ip} 5001

The Coral Spawning Journal

Posted on

I’m currently prototyping a website for recording coral spawnings as an open online journal for museums, universities and aquarium hobbyists, to log when their corals breed, and under what conditions, so that we can try to analyse trends.

Data will initially be plotted on Google Maps and in an annual journal with a quite nice interface. We have some fab long-term plans for the website.

This leads on from a concept from over 2 years ago, so it’s nice to get to this stage, and it should be a very valuable resource.


Posted on Updated on

Back in September we hosted our very own hackathon at the Horniman Museum and Gardens, with the task of building a temperature and lighting control system called LEACS (Laboratory Environmental Aquarium Control System). It replicates sunlight, moonlight and temperature from Fiji, in the hope that the environmental conditions will induce the spawning of corals in captivity, so we can gain a greater understanding of their biology and work towards the conservation of these threatened coral environments.

It’s basically an ocean simulator for aquariums.

Well, since the hackathon we’ve learnt a lot, and as with all systems they grow, develop and mature. That’s led us to LEACS V2, and to adopting the GertDuino board as part of the solution.

We’ll post some details of the hardware shortly, but we’ve also been working on a website that’s in the beta stage at the moment, where we hope to share project updates and the current environmental data we are using.

So please have a look and bookmark it to stay up to date.


In addition, we’ve also developed an iPhone app, which we hope to expand, so you can see project updates and check on environmental data when you’re bored! You can download it here.


TMC Exhibition Stand

Posted on Updated on

You may have seen this post of some micro-electronics I set up; well, here is the whole finished deal, and it looks fab!

In just over a week I put together a push button screen information system which used a microprocessor plugged into a computer and suite of relays to turn lights on and off and display the relevant product details on screen as well as a customised iPad application for showing a product movie (automatically reset to begining, pause and restart options which was branded) and finally a really cool touch screen kiosk application which had a sweet of buttons on the left which made some cool ripples on touch (pond effect) and implemented some nice touch screen brochures and website links for people to browse. This was super fun and came together really nicely.