Kurisu Xmas Stream - a look behind the scenes

Last year, I had an idea: buy a small cypress tree, hang some RGB LEDs on it, and make them controllable from the interwebs. Because of unrelated problems, I didn't manage to finish the project back then, so I tried again this December. In this article I'll take a look at the more technical side of things behind the Kurisu Xmas Stream. I highly recommend checking out the stream before reading the rest of the article. EDIT: The stream went down. It's not the same experience, but you can watch a short part of it saved on YouTube.

Stripping those without a wire stripping tool was a real pain...

The Blinkenlights

First, I took care of the LED side of things. After getting a working electrical setup (basically a lot of cables - as seen in the picture above - and a resistor connected to the Raspberry Pi's GPIO header), it was time to code something that would actually turn on the lights. On the server side there's `leds.php`, which takes the user input, checks whether the user is a spambot, and if all the checks pass, dumps the list of LEDs that should be lit into a file. I'm not massively proud of that code, as it has accumulated a lot of hotfixes, but on the Raspberry side there's this cool bash script that parses the output..!

```bash
#!/bin/bash
# wiringPi pin numbers for each LED's R/G/B channel
R1=24;G1=25;B1=28
R2=30;G2=21;B2=29
R3=31;G3=10;B3=6
R4=27;G4=22;B4=23
R5=11;G5=14;B5=13

while true; do
    all='29 28 27 25 24 23 22 21 30 31 11 10 6 13 14'
    # The server returns channel names; ${!i} resolves each name to its pin number
    for i in $(curl URL); do
        j=${!i}
        all=`echo $all | sed -s "s/$j//g"`   # drop this pin from the "off" list
        gpio write ${!i} 0                   # common anode: 0 = LED on
    done
    # Everything that wasn't requested gets turned off
    for i in $all; do
        gpio write $i 1
    done
    sleep 3
done
```

This rather lengthy (for what it does) script handles everything on the Raspberry side of things. First, it sets hardcoded values for the LED pins, then it downloads the data from my VPS via curl, and after parsing it (while taking care to avoid unnecessary LED flashes) it calls the `gpio` command (from the wiringPi library) to set the outputs. Because my RGB LEDs are common anode (they share the positive lead), writing `1` to an output actually turns that LED off. The rest of the script should be self-explanatory.
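To make the parsing a bit clearer: judging by the loop, the endpoint returns a whitespace-separated list of channel names (something like `R1 G3 B5` - the exact format isn't shown here), and bash's indirect expansion `${!i}` turns each name into the corresponding pin number. A tiny standalone demonstration:

```bash
# Minimal illustration of the indirect expansion trick - the names and the
# "R1 G3 B5" response format are assumptions based on how the script uses them
R1=24; G3=10; B5=13

for i in R1 G3 B5; do
    echo "$i -> pin ${!i}"   # ${!i} expands to the value of the variable named $i
done
# prints: R1 -> pin 24, G3 -> pin 10, B5 -> pin 13
```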

The Stream

At first, the plan was to stream directly from the Raspberry Pi that controlled the LEDs, using the Pi Camera. Unfortunately, my PiCam died while testing the stream last year, probably due to improper handling and ESD (static electricity). Surprisingly, the last moments of that PiCam were captured on a test stream. My next idea was to use a webcam that I took out of a ThinkPad X220 - it was small, it had good enough video quality, but most importantly, it used USB, so connecting it to a raspi was a fairly easy task - I only had to add a 3.3V voltage regulator and a USB plug, and I was good to go!

The real problems started when I actually wanted to start streaming. After modifying my previous ffmpeg command to accommodate the different video device, it quickly became apparent that my Raspberry Pi 2B just wasn't fit for the task of transcoding an HD video stream - the Pi was nowhere near 25 frames per second. I could have settled for a lower framerate (for a stream like this even 1 fps would suffice), but YouTube's system kept complaining about the stream's health. Back to the drawing board!
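For context, that kind of setup boils down to something along these lines - this is a rough sketch rather than the exact command I used, and the device path, resolution and stream key are placeholders:

```bash
# Sketch: push a USB webcam to YouTube over RTMP with a silent audio track
# (/dev/video0, 1280x720 and STREAM_KEY are placeholders)
ffmpeg -f v4l2 -framerate 25 -video_size 1280x720 -i /dev/video0 \
       -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 \
       -c:v libx264 -preset ultrafast -b:v 2500k -pix_fmt yuv420p \
       -c:a aac -b:a 128k \
       -f flv rtmp://a.rtmp.youtube.com/live2/STREAM_KEY
```

The software x264 encode is the part that brings a Pi 2B to its knees here.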

The WIP alpha² physical layout - I later replaced the routerboard box with something that wouldn't randomly slide away

New idea: use dedicated hardware for the stream transcoding. For this year's Google Code-in, I had a task for which I wrote a guide about installing Haiku on VMware ESXi. That left me with a laptopserver (basically a ThinkPad X230 placed vertically) that didn't do much besides hosting a Haiku VM for me to connect to over VNC. I quickly set up a Linux VM on there, then set up PCI pass-through for the USB controllers. As the webcam I planned to use before gave me even more problems, I installed a random MJPEG streaming app on my old Android phone and got to work setting up OBS. Below are some stats of the USB tethering interface used to transfer the MJPEG data from my phone to the VM, as seen while writing this post - that screenshot says something about RNDIS tethering stability compared to MTP or other Android file transfer solutions...

(surprisingly) THIS IS FINE
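Sanity-checking that kind of MJPEG stream from the VM before pointing OBS at it is simple enough - the address, port and path below are made up and depend on the app and on what IP the tethering interface gets:

```bash
# Hypothetical URL - adjust to whatever the streaming app on the phone reports
ffplay -fflags nobuffer http://192.168.42.129:8080/video
```

In OBS itself, the same URL then goes into a Media Source.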

Quirk(s)

The audio part of the stream is actually generated on my VPS using a few bash scripts, ffmpeg and ffserver - it's a bit convoluted, but the end result is two streams, one of which gets fed into OBS (a rough sketch of that setup follows the script below). At first it seemed like it would just work, but I quickly realised that, because my ISP is terrible, the audio would drop out after around two to six hours. The fix is dumb, and I'm surprised it even works...

```bash
#!/bin/bash
# Dumb fix: restart OBS every two hours so the audio feed gets picked up again
while true; do
    obs --startstreaming &
    sleep 7200
    killall -9 obs
    sleep 1
done
```
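As for the audio setup mentioned above - the exact scripts will come with the rest of the code, but the general shape of an ffmpeg + ffserver combo like this looks roughly as follows. The port, file names and codec settings are placeholders, and the real thing serves two streams rather than one:

```bash
#!/bin/bash
# Sketch only - a minimal ffserver setup serving one MP3 stream

# Minimal ffserver config: one feed, one MP3 stream that OBS (or a listener) can pull
cat > ffserver.conf <<'EOF'
HTTPPort 8090
HTTPBindAddress 0.0.0.0

<Feed music.ffm>
    File /tmp/music.ffm
    FileMaxSize 5M
</Feed>

<Stream music.mp3>
    Feed music.ffm
    Format mp3
    AudioCodec libmp3lame
    AudioBitRate 128
    AudioChannels 2
    AudioSampleRate 44100
    NoVideo
</Stream>
EOF

ffserver -f ffserver.conf &

# Push a looping playlist into the feed; the result is then available at
# http://<vps>:8090/music.mp3
ffmpeg -re -stream_loop -1 -f concat -safe 0 -i playlist.txt \
       http://localhost:8090/music.ffm
```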

I've had a lot of fun preparing the hardware and software for this silly little thing! Provided it keeps working, the stream will last until early January. EDIT: The stream died. I plan to release the rest of the code behind it shortly.

Until then, I wish you a Merry Christmas and a Happy New Year!


