MJPEG Streams, part 1




One of the side projects I'm currently working on is a photo booth.

Not the kind of photo booth that you sit in to take a passport photo, but the kind that's set up at occasions like weddings or corporate events.

People can stand in front of it, take a picture, and either get a print to take home or download their group shot immediately.

A brief bit of history

To cut a long story short, there are businesses that offer this as a service, but it isn't cheap (~£500 a day) and often requires an attendant to be present throughout the hire. I wanted something cheaper that could be operated entirely by the folks wanting to take a picture. So I decided to build my own.

The tech

I've always tinkered with Raspberry Pis, and through some kind of happy coincidence, the Raspberry Pi HQ Camera module was released in 2020. It allowed you to pick from a variety of lenses, and offered a much higher-quality 12MP camera capable of taking some really nice shots.

So with a bit of hacking together of Python, some Go and TypeScript, I'd put together a fairly rough-and-ready implementation of a Raspberry Pi-powered photo booth that worked really well.


Fast forward a little bit, and Arducam released a 16MP camera module which didn't require a dedicated lens and came in a much smaller form factor, but still offered excellent picture quality. I picked one of these up, and at the same time decided to rework some of the code that powered the photo booth.

A live preview

One of the important features of the photo booth is for the users to be able to see a live preview of themselves in the frame. It wouldn't be much good trying to take a picture of a group without knowing if you're even in the frame!

This is possible by using the camera connected to the Raspberry Pi in video mode, though there are various encodings you can choose for that video stream. At the time of writing, there are four:

  • h264
  • mjpeg
  • yuv420
  • libav

h264 is the default, but there's a bit of context that needs explaining here. The video stream needs to be displayed by a web browser, as that is where the user interface for the entire photo booth lives. Of these options, mjpeg is natively supported by Chrome (the browser I use for displaying the frontend), whereas the others are not without jumping through some hoops.

Sending the mjpeg to the browser

Unfortunately, MJPEG doesn't really follow a standard. In the case of the Raspberry Pi's libcamera-vid program, it writes concatenated JPEG files to the output stream. There are no headers or boundary markers outside of the regular JPEG data format.
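That means the first job is cutting those back-to-back JPEGs apart. Here's a minimal Go sketch of one way to do it, by pairing up the JPEG start-of-image and end-of-image markers — splitJPEG is my own illustrative helper, not anything from libcamera:

```go
package main

import (
	"bufio"
	"bytes"
	"fmt"
	"io"
)

var (
	soi = []byte{0xFF, 0xD8} // JPEG start-of-image marker
	eoi = []byte{0xFF, 0xD9} // JPEG end-of-image marker
)

// splitJPEG is a bufio.SplitFunc that cuts a stream of concatenated
// JPEGs into individual images. Inside JPEG scan data a 0xFF byte is
// always escaped, so a bare EOI reliably ends a frame (an embedded
// thumbnail carrying its own EOI would break this assumption, but I
// haven't seen libcamera-vid emit one).
func splitJPEG(data []byte, atEOF bool) (int, []byte, error) {
	start := bytes.Index(data, soi)
	if start < 0 {
		if atEOF {
			return len(data), nil, nil // discard trailing junk
		}
		return 0, nil, nil // need more data
	}
	end := bytes.Index(data[start:], eoi)
	if end < 0 {
		if atEOF {
			return 0, nil, io.ErrUnexpectedEOF // truncated final frame
		}
		return start, nil, nil // drop bytes before SOI, keep reading
	}
	frameEnd := start + end + len(eoi)
	return frameEnd, data[start:frameEnd], nil
}

func main() {
	// Two tiny fake "frames" back to back, as libcamera-vid writes them.
	stream := []byte{0xFF, 0xD8, 0x01, 0x02, 0xFF, 0xD9, 0xFF, 0xD8, 0x03, 0xFF, 0xD9}
	s := bufio.NewScanner(bytes.NewReader(stream))
	s.Buffer(make([]byte, 0, 64*1024), 16*1024*1024) // real frames are large
	s.Split(splitJPEG)
	for s.Scan() {
		fmt.Println("frame of", len(s.Bytes()), "bytes")
	}
}
```

In practice the reader would be the camera process's stdout rather than an in-memory byte slice, but the splitting logic is the same.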

So, in order to construct an MJPEG stream that the browser can understand, I needed to take each JPEG image returned on that stream, construct an HTTP response with the Content-Type header value of multipart/x-mixed-replace; boundary=FRAME, and then continually stream those individual images to the client, each part preceded by the boundary and its own headers:

--FRAME
Content-Type: image/jpeg
Content-Length: <size of the JPEG data>

<JPEG data>


This content could then be placed in an HTML img tag, and the live stream preview would work without any fooling around.

The solution

This is already a long blog post, so I'm going to split it up.

The next post will cover how I actually took that MJPEG stream and built an HTTP-compliant response that can be displayed in an img tag, including some of the gory details of the JPEG image file format itself.