So I’m working on a one-pixel camera project with the wee lads. A ‘photo’ from a one-pixel camera looks odd. It makes images of time, rather than space. Here is a ‘photo’ of an hour I spent working on some Arduino stuff just now.


Each pixel represents one second. Each row represents one minute. The 60 x 60 block represents an hour. What can you tell from this photo? Well, you can tell that I probably wasn’t outside from the lack of blues or greens. Something orange turned up late in the hour. Probably a wee lad in an orange fleece. Something red popped up from time to time. Probably my notebook.

The camera for this little test was just the camera on my Mac. I used Processing to grab a pixel from the centre of the frame once a second and plot it to a grid.
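The same logic can be sketched in Python (the language the next version uses). This is a minimal, hedged sketch: `fake_frames` stands in for the real camera feed, and the names here are illustrative, not the actual sketch's code.

```python
GRID = 60  # 60 seconds per row, 60 rows per hour


def centre_pixel(frame):
    """Return the RGB tuple at the centre of a frame (a list of rows of RGB tuples)."""
    h = len(frame)
    w = len(frame[0])
    return frame[h // 2][w // 2]


def record_hour(frame_source):
    """Sample one centre pixel per 'second' and arrange them into a 60 x 60 grid.

    frame_source is any iterable yielding one frame per second; in a real
    sketch you would grab a frame from the camera and sleep between samples.
    """
    pixels = [centre_pixel(frame)
              for _, frame in zip(range(GRID * GRID), frame_source)]
    # One row of the grid per minute: 60 pixels wide, 60 rows tall.
    return [pixels[row * GRID:(row + 1) * GRID] for row in range(GRID)]
```

Plotting each row as it fills gives you the minute-by-minute image building up in real time.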

The next version uses Python, a Raspberry Pi, the Pi camera module, and a custom Lego case for aiming the camera. It uses Phant for data logging, so that I can throw the pixels at a server and then process them into images or animations elsewhere. I’m mainly experimenting with exposure settings to get a balance of pleasing and accurate brightness through a 24-hour cycle.
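Logging to Phant is just an HTTP request per sample. A rough sketch of building that request, assuming a stream with fields `r`, `g`, `b`, and `ts` (the field names, host, and keys below are placeholders, not the project's actual stream):

```python
from urllib.parse import urlencode

PHANT_HOST = "https://data.sparkfun.com"  # placeholder; any Phant server works


def phant_input_url(public_key, private_key, r, g, b, ts):
    """Build the Phant /input URL that logs one pixel sample.

    Phant accepts simple GET or POST requests to /input/<public_key>
    with the private key and field values as parameters.
    """
    params = urlencode({"private_key": private_key,
                        "r": r, "g": g, "b": b, "ts": ts})
    return f"{PHANT_HOST}/input/{public_key}?{params}"
```

On the Pi, the loop would pair this with the camera library to grab a frame, pull the centre pixel, and fire the URL off with `urllib.request` once a second.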

Testing takes time when you’re taking photos of time.