The Raspberry Beer'o'meter

If you're a regular reader, you already know we connected our BeerTender to Warp 10™, using a LoRa network and the MQTT plugin. And if it's the first time you hear about Warp 10, here is a short video to learn more about our open source time series database: watch the video.

That was a nice challenge, but the Raspberry Pi 3 in our server cabinet is really underloaded. And reading the beer level requires starting a computer or a smartphone, connecting to the local WiFi, and then launching a WarpScript™. Not very user-friendly, and that leads to catastrophic empty-beer-barrel situations. Raising a Slack alert is possible, but that alert might be ignored. We need a true IoT dashboard with a screen, right next to the BeerTender!

Screens for the Raspberry Pi 3: there are a lot of them. I bought a cheap 3.5-inch 480x320 one.

Raspberry screen

It is time to design a nice dashboard... I need the beer level, the beer temperature, and a LoRa alert in case of network failure. Easy to sketch!

The design intention

Building a time series dashboard with a Raspberry Pi?

The Web expert way:

  • Our web expert: no problem, I will do this in a web app, with npm, webpack, babel, react, packaged in Electron with a Node.js backend!
  • Me: OK, so you need X.Org, maybe a window manager, plus 300 MB of free RAM, to launch a 100 MB binary... to display a beer level.
  • Our web expert: But with CSS, I can draw animated bubbles in the barrel!
  • Me: Why not a WebGL shader...
  • Our web expert: Great! Good idea!
  • Me: ...

My way:

When I was working in the automotive industry, I used to screenshot the framebuffer of high-end TFT dashboards, because there was no graphics layer at all.

Maybe Warp 10 could draw Processing images directly into the Raspberry Pi framebuffer?

Warp 10 ships with a large subset of the Processing functions: everything is ready to handle and generate images. I just need a way to display them on the tiny screen.

Prepare the Raspberry Pi

Installation is straightforward: connect the screen, compile and install the LCD driver.

# git clone https://github.com/goodtft/LCD-show.git
# chmod -R 755 LCD-show/
# cd LCD-show/
# ./MHS35-show 

Reboot. The screen now shows console messages:

Raspberry 3.5 inch ready!

That's nice. I just have to remove the blinking cursor by adding vt.global_cursor_default=0 to /boot/cmdline.txt.
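One way to do that, sketched below, is to append the parameter to the single line of /boot/cmdline.txt with sed (the -i.bak flag keeps a backup), then reboot:

sudo sed -i.bak 's/$/ vt.global_cursor_default=0/' /boot/cmdline.txt
sudo reboot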

Taking a screenshot of the framebuffer is straightforward: cat /dev/fb0 > screenshot.raw

Converting it to PNG is a bit trickier, but can be done with ffmpeg: ffmpeg -vcodec rawvideo -pix_fmt rgb32 -s 480x320 -i screenshot.raw screenshot.png
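The reverse direction works too. A quick sketch, still with ffmpeg: scale any picture to 480x320, convert it to raw 32-bit BGRA pixels (the layout described in the next section), and dump the result straight into the framebuffer:

ffmpeg -i image.png -vf scale=480:320 -f rawvideo -pix_fmt bgra - > /dev/fb0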

Playing with the framebuffer

The framebuffer is a raw representation of the screen pixels: 32 bits per pixel, encoded as "BGRA" (endianness...). For a 480x320 screen, that means 480x320x4 = 614400 bytes, which is exactly the size of my screenshot.raw file.
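Both claims are easy to verify from the shell. A quick sanity check on the screenshot taken above:

stat -c%s screenshot.raw                  # prints 614400
head -c 16 screenshot.raw | od -An -tx1   # first 4 pixels, 4 bytes each, in B G R A order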

The first 4 bytes describe the color of the top left corner, and so on. Here are a few examples to fully understand how this works:

Write a black line (the 480 pixels of the top row)

for i in {1..480}; do printf '\x00\x00\x00\xff' ; done > /dev/fb0

Write colors in the first 3 pixels

printf '\xff\x00\x00\xff\x00\xff\x00\xff\x00\x00\xff\xff' > /dev/fb0

The first pixel should be blue (bytes FF 00 00 FF in B, G, R, A order), the second one green, and the last one red:

Magnified screen (thank you, good old Canon G10)

That's nice. Warp 10 can write directly to /dev/fb0 too... I just need to add the warp10 user to the video group:

# adduser warp10 video
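Keep in mind that group membership is only picked up by newly started processes, so Warp 10 has to be restarted before it can open /dev/fb0. A sketch, assuming Warp 10 runs as a systemd service named warp10:

# restart Warp 10 so the process gains the video group (the service name is an assumption)
systemctl restart warp10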

Processing2Framebuffer

It is time to develop a Warp 10 extension to do the main job:

  • Read the Processing image pixels
  • Convert Processing pixels (ARGB) to framebuffer pixels (BGRA)
  • Write them into /dev/fb0

The result is a straightforward function of about 30 lines:

import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.file.Files;
import java.nio.file.StandardOpenOption;

import io.warp10.script.NamedWarpScriptFunction;
import io.warp10.script.WarpScriptException;
import io.warp10.script.WarpScriptStack;
import io.warp10.script.WarpScriptStackFunction;

import processing.core.PGraphics;

public class PtoFramebuffer extends NamedWarpScriptFunction implements WarpScriptStackFunction {
  
  public PtoFramebuffer(String name) {
    super(name);
  }
  
  @Override
  public Object apply(WarpScriptStack stack) throws WarpScriptException {
    Object fbpath = stack.pop();
    Object pimage = stack.pop();
    
    if (pimage instanceof PGraphics && fbpath instanceof String) {
      PGraphics pg = (PGraphics) pimage;
      pg.loadPixels();
      ByteBuffer bytes = ByteBuffer.allocate(pg.width * pg.height * 4); // 4 bytes per pixel
      for (int pixel : pg.pixels) {                   // Processing pixels are ARGB ints
        bytes.put((byte) (pixel & 0xFF));             // blue
        bytes.put((byte) ((pixel & 0xFF00) >> 8));    // green
        bytes.put((byte) ((pixel & 0xFF0000) >> 16)); // red
        bytes.put((byte) 0);                          // alpha, ignored by the framebuffer
      }
      File file = new File((String) fbpath);
      try {
        Files.write(file.toPath(), bytes.array(), StandardOpenOption.WRITE); // /dev/fb0 already exists, CREATE_NEW would fail
      } catch (IOException e) {
        throw new WarpScriptException("Cannot write file " + file.toString());
      }
    } else {
      throw new WarpScriptException(getName() + " expects a PGraphics instance and a STRING framebuffer path on top of the stack.");
    }
    
    return stack;
  }
}

The full code is available here.

The first WarpScript:

// @endpoint http://raspberry:8080/api/v0/exec
// @preview image
480 320 '2D' PGraphics
0xff Pbackground               // white background
  'data:image/png;base64,iVBORw0KGgoAAA... very long string!  ...TkSuQmCC'
Pdecode 90 10 Pimage           // decode the warp10 logo from a base64 png
'CENTER' PtextAlign            // center text
0xffff0000 Pfill               // red text
30 PtextSize                   // 30 px font size
'+ Frame buffer extension' 240 190 Ptext
0xff00c800 Pfill               // green text
40 PtextSize                   // 40 px font size
'= no X.Org needed !' 240 250 Ptext
DUP                            // duplicate image reference
'/dev/fb0' PtoFrameBuffer      // display!
Pencode                        // also return a base64 png image

And the first image:

Processing to framebuffer in a Warp 10 extension
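The @endpoint comment above points at the exec API of the Raspberry Pi, so an IDE is not even required to run the script: a plain HTTP POST does the job. A minimal sketch, assuming the script is saved as dashboard.mc2:

# run the script on the Raspberry Pi and print the resulting stack as JSON
curl -s -X POST --data-binary @dashboard.mc2 'http://raspberry:8080/api/v0/exec'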

Get it from WarpFleet!

I published this extension on WarpFleet. Installation is really easy once the WarpFleet CLI is installed. Open a terminal in your Raspberry Pi's Warp 10 directory, then fetch the extension from WarpFleet:

# install npm
curl -sL https://deb.nodesource.com/setup_12.x | bash -
apt-get install -y nodejs
# install WarpFleet
npm install -g @senx/warpfleet
# install the extension
cd /opt/warp/
wf g fr.couincouin processingToFramebuffer --confDir=etc/conf.d --macroDir=macros/ --libDir=lib/
Easy as WarpFleet

(How to publish an extension will be the subject of my next blog post.)

Performance

Since Warp 10 2.1, timing is very easy: surround the code with CHRONOSTART and CHRONOEND, then call CHRONOSTATS to get the results.

'Framebuffer draw' CHRONOSTART
'/dev/fb0' PtoFrameBuffer //display!
'Framebuffer draw' CHRONOEND
Pencode              //also return a base64 png image
CHRONOSTATS

PtoFrameBuffer takes around 25 ms. Nice!

Video in a GTS?

Since Warp 10 2.1, you can store binary values in a GTS... which means you can play a video stored in a GTS. Because why not?

Extract JPEG images from the video and encode them in base64 to build the GTS input format, with the binary image as the value (a sketch of the whole pipeline follows the sample below):

1// imagesequence{title=introEtchASketch} b64:_9j_4AAQSkZJRgABAgAAAQABAAD__g...
=2// b64:_9j_4AAQSkZJRgABAgAAAQABAAD__gARTGF2YzU3LjEwNy4xMDAA_9sAQwAIBAQEBAQ...
=3// b64:_9j_4AAQSkZJRgABAgAAAQABAAD__gARTGF2YzU3LjEwNy4xMDAA_9sAQwAIBAQEBAQ...
=4// b64:_9j_4AAQSkZJRgABAgAAAQABAAD__gARTGF2YzU3LjEwNy4xMDAA_9sAQwAIBAQEBAQ...
=5// b64:_9j_4AAQSkZJRgABAgAAAQABAAD__gARTGF2YzU3LjEwNy4xMDAA_9sAQwAIBAQEBAQ...
=6// b64:_9j_4AAQSkZJRgABAgAAAQABAAD__gARTGF2YzU3LjEwNy4xMDAA_9sAQwAIBgYHBgc...
=7// b64:_9j_4AAQSkZJRgABAgAAAQABAAD__gARTGF2YzU3LjEwNy4xMDAA_9sAQwAICAgJCAk...
=8// b64:_9j_4AAQSkZJRgABAgAAAQABAAD__gARTGF2YzU3LjEwNy4xMDAA_9sAQwAICgoLCgs...
=9// b64:_9j_4AAQSkZJRgABAgAAAQABAAD__gARTGF2YzU3LjEwNy4xMDAA_9sAQwAIBAQEBAQ...
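Here is a rough sketch of how such an input file could be built and pushed. It assumes ffmpeg and GNU coreutils are available; the video name, the 10 fps extraction rate, the output file and the write token are all placeholders:

# extract 10 frames per second from the video, scaled to the screen size
ffmpeg -i intro.mp4 -vf "fps=10,scale=480:320" frame_%04d.jpg

# build the GTS input format: one timestamp per frame, URL-safe base64 payloads
ts=1
for f in frame_*.jpg; do
  b64=$(base64 -w0 "$f" | tr '+/' '-_' | tr -d '=')
  if [ "$ts" -eq 1 ]; then
    echo "$ts// imagesequence{title=introEtchASketch} b64:$b64"
  else
    echo "=$ts// b64:$b64"
  fi
  ts=$((ts + 1))
done > images.gts

# push everything to the Raspberry Pi Warp 10 instance
curl -H 'X-Warp10-Token: writeToken' --data-binary @images.gts 'http://raspberry:8080/api/v0/update'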

In WarpScript, FETCH your data and feed it to a simple decoder:

[ 'readToken' 'imagesequence' { 'title' 'introEtchASketch' } NOW -1000 ] FETCH 0 GET 
SORT VALUES 
<%
  'iso-8859-1' ->BYTES 'imagebytes' STORE
  $background
  $imagebytes Pdecode             // decode jpg
  0 0 Pimage                      // paste it on the background
  '/dev/fb0' PtoFrameBuffer       // display!
%> FOREACH
The Warp 10 animation is stored in a GTS...

You can reach 10 fps without any kind of optimization.

Conclusion

Warp 10 Processing functions let you draw any kind of image from within WarpScript. Displaying an image on tiny hardware doesn't require X.Org or any kind of graphics acceleration: just push the pixels into the framebuffer. Writing a WarpScript extension to dump an image object to the framebuffer is a one-hour effort that saves a lot of time/CPU/RAM on your embedded hardware.

That's a nice dashboard, all in WarpScript, without any graphics layer.

If you like this, star us on GitHub!
