Monday, 25 July 2022

More MQTT Clients

We set up an MQTT broker on Home Assistant (HA) so that our Lily ESP32 super remote can communicate with it.  MQTT allows many different systems to interact, and clients can communicate with each other as well as with the broker.

Linux MQTT

Installing an MQTT client on the RPi is easy: I just install mosquitto-clients and can then send messages at the command line with mosquitto_pub.
If I use the same topic and message as Lily it has the same effect; in the example below I publish "button 3" to the topic esp32/volume and HA arranges for the volume to be turned up on my Sony amplifier.
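The command looks something like this (the broker hostname and credentials here are placeholders for whatever your HA Mosquitto add-on uses):

    mosquitto_pub -h homeassistant.local -p 1883 \
                  -u mqttuser -P mqttpassword \
                  -t esp32/volume -m "button 3"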



This is great for testing but it is unlikely that I will use the Linux command line much.  However I would like to send MQTT messages using a browser, as this allows me to send messages from a phone, iPad or PC.

Browser MQTT

The Eclipse Paho MQTT JavaScript client appears to be a popular choice for a browser JavaScript MQTT client, and Steve's Internet Guide provides a very clear example to get a client working.  The browser client communicates with the MQTT broker over websockets, and after some searching I found that in addition to the MQTT port 1883, HA supports websockets, with the broker websocket listener on port 1884.

Using Steve's Internet Guide it was easy to set up JavaScript MQTT connect / disconnect functions, associated with buttons on a simple webpage.  The handlers required to deal with connection success / failure and incoming messages are also simple to implement, so I can show the status of the connection and any messages received on the page.  Finally I provided "volume up" and "volume down" buttons to demonstrate that I can now control my amplifier volume by sending MQTT commands to HA, which tells the Broadlink IR sender to send volume up / down commands to the amplifier.
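The essential parts of the page script came out along these lines.  This is only a sketch of my page: it assumes the Paho library (mqttws31.js) has been loaded, the broker address and credentials are placeholders, and "button 4" as the volume-down message is an assumption to mirror "button 3".

    // assumes the Eclipse Paho JavaScript client (mqttws31.js) is loaded
    var client = new Paho.MQTT.Client("homeassistant.local", 1884, "web_" + Date.now());

    client.onConnectionLost = function (resp) {
      showStatus("Connection lost: " + resp.errorMessage);
    };
    client.onMessageArrived = function (msg) {
      showStatus(msg.destinationName + " : " + msg.payloadString);
    };

    function MQTTconnect() {                   // wired to the Connect button
      client.connect({
        userName: "mqttuser",                  // placeholder credentials
        password: "mqttpassword",
        onSuccess: function () { showStatus("Connected"); },
        onFailure: function (err) { showStatus("Failed: " + err.errorMessage); }
      });
    }

    function MQTTdisconnect() { client.disconnect(); }

    function volume(up) {                      // volume up / down buttons
      var msg = new Paho.MQTT.Message(up ? "button 3" : "button 4");
      msg.destinationName = "esp32/volume";
      client.send(msg);
    }

    function showStatus(text) {
      document.getElementById("status").textContent = text;
    }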

The result is the very simple test webpage shown below.  I can now easily add MQTT functionality to other webpages which need to control devices in the home, in particular my home music server.

It is great that MQTT is a flexible general purpose communications protocol which will work for many different devices.

Webhooks

Early on in my HA investigations I set up webhooks so that I could trigger HA automations from a webpage.  At the time I was more interested in voice control using Google Assistant, and webhooks duplicate what is more readily achieved through voice.

However it occurred to me that if I can use parameters / arguments with webhooks, they make a realistic alternative to MQTT for communication with HA.  Webhooks talk directly to HA, rather than requiring a connection to the MQTT broker to send a payload.

HA documentation indicates that it is possible:


Webhooks are implemented using POST requests, which I can most easily send using a Linux curl command.

I had some difficulty seeing the payload in HA until I added the -H parameter to specify JSON format.  Once this was resolved I could write an automation which displays the payload, trigger.json.payload, as an HA notification when the message is received.
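The command I ended up with looks like this (the webhook id and hostname are placeholders for my own):

    curl -X POST \
         -H "Content-Type: application/json" \
         -d '{"payload": "hello from curl"}' \
         http://homeassistant.local:8123/api/webhook/my-test-hook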

Of course I want to use webhook URLs in a webpage, so I coded a form to send an input text box named payload.  In this case the item containing the information is trigger.data.payload.  Rather than sending an input text field I can add one or more parameters to the webhook URL containing directives for HA.  I added a parameter called arg to the webhook URL and could access it in an HA template as trigger.query.arg.  The example below shows both the form item payload and the URL arg being sent from a webpage.  The URL arg is displayed as an HA notification whilst the payload is sent to the Google Nest Mini to be read aloud.
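A minimal version of the form looks like this (again the webhook id and the arg value are placeholders); the text box arrives in HA as trigger.data.payload and the URL parameter as trigger.query.arg:

    <form method="post"
          action="http://homeassistant.local:8123/api/webhook/my-test-hook?arg=hello">
      <input type="text" name="payload">
      <input type="submit" value="Send to HA">
    </form>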


Implementation


This is great: I can feed a variety of fixed and variable information from a web page into an HA webhook automation.
My first implementation is an automation which carries out the same functions as the buttons on the Lily Remote Control, implementing volume control and displaying LED patterns on my programmable LED display.  It was very quick and simple to set up, only about 15 minutes from concept to testing.




Tuesday, 19 July 2022

LilyGo : Load Images from SD Card

My LilyGo Genius Remote (smarter than a smart remote) is becoming more sophisticated as I investigate and utilise extra features.  It almost seems strange that I haven't used the SD card before, as programming in the non-Arduino world is often preoccupied with files.
Lily comes with a demo sketch, SD_Test, which shows file and directory functions on an SD card, so it was an easy matter to incorporate the SD card into my sketch and list / read files.

As Lily has a small graphics screen, the factory_test demo sketch displays a logo as Lily starts up.  Data for the picture is provided by a header file in the demo sketch containing pixel values which are compiled with the sketch and loaded into a flash memory (PROGMEM) array so they can be displayed using the displayWrite method in the TFT_eSPI library.

To make my own picture I can use a utility program, ImageConverter 565.  The screen size is 240w x 135h, so my first step is to find an image and, using MS Paint, crop it down to 240 x 135 pixels.  ImageConverter will then convert it to C format: a statement defining a PROGMEM array of 32,400 16-bit data values.  I include this file as a header within my sketch and can then display the image on the screen.
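Displaying the compiled-in image is then only a few lines.  This is an outline rather than my exact sketch: it assumes ImageConverter produced a header myimage.h defining a PROGMEM array called myimage, and it uses TFT_eSPI's pushImage call to write the pixels to the screen.

    #include <TFT_eSPI.h>
    #include "myimage.h"      // PROGMEM array generated by ImageConverter 565

    TFT_eSPI tft = TFT_eSPI();

    void setup() {
      tft.init();
      tft.setRotation(1);                       // landscape, 240 x 135
      tft.setSwapBytes(true);                   // match the 565 byte order
      tft.pushImage(0, 0, 240, 135, myimage);   // write the whole image
    }

    void loop() {}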



I potentially want to display a variety of pictures on the small screen so I would prefer to load the images from my SD card at run-time rather than compile and download them in a sketch.  ImageConverter can also create images in a ".raw" file format. The screen size is 240w x 135h and each pixel is 16 bits.  The RAW file contains no header or format information, just the pixels, so the file created is exactly 240 x 135 x 2 bytes = 64,800 bytes.

I remove the SD card from Lily and copy my image files onto it.  I like the fact that each file is exactly 64,800 bytes; nothing at all is added.  To load the image I simply have to read the file into an array and display it.  The ScreenBuffer array holds 32,400 16-bit values, so I read two bytes into each array element.  I can then load an image whenever I want.
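A sketch outline of the loader, assuming the SD card is already wired up (SD.begin may need the board's chip-select pin) and a file called picture.raw is on the card:

    #include <SD.h>
    #include <TFT_eSPI.h>

    TFT_eSPI tft = TFT_eSPI();
    uint16_t ScreenBuffer[240 * 135];            // 32,400 16-bit pixels

    bool loadRaw(const char *path) {
      File f = SD.open(path);
      if (!f || f.size() != 64800) return false; // exactly 240 x 135 x 2 bytes
      f.read((uint8_t *)ScreenBuffer, 64800);    // two bytes per array element
      f.close();
      return true;
    }

    void setup() {
      tft.init();
      tft.setRotation(1);
      SD.begin();                                // CS pin as per board wiring
      if (loadRaw("/picture.raw"))
        tft.pushImage(0, 0, 240, 135, ScreenBuffer);
    }

    void loop() {}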



I can't stress enough what a wonderful hardware package LilyGo is for ESP32 development.  I think LilyGo are a maker / hobbyist company, but this device is potentially usable in shops as a hand-held terminal.  I hope that this box or a similar one continues to be available.  As an added bonus the demo sketches show you how to use all the functions, so you don't have to spend time looking for datasheets, pin numbers, libraries and so on, making it child's play to add more functions.  I love it.

Monday, 11 July 2022

LilyGo Remote

Previously I spent some time setting up keyboard functions on my wonderful new LilyGo T-Display, mainly to control my music server, similar to its predecessors.  It has a lot more potential and I have been starting to add features.

Screen Saver

As Lily works on battery when not connected to USB, there is a limited amount of time before it needs to be recharged.  It is sensible to turn off the screen when not in use.  The factory_test sketch which was provided with Lily shows how to turn the display off and put the ESP32 into deep sleep mode.


The first two commands, DISPOFF and SLPIN, blank Lily's LCD display and turn off power to the LCD.  I struggled to find documentation for these commands; in fact they are well documented in the ST7789V datasheet, which corresponds to the LCD controller.  I can turn the screen back on with the DISPON and SLPOUT commands.

So now I need to set up a proper screen saver.  It should wait until Lily has been inactive for a short while, say a minute, and then power down the screen.  When a key is pressed the screen should be powered on, allowing the user to continue.

I need a timer function to do this.  There are generic Arduino timer libraries but it is better to use the ESP32's own timer functions.  We set up a timer with an alarm so that the screen blanks after 10 seconds (during testing; 1 minute for real use).  If the user presses a key, either before or after the screen blanks, the timer is reset and the screen is restored so they can continue.
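In outline it looks like the sketch below.  This is a minimal version, assuming the older ESP32 Arduino core timer API (timerBegin / timerAlarmWrite) and a keyPressed() hook called from my keyboard scanning code; the display commands are the TFT_eSPI writecommand calls described above.

    #include <TFT_eSPI.h>

    TFT_eSPI tft = TFT_eSPI();
    hw_timer_t *screenTimer = NULL;
    volatile bool blankRequest = false;

    void IRAM_ATTR onScreenTimer() { blankRequest = true; }

    void screenOff() {
      tft.writecommand(TFT_DISPOFF);   // blank the display
      tft.writecommand(TFT_SLPIN);     // power down the LCD driver
    }

    void screenOn() {
      tft.writecommand(TFT_SLPOUT);    // wake the LCD driver
      tft.writecommand(TFT_DISPON);    // display on again
    }

    void keyPressed() {                // call this on every key press
      screenOn();
      timerWrite(screenTimer, 0);      // restart the inactivity countdown
    }

    void setup() {
      tft.init();
      screenTimer = timerBegin(0, 80, true);             // 1 MHz tick (80 MHz / 80)
      timerAttachInterrupt(screenTimer, &onScreenTimer, true);
      timerAlarmWrite(screenTimer, 10 * 1000000, true);  // 10 s while testing;
      timerAlarmEnable(screenTimer);                     // re-blanking is harmless
    }

    void loop() {
      if (blankRequest) { blankRequest = false; screenOff(); }
      // ... keyboard scan calls keyPressed() on any key ...
    }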


This works very well: keyboard input is still possible when the screen is off, so there is no delay waiting for Lily to wake up.  I have added some menus for favourite albums, radio stations and chart playlists, so it is useful to have the screen on for these.  For some of the other functions I don't usually need the screen.

Hibernation

If Lily is not in use it can be put into a deep sleep where wifi is turned off and the CPU is using little current.  Initially I investigated shutting down functions before sleeping then waking up with an external interrupt when a button was pressed.  However I decided to simply set up a timer so that the ESP32 goes straight into deep sleep after an hour of inactivity with no wakeup capability.  If Lily is in use it is likely that a key will be pressed within an hour.  It only takes 5s-10s for Lily to start up so pressing the restart button on first use isn't an issue.
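The sleep itself is essentially a one-liner once the inactivity timer fires; since no wakeup source is configured beforehand, only the reset button brings Lily back:

    #include "esp_sleep.h"

    void hibernate() {
      // no esp_sleep_enable_* wakeup source is set, so only
      // the reset button will restart the ESP32
      esp_deep_sleep_start();
    }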

With Screen Saver and Hibernation working Lily is behaving like a real computer!  The battery lifetime is now at least two or three days.

Saturday, 9 July 2022

RISC-V Assembly input

Previously I managed to write an assembly program to display a range of memory locations.  A small next step is to work out how to accept input to the program.  As I am using GLIBC these functions should be straightforward; in fact the simplest approach is to write a C program, see how the compiler converts it to assembly, then adapt that for my own program.

Character input

Step 1 is to read in a character from the terminal.  Linux terminals use line-buffered ("blocking") input by default, which means that you need to hit Enter after the character(s) before they are processed.
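A minimal sketch of the idea, linked against GLIBC (this is an outline rather than my exact listing):

    .text
    .globl main
    main:
        addi  sp, sp, -16
        sd    ra, 8(sp)
        call  getchar          # blocks until Enter; character returned in a0
        call  putchar          # echo the character back
        ld    ra, 8(sp)
        addi  sp, sp, 16
        li    a0, 0            # return 0 from main
        ret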




Command line arguments

Step 2 is to read input from command line arguments into the program.  Linux puts the arguments onto the stack when it starts the process, so I just need to decode the structure from the stack.
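For reference, by the time a C-style main runs, GLIBC's crt0 startup code has already decoded that structure: argc arrives in register a0 and the argv pointer array in a1 (at a bare _start entry point, argc instead sits at 0(sp) with the argv pointers immediately above it).  A sketch that prints the first argument:

    .text
    .globl main
    main:
        addi  sp, sp, -16
        sd    ra, 8(sp)
        ld    a0, 8(a1)        # argv[1] (a1 holds the argv pointer array)
        beqz  a0, 1f           # argv[1] is NULL when no argument is given
        call  puts             # print the first argument
    1:
        ld    ra, 8(sp)
        addi  sp, sp, 16
        li    a0, 0
        ret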


String Input

Step 3 is to read in a string from the terminal.  I can either use the stack to store the string or use data storage within the program; I checked that both work.

Non-blocking input

Finally I tried non-blocking input.  This required a bit more investigation within the C environment.  I found a beautifully clear tutorial written by Paige Ruiten as part of the snaptoken project to implement the kilo text editor.  I tinkered with the example, minimising it so that I could look at the assembly.
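My minimised version is essentially just the termios calls, close to the kilo tutorial's opening step:

    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    int main(void) {
      struct termios orig, raw;

      tcgetattr(STDIN_FILENO, &orig);            /* save current settings   */
      raw = orig;
      raw.c_lflag &= ~(ICANON | ECHO);           /* no line buffer, no echo */
      tcsetattr(STDIN_FILENO, TCSAFLUSH, &raw);

      char c;
      while (read(STDIN_FILENO, &c, 1) == 1 && c != 'q')
        printf("got '%c' (%d)\r\n", c, c);       /* keys arrive immediately */

      tcsetattr(STDIN_FILENO, TCSAFLUSH, &orig); /* restore the terminal    */
      return 0;
    }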

When someone shows you what to do it isn't too difficult but I wouldn't really like to work it all out for myself.  In fact I didn't implement this in assembly, but it is all ready for when I want to use it.

Libraries

Whilst working with these C and assembly programs I thought about how I use libraries.  I am using GLIBC in my assembly programs because I don't want to write lots of I/O routines, but generally speaking this means I don't need to write assembly at all, since my programs could be written in C.

However, one of the benefits of RISC-V is being able to write native RISC-V assembly, so I will continue to try a little.

If I write anything significant I should put it in a library.  I followed a very good opensource.com tutorial to familiarise myself with creating and using a Linux library.
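For a static library the core of it is only three commands (the names here are placeholders):

    gcc -c mylib.c                   # compile the library code to an object file
    ar rcs libmylib.a mylib.o        # archive it as a static library
    gcc main.c -L. -lmylib -o main   # link a program against it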

HA : ESPHome : RFID Reader

Previously, I installed ESPHome with a simple indicator showing whether a GPIO pin was hi or lo.
Many devices we use with Home Assistant (HA) have specific home automation interfaces.  ESPHome extends HA functionality by providing interfaces for many more sensor components which measure the environment in some way.  It also provides the capability to interact with RFID readers and their cards / tags.

I purchased three RC522 readers on eBay and they turned out to be very simple to set up.

I am using an ESP32-WROOM-32 as my ESPHome server device and it communicates with the RC522 using SPI.  ESPHome is configured through HA.  First we define the pins to be used for SPI; we can then add the chip-select pin required for the RC522.  Using the ESPHome UI we tell HA to install this configuration, and it spends a couple of minutes compiling an image and downloading it wirelessly to the ESP32.  I connected the 4 data pins plus 3V3 and GND from the ESP32 to the RC522, and ESPHome showed that it was communicating 😀😀😀
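The relevant part of the ESPHome configuration is only a few lines; the GPIO numbers below are the common ESP32 SPI pins rather than a record of my exact wiring:

    spi:
      clk_pin: GPIO18
      miso_pin: GPIO19
      mosi_pin: GPIO23

    rc522_spi:
      cs_pin: GPIO5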


The next stage is to present a keyring tag or card to the reader.  When you do this the ESPHome console log shows the tag's uid.  You can add each tag uid as a binary sensor within ESPHome and then look at them on a dashboard.
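Each tag becomes a couple of lines of configuration (the uid shown is a made-up example; use the one printed in your log):

    binary_sensor:
      - platform: rc522
        uid: 74-10-37-94        # uid reported in the ESPHome console log
        name: "Tag 1"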

Once we have binary sensors we can set up an HA automation which is triggered whenever a binary sensor's state changes.  My initial experiment instructs the Google Nest Mini speaker to inform me when Tag 1 is presented to the reader and when it is removed.
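In outline the automation looks something like this; the entity names are placeholders for my own, and the TTS service name depends on which TTS integration is set up:

    automation:
      - alias: "Announce Tag 1"
        trigger:
          - platform: state
            entity_id: binary_sensor.tag_1
        action:
          - service: tts.google_translate_say
            data:
              entity_id: media_player.nest_mini
              message: >
                Tag 1 {{ 'presented' if trigger.to_state.state == 'on'
                         else 'removed' }}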

There are plenty of ways we could use the cards.  I could present a card to the reader when I enter or leave a room and everything could be set up for me.  Baby Harry could have a variety of cards to make things happen when he wants them.
I did check whether the NFC on my phone is acceptable to the reader.  It does register and send a tag id to HA, but the id is different each time, possibly because the phone uses more sophisticated security, so it isn't much use to me.