Tuesday, June 25, 2019

Mesh Security System (Argon Hub, OLED and MP3 Shields) - Part 2

OLED Display


Figure 5. Argon mounted on Tripler with OLED.


Having demonstrated that we can blink an LED on the Argon, we now want to move on to something a bit more useful. The Argon will form the hub of the Mesh Security System and will connect to an OLED and an MP3 shield to indicate system status. In Part 2 we will get the OLED and MP3 shields working.

As shown in Figure 5, connection is simple using the FeatherWing Tripler. By mounting the shields horizontally rather than stacking them, you can still easily see all the indication LEDs. You will have to solder the headers on the tripler and shields. Do the tripler first. I solder one pin and then check that the header is correctly positioned before soldering the rest; it is a lot easier to rectify an issue with only one pin soldered in place. Once you have soldered the headers onto the tripler, you can use it as a jig to hold the pins in place while soldering them to the shields. This ensures that the shield pins line up with the headers on the tripler.

Figure 6. OLED Operational


The display board is a 128x32 monochrome OLED with three user buttons plus reset. The screen is made of 128x32 individual white OLED pixels, and because the display makes its own light, no backlight is required. This reduces the power required to run the OLED and is why the display has such high contrast. The board uses an SSD1306 driver and connects via I2C (pins D0 and D1), so it is very pin-frugal. As I2C is a shared bus, you can have other shields which use I2C connected at the same time (as long as they have different I2C addresses). The three buttons use:
Button   Pin   Notes
A        D4    No pull-up. Can't be used with Ethernet.
B        D3    100K pull-up. Can't be used with Ethernet.
C        D2    No pull-up.
So all up this shield uses 5 pins (D0 - D4).

The library is available in the Web IDE as oled-wing-adafruit and using the display from the Argon is easy. The library takes care of setting the appropriate input modes and debouncing the buttons for you.

I've reproduced my test code stub below. I always like to get each element of a project working before adding the next. This makes debugging much easier.
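
A minimal version of that stub might look like the following. It is only a sketch of the idea: it assumes the oled-wing-adafruit library's OledWingAdafruit class with its setup()/loop() housekeeping calls and pressedA()/pressedB()/pressedC() button helpers (plus the usual Adafruit_SSD1306 drawing methods); check the library's own example if the names differ.

#include "oled-wing-adafruit.h"

SYSTEM_THREAD(ENABLED);

OledWingAdafruit display;     //  Wraps the SSD1306 driver and the three buttons

void setup() {
  display.setup();            //  Initialise the SSD1306 and button pins
  display.clearDisplay();
  display.display();
}

void loop() {
  display.loop();             //  Debounces the A, B and C buttons

  if (display.pressedA()) {
    display.clearDisplay();
    display.setTextSize(1);
    display.setTextColor(WHITE);
    display.setCursor(0, 0);
    display.println("Button A pressed");
    display.display();
  }
}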



MP3 Shield


The MP3 Shield is shown in Figure 5 above. This is before the through-hole headers have been soldered onto the shield. The shield version that we are using is the Adafruit Music Maker FeatherWing. This shield uses the VS1053, an encoding/decoding (codec) chip that can decode a wide variety of audio formats such as MP3, AAC, Ogg Vorbis, WMA, MIDI, FLAC and WAV (PCM and ADPCM). The chip also allows you to adjust bass, treble, and volume digitally.

Figure 7. Argon Block Diagram (showing I/O).


Communication is via an SPI interface, which allows audio to be played from an SD card. There's also a special MIDI mode that you can boot the chip into that will read 'classic' 31,250 baud MIDI data from the UART TX pin. The hardware SPI pins are needed whenever you are transmitting data from the SD card to the decoder chip. If you are using the wing in the special MIDI mode, the SPI pins are not used.

D11: SPI MISO - connected to MISO - used by both the SD card and VS1053
D12: SPI MOSI - connected to MOSI - used by both the SD card and VS1053
D13: SPI SCK - connected to SCK - used by both the SD card and VS1053

The Adafruit VS1053 Library does include a constructor to define the SPI pins you want to use, but this doesn't help us because:

  1. The hardware SPI pins are already connected by the tripler; and
  2. The alternative SPI pins on the Argon are D2, D3 and D4 - which seem to be very popular with shield designers!


Figure 8. Adafruit Music Maker FeatherWing Shield.


Next are the control pins required to play music. From left to right, in Figure 9 below, they are:

MP3_DCS - this is the VS1053 data select pin
DREQ    - this is the VS1053 data request interrupt pin
MP3_CS  - this is the VS1053 chip select pin
SD_CS   - this is the SD Card chip select pin

Figure 9. MP3 Control Pins.


Unfortunately the MP3 control pins connected (via the tripler) to the Argon conflict with the A, B and C buttons connected to D2, D3 and D4 from the OLED shield. Thankfully there is no conflict on pins D0 or D1, so we can still control the OLED with the MP3 shield in place. Obviously the designers of the two shields at Adafruit didn't talk to each other!

Figure 10. MP3 Shield Installed.


To summarise, the Argon pins used to control the MP3 shield are:

SD_CS    = D2;    // SD Card chip select pin
MP3_CS   = D3;    // VS1053 chip select pin (output)
DREQ     = D4;    // VS1053 Data request, ideally an Interrupt pin
MP3_DCS  = D5;    // VS1053 Data/command select pin (output)
SPI MISO = D11;   // used by both the SD card and VS1053
SPI MOSI = D12;   // used by both the SD card and VS1053
SPI SCK  = D13;   // used by both the SD card and VS1053

Figure 10 shows the MP3 shield in place on the tripler adjacent to the OLED shield. To give myself a bit more room, I removed the OLED shield while soldering the header pins to the MP3 shield. I again inserted the header pins into the tripler before soldering to make sure that everything lined up.

There are two versions of the Adafruit Music Maker, one includes an amplifier and the other just has a 3.5mm connection for headphones or powered speakers. In retrospect I should have got the one with the amplifier built in. Nevertheless I happen to have a Duinotech 2 x 3W amplifier, so I might as well use that. This is the red PCB shown in Figure 10. Before dealing with this, you will want to make sure that the MP3 shield is working.

Thankfully ScruffR has done the hard work of porting the Adafruit VS1053 Arduino library to work with Particle mesh boards. You will need to import this library and the SdFat library in order to get the shield working. This is easy: just search for the libraries in the Web IDE and then add them. Plug in some headphones (assuming you have the same version shield as I do) and you can use the code below to test the operation of your shield. You will obviously need to copy some MP3 files to the SD card before you can play them. Make sure that the file names are in the 8.3 format or they won't play.
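
Something along the following lines is enough to confirm the shield responds, before worrying about files on the SD card. It is only a sketch: it assumes ScruffR's port keeps the Adafruit_VS1053_FilePlayer API (begin(), setVolume(), sineTest(), playFullFile()), and the library's own example shows the exact SD card initialisation and MP3 playback calls.

#include "Adafruit_VS1053.h"

const int SD_CS   = D2;   //  SD card chip select
const int MP3_CS  = D3;   //  VS1053 chip select
const int DREQ    = D4;   //  VS1053 data request (interrupt capable)
const int MP3_DCS = D5;   //  VS1053 data/command select

//  -1 for reset: the FeatherWing ties the VS1053 reset to the board reset line
Adafruit_VS1053_FilePlayer musicPlayer(-1, MP3_CS, MP3_DCS, DREQ, SD_CS);

void setup() {
  Serial.begin(115200);

  if (!musicPlayer.begin()) {
    Serial.println("VS1053 not found - check the header soldering.");
    while (true);
  }

  musicPlayer.setVolume(20, 20);      //  Lower numbers are louder
  musicPlayer.sineTest(0x44, 500);    //  A short beep in the headphones = codec is alive
}

void loop() {
}

Once the sine test beeps in the headphones, follow the library example to mount the SD card and call playFullFile() on one of your 8.3-named MP3 files.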



Duinotech 2 x 3W Amplifier


Rather than use the 3.5mm jack on the MP3 shield, we will connect directly to the Ground, Right and Left pins next to the headphone jack (Figure 11). They are line level, AC coupled outputs which are suitable for connection to an amplifier.

Figure 11. MP3 Shield Audio Out Pins.


The Duinotech 2 x 3W Class D Amplifier (Figure 12) has greater than 90% efficiency and typically delivers 3W into 4 ohm speakers (or 1.5W into 8 ohms). Its operating voltage range is 2.5 to 5.5 VDC.

The amplifier board uses the PAM8403 chip, and power output will be determined by a combination of the input voltage supplied and the output impedance. As we are using the regulated 3.3V from the Argon and 8 ohm speakers, our expected power output from the amplifier is around 0.5W.
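
As a rough sanity check (assuming the bridge-tied output can swing close to the full supply rail):

P ≈ Vsupply² / (2 x Rload)
  = 3.3² / (2 x 8)
  ≈ 0.7 W

so around 0.5 W of real-world output, once switching and distortion losses are allowed for, is a reasonable expectation.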

Figure 12. Duinotech 2 x 3W Amplifier.

The amplifier pin out description is provided in the table below.

Amplifier Pinout

Module    Function
R+/R-     Right Speaker
L-/L+     Left Speaker
GND       Ground Connection
+5V       Power Supply
5W        Shutdown Control
GND       Ground Connection
LIN       Left Audio In
GND       Ground for Audio
RIN       Right Audio In

Connection between the MP3 shield and amplifier is straightforward.
  1. MP3 Shield L and G connect to LIN and Audio GND on the amplifier.
  2. MP3 Shield R and G connect to RIN and Audio GND on the amplifier.
  3. R+/R- on the amplifier connect to the right speaker.
  4. L+/L- on the amplifier connect to the left speaker.
  5. +5V and GND on the amplifier connect to the 3.3V and GND pins on the Argon.
In Part 3 we will complete construction of the Argon Hub and 3D print an enclosure for it. We will then move on to configuring the Xenons.

Tuesday, June 18, 2019

Mesh Security System using the Particle Argon and Xenon - Part 1

Introduction


Figure 1. Argon Board plus some other bits and pieces.

I wanted to learn about the (relatively) new mesh capable boards from Particle, and decided a good project for this would be a mesh security system for our two garages and carport. These are some distance from the house and so should provide a good test of the mesh network range.

The system design will look something like Figure 2. The three Xenons will communicate via the RF mesh with each other and with the Argon Hub. The Argon will monitor the state of the Xenons and indicate the system status using an OLED and MP3 shield. The Argon will also connect to our LAN using WiFi and provide more detailed security status via a web page. If you were building a real security system, it probably wouldn't be a good idea to publish the details on the internet.

Figure 2. Mesh Security Block Diagram.

For this first article we will focus on getting the Argon up and configured. Subsequent articles will focus on the Xenons.

Particle Xenon and Argon Boards


The Xenon is a low cost mesh-enabled development kit that can act as either an endpoint or repeater within a Particle Mesh network.

The boards are based on the Nordic nRF52840 SoC (System on a Chip), and communicate using the IEEE 802.15.4-2006 standard to create a PAN (Personal Area Network). Bluetooth and active Near Field Communication (NFC) are also available. They have built-in battery charging circuitry, which makes it possible to connect and recharge an appropriately sized LiPo battery. The Xenon has 20 mixed-signal GPIOs (6 x Analog, 8 x PWM) to interface with sensors, actuators, and other electronics. Programming it is very similar to programming an Arduino. The board is compatible with the Adafruit FeatherWing layout, and shields can be connected to the base board using a FeatherWing doubler or tripler.

The Particle Argon is similar but includes Wi-Fi. It can be used as a standalone Wi-Fi device or as a Wi-Fi enabled gateway, repeater, or endpoint for Particle Mesh networks. We will be using it in the second configuration for our network.

The Argon has both the Nordic nRF52840 and the Espressif ESP32 processors on board. As with the Xenon it has battery charging circuitry and 20 mixed signal (6 x Analog, 8 x PWM) GPIOs. Other interfaces include UART, I2C, and SPI.

Programming the boards can be done via an extension to Visual Studio Code or using their online IDE. We will try out both methods.

Connecting the Antenna


The Argon uses two different MCUs for WiFi and BLE/Mesh: WiFi is handled by the ESP32 and BLE/Mesh by the nRF52840. Each communication method uses the following frequencies:

  1. WiFi - 2.412 GHz to 2.484 GHz (14 channels)
  2. Bluetooth - 2.400 to 2.485 GHz
  3. Mesh - 2.4 GHz (uses 6LoWPAN over 802.15.4)
  4. NFC - 13.56 MHz

So there is a lot going on around 2.4 GHz if you are using WiFi, BLE and mesh at the same time. This is probably why an external antenna is provided. I assume there is also some smart deconfliction occurring at the hardware or Device OS level.

When talking about the Particle Mesh you may see Thread referenced. Thread is an open mesh networking protocol released by the Thread Group. Particle Mesh uses OpenThread, an open source implementation of Thread released by Nest.

6LoWPAN is an unfortunate acronym that combines the latest version of the Internet Protocol (IPv6) and Low-power Wireless Personal Area Networks (LoWPAN). 6LoWPAN, therefore, allows for the smallest devices with limited processing ability to transmit information wirelessly using an internet protocol. It is a competitor to ZigBee.

Figure 3. Particle 2.4 GHz Antenna

The Argon has three antenna connectors (u.FL): two on top, “BT” (for mesh, via the nRF52840) and WiFi (for the ESP32), and one on the underside (under the micro-USB connector) for NFC. The Xenons have two antenna connectors: one for “BT” (mesh) and one on the underside for NFC.

The antenna provided with the Argon is tuned for 2.4 GHz, so use it for mesh or WiFi. If you are using NFC, you will need to purchase an antenna tuned for 13.56 MHz. I am going to start out with the external antenna on WiFi; this is required if you wish to use the WiFi connectivity.

There are two options for the mesh antenna on the Argon: an on-board PCB antenna, which is selected by default in the Device OS, and a u.FL connector if you wish to connect an external antenna. If you want to use the external antenna, you'll need to buy one and issue an appropriate command in the firmware.

Connecting the antenna plug to the u.FL socket on the Argon is most easily done using a pair of long-nosed pliers.

First Time Setup


Particle have put together a good video showing how to set up your Argon, so there is no need to reproduce all the steps here. TL;DR - download the iOS or Android app and follow the instructions.

As part of registering your device you will probably have to update the Device OS (which abstracts away some of the complexity of programming the Argon). It is all very straightforward and worked well for me. I like the use of the RGB LED to indicate the various states of the device.

Once you've completed the setup you will be able to program your device and send over-the-air (OTA) updates to it.

Flashing the standard blink "hello world" example is a trivial exercise using the Web IDE. I was impressed by how simple this all was. The devs at Microsoft Azure IoT could learn something from this! Next up we will try something a bit more challenging - connecting the OLED and MP3 shields using the FeatherWing Tripler.

Figure 4. FeatherWing Tripler

Saturday, June 8, 2019

Programming the Tello Drone using Swift (Part 1)

The Tello Drone




In this article we will explore how to write a simple iOS app in Swift to allow control of the Tello.

Tello is a mini drone equipped with an HD camera that is manufactured by Ryze Robotics and includes a flight controller with DJI smarts. It is a great drone to learn to fly on: you can use it indoors, and because it is so light (80 grams), crashing is fairly painless if you have the prop guards on. I have crashed mine (a lot) and the worst that has happened is that a propeller came off, which is easy to replace. It is also relatively inexpensive. You can manually control it using either an app (iOS or Android) on your phone, or a combination of the app and a dedicated Bluetooth remote. Either works fine. If you do get the Bluetooth remote, be careful not to move out of Bluetooth range of your phone while you are flying the drone.

Tello Specifications


Tello is powered by a DJI flight control system and an Intel processor (Movidius MA2x chipset). The MA2x is based on a SPARC LEON processor, which has two RISC CPUs to run the RTOS, firmware, and runtime scheduler (Ref: RyzeTelloFirmware). The other specifications are:

  • Weight: Approximately 80 g (Propellers and Battery Included)
  • Dimensions: 98×92.5×41 mm
  • Propeller: 3 inches
  • Built-in Functions: Range Finder, Barometer, LED, Vision System, 2.4 GHz 802.11n Wi-Fi, 720p Live View
  • Port: Micro USB Charging Port
  • Max Flight Distance: 100m
  • Max Speed: 8m/s
  • Max Flight Time: 13min
  • Max Flight Height: 30m

Programming - Firmware Versions


Apart from being a good platform to earn your flying chops, the best thing about the Tello from my perspective is that you can write a script or a program to control the drone remotely. This opens up a lot of possibilities.

Note that there are three different Tellos that you can buy (the Tello, the newer Tello EDU and the Ironman Edition), and they use slightly different APIs. So make sure that you use the appropriate SDK version for your drone.

You can work out which firmware you have by connecting your mobile to the Tello WiFi, opening the Tello app, tapping on settings (the gear icon), then tapping on the More button, and finally tapping on the "..." button to the left of the screen. This should bring up the screen shown below, which includes the firmware and app version numbers. My Tello is running firmware version 1.03.33.01. You can download the relevant SDK document for this version.



The Tello EDU uses version 2.0 of the SDK. You can download a PDF of the V2 SDK from here.

Commands that are available in SDK v1.3 but not v2.0 are:

  • height?
  • temp?
  • attitude?
  • baro?
  • acceleration?
  • tof?

Conversely, commands that are available in SDK v2.0 but not v1.3 are:

  • stop (hover)
  • go x y z speed mid (same as go x y z speed but uses the mission pad)
  • curve x1 y1 z1 x2 y2 z2 speed mid (same as curve x1 y1 z1 x2 y2 z2 speed but uses the mission pad)
  • jump x y z speed yaw mid1 mid2 (Fly to coordinates x, y and z of mission pad 1 and recognize coordinates 0, 0 and z of mission pad 2 and rotate to the yaw value)
  • mon
  • moff
  • mdirection
  • ap ssid pass
  • sdk?
  • sn?

The Tello EDU also has a swarm mode if you want to control a bunch of drones.

Programming - Python


There are plenty of examples on how to use Python to control your Tello. For drones running v1.3 have a look at the DroneBlocks code. For the Tello EDU (i.e. v2.0 SDK), Ryze Robotics provide some sample code for you to download and try out.

I uploaded the DroneBlocks code using my Raspberry Pi connected to the Tello WiFi and it worked a treat. Given that there are lots of Python examples, I thought I would put together something in Swift and work up to an app which provides additional functionality not found in the official Tello app.

Programming - Swift (iOS)


We access the Tello API by connecting to the airframe via a WiFi UDP port. Once a connection is in place, the drone is controlled using simple text commands.



The first thing we want to determine is whether our device is connected to the Tello WiFi. There are a couple of Swift functions which can assist with establishing this. The Tello SSID contains the string "TELLO" (see image above), so this is what we will use to determine whether we are connected to the correct WiFi network.


We can use the code above in our ViewController to ensure that we are hooked up to the Tello, and if not provide an alert. The screenshot below shows this implemented in my proof of concept app.


The code for the ViewController is shown next. It should be fairly self explanatory.



UDP


UDP (User Datagram Protocol) is a communications protocol, similar to Transmission Control Protocol (TCP), but used primarily for establishing low-latency, low-bandwidth and loss-tolerating connections. UDP sends messages, called datagrams, and is considered a best-effort mode of communications. With UDP there is no checking and resending of lost messages (unlike TCP).

Both UDP and TCP run on top of the Internet Protocol (IP) and are sometimes referred to as UDP/IP or TCP/IP.

UDP provides two services not provided by the IP layer. It provides port numbers to help distinguish different user requests and, optionally, a checksum capability to verify that the data arrived intact.

The Tello IP address is 192.168.10.1. The UDP Services available are:

UDP PORT: 8889 - Send command and receive a response.
UDP SERVER: 0.0.0.0 UDP PORT: 8890 - Receive Tello state.
UDP SERVER: 0.0.0.0 UDP PORT: 11111 - Receive Tello video stream.

If you want to send and receive via UDP on iOS then the two main libraries in use appear to be SwiftSocket and GCDAsyncUDPSocket.

SwiftSocket looks to be the simpler of the two libraries, so I used that for my initial attempt. I put together a Tello Swift class to do the heavy lifting. It is reproduced below and works as advertised. You will need to put together your own UI, but if you hook up the relevant buttons in the View Controller then you shouldn't have any problem reproducing what I have done.

I will add a bit more functionality to the app (e.g. video) and then stick it up on the app store for download.



Friday, December 21, 2018

Arduino Sonar Display using Processing - Radar & Waterfall

Introduction





The PING, or its cheaper clone the HC-SR04, is often used in robotics as a means of obstacle detection. For some time now I have been meaning to put together a means of visualising what the sensor is detecting. This is useful in diagnosing the performance of your robot as it moves around its environment.


The Hardware


To read the HC-SR04 data and control the pan servo, I used an Arduino Uno variant (the DFRobot Romeo BLE) that I already had. It is programmable via Bluetooth, but this isn't necessary; any vanilla Uno will do. My setup also includes a tilt servo, but this isn't used currently.



I 3D printed a mount for the Arduino which also provides a base for the servos and ultrasonic sensor. The Arduino sketch is very straightforward. It pans the servo from 10 to 170 degrees, with 90 degrees being straight ahead, and sends the current angle and the distance to any obstacle (called the range) out on the serial port every degree travelled. This code is reproduced below. The Servo and NewPing libraries do most of the heavy lifting.


/**********************
 @file    Sonar_Visualisation.ino
 @brief   Create visual representation of a sonar sweep using Processing.
 @author  David Such

Code:        David Such
Version:     1.0 
Last edited: 04/11/18
**********************/

#include <Servo.h>
#include <NewPing.h>

//  DEFINITIONS

#define MAX_DISTANCE 30
#define MAX_ANGLE 80
#define ANGLE_STEP 1

//  PIN CONNECTIONS

const byte TRIG_PIN = 2;
const byte ECHO_PIN = 3;
const byte H_SERVO = 9, V_SERVO = 10;
const byte LED_PIN = 13;

//  GLOBALS

int angle = 0;
int dir = 1;

//  CREATE CLASS INSTANCES

Servo hServo, vServo;
NewPing sonar(TRIG_PIN, ECHO_PIN, MAX_DISTANCE);

//  METHODS

void centre(Servo servo, int offset) {
  digitalWrite(LED_PIN, !digitalRead(LED_PIN));
  servo.write(90 + offset);
  delay(15);
  digitalWrite(LED_PIN, !digitalRead(LED_PIN));
}

void sweep(Servo servo, int min, int max) {
  int pos = 0;

  min = constrain(min, 0, 180);
  max = constrain(max, min, 180);

  digitalWrite(LED_PIN, !digitalRead(LED_PIN));
  for (pos = min; pos <= max; pos += 1) {
    servo.write(pos);
    delay(15);
  }

  digitalWrite(LED_PIN, !digitalRead(LED_PIN));
  for (pos = max; pos >= min; pos -= 1) {
    servo.write(pos);
    delay(15);
  }
}

void sendSerialPacket(int angle, int distance) {
  Serial.print(angle);
  Serial.print(",");
  Serial.println(distance);
}

//  MAIN

void setup() {
  Serial.begin(115200);

  pinMode(H_SERVO, OUTPUT);
  pinMode(V_SERVO, OUTPUT);
  pinMode(LED_PIN, OUTPUT);

  digitalWrite(LED_PIN, HIGH);

  hServo.attach(H_SERVO);
  vServo.attach(V_SERVO);

  centre(hServo, 0);
  sweep(vServo, 45, 90);
  centre(vServo, 5);
}

void loop() {
  delay(40);
  unsigned int ping_distance_cm = sonar.ping_cm();

  ping_distance_cm = constrain(ping_distance_cm, 0, MAX_DISTANCE);
  sendSerialPacket(angle, ping_distance_cm);
  hServo.write(angle + 90);   //  90 degrees is straight ahead, giving the 10 - 170 degree sweep

  if (angle >= MAX_ANGLE || angle <= -MAX_ANGLE) {
    dir = -dir;
  }

  angle += (dir * ANGLE_STEP);
}

Processing 3


Processing is a language and IDE designed for visual display. The language is very similar to that used for programming the Arduino (although it is built on Java rather than C/C++). It is perfect for displaying data from the Arduino, and this is what we used for our sonar display.



Processing is available for free and there are versions for Windows, macOS and Linux. It also comes as standard on Raspbian, so we used a Raspberry Pi to run our Processing sketch and display the output. The same sketches will work on whatever OS you are using; you will just need to change the name of the USB (serial) port.

One thing you normally need to consider when connecting serial data is what voltage levels are being used. For example the Raspberry Pi uses 3.3V logic on its I/O and the UNO uses 5V. Connecting these directly could damage the Raspberry Pi. By using the USB ports, voltage conversion is handled by the boards and we don't have to worry about it.

So to get the serial data from the UNO to the Raspberry Pi we just connect the appropriate USB cable between the two boards.

The Raspberry Pi also comes with the Arduino IDE, so you can even program the UNO from it using the same USB cable if you want. Upload the Arduino code first; you can then use its serial data to debug your Processing sketches.

Sonar Displays


I wrote three Processing sketches to display the data in different ways. The first is based on the design by Tony Zhang at hackster.io; I liked his pseudo-radar display and wanted to emulate it. Note that I have significantly modified his sketch, as it seemed unnecessarily complicated and included a bunch of unused code. You can download the Sonar Display Processing Sketch. Note that all three of the sketches use the integer point class, which you can also download from the Reefwing Gist.



The second display is my attempt at a waterfall display, similar to that used on submarines to display sonar data. It turned out more like a depth sounder display, but I like the use of Perlin noise to represent the outer limit of the sonar range. Download the Depth Display Processing Sketch.



The third display is a combination of the first two displays, which I called the Range Display Processing Sketch.



Wednesday, August 8, 2018

Espressif ESP32 Tutorial - IR Remote Control using Microsoft Azure




The Project


This tutorial will outline how to create an IR Remote using the ESP32 and then control it from the IoT hub on Microsoft Azure.

Driving an IR remote transmitter using an Arduino is simple, as there is a library, IRremote, which does all the hard work. You just need to connect your IR transmitter module's signal pin to the appropriate Arduino pin via a current-limiting resistor and you are done. Connecting an Arduino to the cloud takes a bit more work (depending on the model you are using), which is why we wanted to use the ESP32.

Unfortunately, the standard IRremote Arduino library only supports receiving IR signals on the ESP32, not transmitting them. Fortunately, Andreas Spiess has forked the standard library and added ESP32 transmission capability. You will need to download this ESP32-IRremote library to use it with the ESP32. Andreas did this using the ESP32's LEDC PWM peripheral, and you can now select any pin to use via IRsend(pin). Note this is only for the ESP32; the other board types have defined pins you have to use due to the assigned timers.
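
A minimal transmit-only sketch looks something like the following. Treat it as a sketch of the idea: it assumes the fork keeps the standard IRremote send API, with the transmit pin passed to the IRsend constructor, and the GPIO chosen here is just an example.

#include <IRremote.h>

const int IR_SEND_PIN = 4;      //  Any free GPIO, wired to the IR module via the resistor

IRsend irsend(IR_SEND_PIN);     //  The ESP32 fork lets you pick the pin here

void setup() {
}

void loop() {
  //  Send the Sony power code three times, as most remotes do
  for (int i = 0; i < 3; i++) {
    irsend.sendSony(0xA90, 12); //  12-bit Sony code
    delay(40);
  }
  delay(10000);
}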

The Duinotech Infrared Transmitter Module


The IR transmitting module which I used is the one from Jaycar (branded Duinotech). There is a data sheet available on the Jaycar site but it is fairly sparse and doesn't clearly define the pins on the module.

It appears that this module is based on the KY-005 infrared transmitter module, the specifications for which are:


Operating Voltage: 5V
Forward Current: 20 ~ 60 mA
Power Consumption: 90 mW
Operating Temperature: -25°C to 80°C [-13°F to 176°F]
Dimensions: 18.5 mm x 15 mm [0.728 in x 0.591 in]

This being the case, the pin out is as follows:



The signal pin is clearly labeled with an S, the middle pin is GND via a resistor (* which you have to fit yourself to the module) and GND is connected to the third pin (with the "-" adjacent to it). This module is just an infrared diode (which emits at a wavelength of 940 nm).

Thus, we can drive it like any other diode via a current-limiting resistor. The value of the resistor depends on the voltage your microcontroller's digital outputs (DO) are switching, the desired diode forward current, and the forward voltage drop characteristic of the diode. So for our design:

VDO = 3.3V
If = 20 mA
Vf = 1.2V (nominally 1.1V but I measured this using my LCR meter)

Then, R = (VDO - Vf) / If
              = (3.3 - 1.2) / 0.02
              = 105 Ω



We will use a 100 Ω resistor in our circuit.



The CIR (Commercial Infrared) Transmission Protocol


As there are usually other sources of infrared radiation (e.g. sunlight and incandescent or LED lights), the 940nm IR transmitter is modulated by a carrier frequency in the 32-40 kHz range. CIR receivers incorporate a bandpass filter tuned to this carrier frequency. This allows the receiver to discriminate between the modulated IR signal and any ambient, unmodulated IR. In effect, the IR receiver is double tuned both to the wavelength of the IR radiation and to the carrier frequency.



Three factors influence CIR range. In order of decreasing importance they are: the power level of the IR emitter, the IR wavelength, and the carrier frequency. An IR emitter's output is proportional to the current through the emitter. Increasing the current will increase the power. Because the duty cycle is usually 50% or less, the emitter can be driven with quite high currents. For optimal range, the IR wavelength of the emitter and receiver should match.



A similar protocol to CIR is IrDA. IrDA was popular in the late 1990s but has largely been replaced by Bluetooth and WiFi. IrDA was designed to be very short range (< 1 m). It does not use any secondary carrier but directly modulates the 850 nm IR with the data. Because of this, it is susceptible to interference from ambient IR. In addition, an IrDA transmitter is usually lower power than a CIR transmitter.

An IrDA transmitter with a CIR receiver is a mismatch, as is a CIR transmitter with an IrDA receiver. They operate on different wavelengths (940 nm vs 850 nm), IrDA isn't modulated onto a carrier, and the beam angles are different (IrDA limits the beam angle to ±15° while most CIR emitters are ±40° or greater). Such mismatches have major effects on range and reliability.

We will use CIR in our design.

For RF control, both the transmitter and receiver need to be tuned to the same carrier frequency and need to use the same type of modulation. Most RF remotes use ASK (Amplitude Shift Keying) or OOK (On-Off Keying). OOK is really just a special case of ASK. OOK is also called CPCA (Carrier Present, Carrier Absent). You can have a look at the IRremote library to see how this coding is achieved.

Microsoft Azure




Azure is Microsoft's catch all name for their cloud services. It covers over 100 different services. The service of interest to us is IoT Hub. You can use Azure IoT Hub to securely connect, monitor and manage billions of devices to develop Internet of Things (IoT) applications. To get started we will connect just one device!

You will need to sign up for a free Azure account. Follow the link above and do this. For some reason Microsoft make you provide credit card details, even for the free account. Note that when you sign up, the email address you provide becomes the name of the default active directory (which wouldn't be my first preference).

It will be interesting comparing the Microsoft IoT Hub functionality with Node-RED, which is another dashboard option that we have had experience with. At this stage I suspect that Node-RED is much cheaper (free) and simpler, but Azure is more robust, secure and scalable. The key features of IoT Hub are:
  1. Bidirectional communication with LOTS of devices. Use device-to-cloud telemetry data to understand the state of your devices and define message routes to other Azure services without writing any code. In cloud-to-device messages, reliably send commands and notifications to your connected devices – and track message delivery with acknowledgement receipts. Device messages are sent in a durable way to accommodate intermittently connected devices.
  2. Authentication per device. Set up individual identities and credentials for each of your connected devices, and help retain the confidentiality of both cloud-to-device and device-to-cloud messages. To maintain the integrity of your system, selectively revoke access rights for specific devices as needed.
  3. Automated device registration. Speed up your IoT deployment by registering and provisioning devices with zero touch in a secure and scalable way. IoT Hub Device Provisioning Service supports any type of IoT device compatible with IoT Hub.
  4. Use IoT Edge. Take advantage of IoT Edge to make hybrid cloud and edge solutions. IoT Edge provides orchestration between code and services so they flow securely between cloud and edge to distribute intelligence across a range of devices. Enable artificial intelligence and other advanced analytics at the edge.
Once you have signed up for Azure, you will be presented with a dashboard similar to that shown above.

Create an IoT Hub




Microsoft call their menus "blades" in Azure. No idea why, maybe because it sounds cooler than menu? Anyway, click on the + Create a resource link on the blade to the left of the dashboard. This will open the Azure Marketplace.

In the Marketplace, click on Internet of Things. This will provide a new list of menu options to the right.


We want IoT Hub at the top. Click on this to setup your hub. For subscription select Free Trial and for Resource Group, Create new.

The free tier is intended for testing and evaluation. It allows 500 devices to be connected to the IoT hub and up to 8,000 messages per day. Each Azure subscription can create one IoT Hub in the free tier.

A resource group is a container that holds related resources for an Azure solution. The resource group can include all the resources for the solution, or only those resources that you want to manage as a group.

Select the Region closest to your location. In Australia the options are East and Southeast, which I think refer to Sydney and Melbourne respectively.

To create an IoT hub, you must give it a name, and this name must be unique across all IoT hubs. The IoT hub will be publicly discoverable as a DNS endpoint, so make sure to avoid any sensitive information when naming it. Once created, the name can't be changed.

Click the button at the bottom labelled - Next: Size and Scale >>



For pricing and scale tier, select F1: Free tier. That is all that you can adjust on this screen. Click Review + create.

When all previous steps are complete, you can create the IoT hub. Click Create to start the back-end process to create and deploy the IoT hub with the options you chose.

It can take a few minutes to create the IoT hub as it takes time for the back-end deployment to run on the appropriate location servers. Once your new IoT resource has been created, you can customise your dashboard.


Add an IoT Device



Before a device or module can connect to your IoT hub, there must be an entry for that device in the IoT hub's identity registry. A device must also authenticate with the IoT hub based on credentials stored in the identity registry. The device or module ID stored in the identity registry is case-sensitive.

To add a new IoT device, click on + Add, and the Add Device blade will be displayed.

  • Device ID: A case-sensitive string (up to 128 characters long) of ASCII 7-bit alphanumeric characters.
  • Authentication Type: Symmetric Key or X.509 Certificate. I used Symmetric Key. The differences are:
    • Symmetric Key: a unique identity key (security tokens) for each device, which can be used by the device to communicate with the IoT Hub.
    • X.509 Certificate: uses an on-device X.509 certificate and private key as a means to authenticate the device to the IoT Hub. This authentication method ensures that the private key on the device is not known outside the device at any time, providing a higher level of security.
  • Auto Generate Keys: tick.
  • Connect device to IoT hub: enable.

Click on Save, and your new device will be added to the hub. Click on the device ID of the newly added device to see the security keys and connection strings. You will need the device ID and a copy of the primary connection string for insertion into your ESP32 sketch. Now onto the ESP32.

ESP32 Software


You can download a copy of my ESP32 sketch from the Reefwing Gist.  You will need to fill in your SSID, password and primary connection string where indicated.



I spent quite a bit of time trying different tool chains to get everything configured and talking. My initial preference was to use Eclipse with the Arduino tool chain. This would give me a proper IDE and a remote control library that I knew was compatible with all the Arduinos out there.

Unfortunately, importing custom libraries is a bit problematic for the two Eclipse Arduino plug-ins available. It is theoretically possible, but I ran out of patience trying to get it to work. The ESP-IDF has its own remote control library, but it is not as well documented as the Arduino library, and I'm also not familiar with coding the ESP32 natively.



While looking for a way to receive the messages sent from the ESP32 to the cloud, I discovered a plug-in for Visual Studio Code. It so happens that there is also a plug-in for Arduino. Since I was already using Visual Studio Code to monitor my IoT hub traffic, I decided to give it a crack at programming the ESP32. It just worked - I was astonished! It does use the Arduino IDE tool chain, so that may be why it was so seamless, as I had already got everything working with that first. If IntelliSense complains about a missing library, right click on the light bulb and edit the c_cpp_properties.json file, which contains the include path.




The other thing you will probably have to do is add:

"output": "../build",

to the .vscode/arduino.json file, which can be found under the workspace for your sketch (the same location as the c_cpp_properties.json file). If output is not set, Arduino will create a new temporary output folder each time it compiles your sketch, which means it cannot reuse the intermediate results of the previous build, leading to long verify/upload times, so it is recommended to set the field. Arduino requires that the output path is not the workspace itself or a subfolder of the workspace, otherwise it may not work correctly. By default, this option is not set (again, no idea why) and you will get a warning when you verify if it isn't.



Whether you use the Arduino IDE or Visual Studio Code, you need to download the Azure IoT library: ESP32_AzureIoT - An Azure IoT Hub library for ESP32 devices in Arduino.  Unzip and copy this to your Arduino libraries folder.

As a first test, load the GetStarted.ino sketch from the examples folder in the library you just downloaded. This sketch will connect to the IoT hub and continuously send messages containing fake data. You will need to fill in the following blanks in the sketch:

  • DEVICE_ID - copy from your IoT registered device;
  • connectionString - copy from the primary connection string;
  • ssid - the displayed name for your WiFi network; and
  • password - for your WiFi network.
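
For reference, the placeholders sit near the top of GetStarted.ino and look something like this (the exact variable names may differ between library versions, and the values shown are obviously just placeholders):

// The four placeholders near the top of the sketch
#define DEVICE_ID "YourDeviceId"          //  The device ID you registered in the IoT hub

const char* ssid     = "YourWiFiNetwork";
const char* password = "YourWiFiPassword";

// The primary connection string copied from your registered device, in the form:
// "HostName=<your-hub>.azure-devices.net;DeviceId=<your-device-id>;SharedAccessKey=<key>"
static const char* connectionString = "";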

Connect to your ESP32, check the port and board type, then compile and upload the sketch. Open up the serial monitor at 115,200 baud so that you can see what is happening. The monitor should be displaying something like the following.


To confirm that the IoT hub is receiving these messages have a look at the Azure dashboard and you should see these messages arriving.


If you have Visual Studio Code, you can use the Azure IoT extension to monitor messages to your IoT hub. Just select your device and then right click and Start monitoring D2C (Device to Cloud) message. You can also send messages from the cloud to your device from here.


Once this was working, I updated the code to just send a heartbeat message back to the cloud, letting us know that it was still alive. Every time the ESP32 does this, it broadcasts an IR remote control code three times. This is the usual methodology for remote controls. Currently it is just broadcasting the Sony power code, but we will look at ways we can start/stop the broadcast and change the code via Azure.

Cloud to Device Message Lifecycle




To guarantee at-least-once message delivery, IoT Hub persists cloud-to-device messages in per-device queues. Devices must explicitly acknowledge completion for IoT Hub to remove them from the queue. This approach guarantees resiliency against connectivity and device failures.

When the IoT Hub service sends a message to a device, the service sets the message state to Enqueued. When a device wants to receive a message, IoT Hub locks the message (by setting the state to Invisible), which allows other threads on the device to start receiving other messages. When a device thread completes the processing of a message, it notifies IoT Hub by completing the message. IoT Hub then sets the state to Completed.

The max delivery count property on IoT Hub determines the maximum number of times a message can transition between the Enqueued and Invisible states. After that number of transitions, IoT Hub sets the state of the message to Dead lettered.

The diagram above shows the lifecycle state graph for a cloud-to-device message in IoT Hub. Luckily, sending messages is a lot more straight forward than implementing the message lifecycle.

Controlling the ESP32 via Azure


Now that we have our ESP32 talking to Azure and broadcasting an IR code burst every 10 seconds, we want to be able to control it via the cloud. The easiest way to do this is using Visual Studio Code again.


If you right click on the device, shown in Explorer under Azure IOT HUB DEVICES, then the window above is displayed. The two ways we will look at communicating with our device via the cloud are:

  1. Cloud to Device (C2D) Messaging; and
  2. Triggering a defined device method.
You can try out both.

Cloud to Device Messaging


After uploading the sketch to your ESP32, open the serial monitor from the Arduino IDE. I found the serial monitor function in Visual Studio Code was a bit dodgy. Select Send C2D (Cloud to Device) Message to Device, and a message entry window will open. Type in whatever you want and hit return.

In the Azure IoT Toolkit output window you should see:

[C2DMessage] Sending message to [ESP32_IRBeacon_1] ...
[C2DMessage] [Success] Message sent to [ESP32_IRBeacon_1]

A second or so later, the following will appear in the Serial Monitor:

Info: >>>Received Message [1], Size=14 Message test message
Message callback:
test message

The function which handles incoming messages in the ESP32 code is MessageCallback(const char* payLoad, int size). The callback function is set during setup() using:

Esp32MQTTClient_SetMessageCallback(MessageCallback);
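
A callback matching that signature only has to deal with the payload; the minimal version below (based on the pattern in the library's GetStarted example) is enough to produce the serial output shown above.

static void MessageCallback(const char* payLoad, int size)
{
  Serial.println("Message callback:");
  Serial.println(payLoad);
}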


Invoke a Direct Method


This is the way that I chose to control the IR beacon (since that is its purpose). You can define what methods you want to support in your code. Currently we are only handling start and stop but it would be trivial to add another method to set the IR code transmitted.

The process for invoking a method is the same as for sending a C2D message. Right click on your device in Explorer and select "Invoke Direct Method". A text entry window will open, type in your method name (e.g. stop) and hit enter. In the Azure IoT Toolkit output window you should see:

[DirectMethod] Invokeing Direct Method [stop] to [ESP32_IRBeacon_1] ...
[DirectMethod] Invokeing Direct Method [start] to [ESP32_IRBeacon_1] ...

Yes, whoever wrote this code couldn't spell 'invoking'! Then, in the serial monitor:

Info: Try to invoke method stop
Info: Stop sending IR burst and heart beat

The function which handles direct methods in the ESP32 code is DeviceMethodCallback(). This callback function is set during setup() using:

Esp32MQTTClient_SetDeviceMethodCallback(DeviceMethodCallback);
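
The handler itself follows the pattern in the library's GetStarted example; the version below is a trimmed-down sketch of that pattern. The signature and the LogInfo macro come from the library's example, and sendingIR is just an illustrative flag checked in loop().

static int DeviceMethodCallback(const char *methodName, const unsigned char *payload,
                                int size, unsigned char **response, int *responseSize)
{
  LogInfo("Try to invoke method %s", methodName);
  const char *responseMessage = "\"Successfully invoke device method\"";
  int result = 200;

  if (strcmp(methodName, "start") == 0) {
    LogInfo("Start sending IR burst and heart beat");
    sendingIR = true;                 //  Illustrative flag checked in loop()
  } else if (strcmp(methodName, "stop") == 0) {
    LogInfo("Stop sending IR burst and heart beat");
    sendingIR = false;
  } else {
    LogInfo("No method %s found", methodName);
    responseMessage = "\"No method found\"";
    result = 404;
  }

  *responseSize = strlen(responseMessage);
  *response = (unsigned char *)malloc(*responseSize);
  strncpy((char *)(*response), responseMessage, *responseSize);

  return result;
}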


Conclusion




If all you want is a dashboard for your IoT application then Node-Red is MUCH simpler to implement. If you need an industrial strength solution then you need something like Azure IoT hub.

Actually, using Visual Studio Code (VSC) was a pleasure, and this will be my go-to IDE for Arduino from now on. Controlling and monitoring your IoT devices via VSC was also very easy once you work out how it operates. The user interface is not very discoverable; you need to hit F1 to access most of the Arduino and Azure commands.

I will publish a short follow-up article providing a simple PCB to mount the diagnostic LEDs, IR transmitting module and the ARM/DEBUG switch. It is a simple enough circuit that you could do it on a breadboard or veroboard. Note that the Duinotech ESP32 doesn't leave any pins free on one side of a standard breadboard, due to its width. See the photo above. This is one reason I decided to use a PCB.