Nature Smart Cities


How can we leverage the latest advances in IoT and edge computing to design and develop smart sensors that monitor bat activity in real time across a large, diverse urban environment?



UCL, Bat Conservation Trust, London Wildlife Trust and Arup


Over 300,000 bat calls have been detected since deployment at the start of June, with an average of 7,000 bat calls per night and a maximum of more than 20,000 calls recorded in a single evening.


Calculations show that for the network of 15 smart bat monitors, machine learning algorithms at the edge of the IoT network reduce data transfer to the cloud from 180GB per day to 2.1MB per day.


There has been extensive press coverage, including a live interview on national television (BBC Breakfast), BBC online coverage, and radio coverage on the BBC Radio 4 Today and Inside Science programmes, the 5 Live Breakfast show, and BBC London news.

Nature Smart Cities brings together environmental researchers and technologists to develop the world’s first end-to-end open source system for monitoring bats, to be deployed and tested in the Queen Elizabeth Olympic Park, east London.

Bats are considered to be a good indicator species, reflecting the general health of the natural environment – so a healthy bat population suggests a healthy biodiversity in the local area. In this project we are exploring bat activity in one of the most iconic and high-profile of London’s regeneration areas, the Queen Elizabeth Olympic Park. We have developed a network of 15 smart bat monitors and installed them across the park in different habitats. It is hoped that this exploratory network of devices will provide the most detailed picture yet of bat life throughout this large urban area.

Each smart bat monitor – Echo Box – works like “Shazam for bats”. It captures the soundscape of its surroundings through an ultrasonic microphone, then processes this data, turning it into an image called a spectrogram. Deep learning algorithms then scan the spectrogram image, identifying possible bat calls. We are also working towards identifying the species most likely to have made each call.

Measuring bat activity in the Queen Elizabeth Olympic Park provides a very interesting real-world use case that involves large amounts of sensor data – in this case acoustic data. Rather than sending all of this data to the cloud for processing, each Echo Box device will process the data itself on its own chip, removing the cost of sending large amounts of data to the cloud. We call this “edge processing” since the processing is done on devices at the edge of the network. 
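A rough back-of-envelope calculation shows why edge processing matters here. The figures below are our own illustrative assumptions (16-bit mono audio, a 12-hour operating window), not the project's published methodology; only the 192kHz sample rate, the duty cycle, and the device count come from this article:

```python
# Back-of-envelope raw-audio volume for one night of monitoring.
# ASSUMPTIONS (ours, not the project's published figures): 16-bit mono PCM
# at the Ultramic 192K's 192 kHz sample rate, a 12-hour operating window,
# and the 3-seconds-recorded-in-every-6 duty cycle described below.
SAMPLE_RATE = 192_000       # samples per second
BYTES_PER_SAMPLE = 2        # 16-bit PCM (assumed)
DUTY_CYCLE = 0.5            # 3 s recorded in every 6 s cycle
HOURS_PER_NIGHT = 12        # hypothetical operating window
DEVICES = 15

bytes_per_second = SAMPLE_RATE * BYTES_PER_SAMPLE   # 384,000 bytes of audio per second
per_device_gb = bytes_per_second * DUTY_CYCLE * HOURS_PER_NIGHT * 3600 / 1e9
network_gb = per_device_gb * DEVICES
print(f"{per_device_gb:.1f} GB per device, {network_gb:.0f} GB across the network")
```

Under these assumptions each device would generate roughly 8GB of raw audio per night – over 100GB across the network, the same order of magnitude as the 180GB figure quoted earlier – which is why sending only the detection results, rather than the audio itself, brings the daily transfer down to megabytes.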

Inside each Echo Box is an Intel Edison with an Arduino breakout board, plus a Dodotronic Ultramic 192K microphone. To capture, process and identify bat calls, each Echo Box performs the following four steps:

First – a microphone on each device, capable of handling ultrasonic frequencies, captures all audio from the environment up to 96kHz (the Nyquist limit of the microphone's 192kHz sample rate – a digital recording can only represent frequencies up to half the rate at which it is sampled). Most bat calls occur at frequencies above 20kHz (the limit of human hearing), with some species calling as high as 125kHz (although none of those species are found in the park).

Second – every 6 seconds, a 3-second sample of audio is recorded and stored as a sound file. This means that audio from the environment is captured as 3-second snapshots at a consistent sample rate across all smart bat monitors.
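This duty cycle can be sketched as a simple loop. The sketch below shows the structure only; `record_fn` is a hypothetical stand-in for the device's actual audio-capture call, assumed to block while recording and return a clip:

```python
# Sketch of the 3-seconds-every-6-seconds capture schedule (structure only).
# record_fn is a hypothetical stand-in for the device's real audio capture:
# assumed to block for `snapshot_s` seconds and return the recorded clip.
import time

def capture_loop(record_fn, cycles, snapshot_s=3.0, period_s=6.0):
    """Record one snapshot per period, idling out the rest of each cycle."""
    clips = []
    for _ in range(cycles):
        start = time.monotonic()
        clips.append(record_fn(snapshot_s))
        # Sleep for whatever remains of the 6-second cycle.
        time.sleep(max(0.0, period_s - (time.monotonic() - start)))
    return clips
```

Recording on a fixed grid like this is what keeps the snapshots directly comparable across all 15 monitors.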

Third – the recorded audio is then turned into a spectrogram image using the Fast Fourier Transform. The spectrogram shows the amplitude of sound across the different frequencies over time. Bat calls can clearly be seen on the spectrogram as bright patterns (indicating a loud noise) at high frequencies.
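This step can be sketched with a synthetic chirp standing in for a bat call. The library and parameter choices below (SciPy, a 256-sample FFT window) are our own illustrative assumptions, not necessarily what runs on the Echo Boxes:

```python
# Sketch: turning a 3-second audio snapshot into a spectrogram via the FFT.
# A synthetic downward chirp stands in for a bat call; library and window
# choices are illustrative assumptions, not the project's configuration.
import numpy as np
from scipy.signal import chirp, spectrogram

fs = 192_000                          # Ultramic 192K sample rate
t = np.arange(0, 3, 1 / fs)           # one 3-second snapshot
audio = 0.01 * np.random.randn(t.size)          # quiet background noise
call = t < 0.005                                # a 5 ms "call" at the start
audio[call] += chirp(t[call], f0=60_000, f1=30_000, t1=0.005)

# Short windows of the signal are passed through an FFT; stacking the
# resulting magnitude columns over time gives the spectrogram.
freqs, times, Sxx = spectrogram(audio, fs=fs, nperseg=256)

# The loudest cell should sit in the ultrasonic band swept by the chirp.
f_peak = freqs[np.unravel_index(Sxx.argmax(), Sxx.shape)[0]]
print(f"loudest frequency bin: {f_peak / 1000:.0f} kHz")
```

In the resulting `Sxx` array the chirp shows up exactly as described above: a bright diagonal streak well above the 20kHz limit of human hearing.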

Finally – image-processing techniques called convolutional neural networks (CNNs) are applied to the spectrogram images to look for patterns that resemble bat calls. Where suspected calls are found, we are working towards applying the same CNN techniques to each individual call, examining its shape in more detail to determine which species of bat most likely made it.