Catapult Satellite Applications

+44 (0)1235 567999     info@sa.catapult.org.uk     Harwell Campus, OX11 0QR



RiR Week: Mapping Terrestrial Objects in Satellite Imagery with Machine Learning

By Steven Reece

Forget the needle – finding a haystack in a satellite image is hard! Or even a whole farmstead, especially when it is a brown fence surrounding a 100 m² brown patch of earth across huge swathes of brown savannah in Kenya.

Figure 1: A typical boma in Samburu, Kenya

Locating these farmsteads, called ‘bomas’, is key to reducing human-elephant conflict, saving elephants’ lives as well as reducing farm raids. Machine learning, specifically Deep Learning, is showing great promise in automatically marking bomas in optical satellite imagery. Automation is critical when a 20,000 km² area has to be mapped about four times every year.

We are working closely with Save the Elephants (STE) to develop boma-mapping software, solving some hard machine learning problems to reduce the human mappers’ workload. This is one of several machine learning co-creation projects we are undertaking for environmental and natural disaster applications. Co-creating solutions with end users such as STE is an important way of checking that we are addressing pertinent problems. Our software uses a combination of human labeling (i.e. crowd-sourcing) and machine labeling to map a range of objects in satellite imagery requested by different organisations, including bomas, elephants, tailings dams, cement plants, seismic faults, blocked roads, and buildings damaged by hurricanes and earthquakes, as well as to track the impact of the weather on crop health.

Our research focusses on the development of novel methods for ‘human-agent collectives’ – a fancy name for humans and computers working together towards a common goal, exploiting the intelligence of humans and the pattern-matching skills of computers to map satellite imagery as accurately, quickly and cheaply as possible. The solution is a combination of Deep Learning neural networks and traditional Bayesian methods that together process high-dimensional, multi-spectral datasets, find subtle patterns in them, and provide an interpretable means of combining data sources such as human labels. The overall impact of these technologies is far-ranging: they reduce data costs, generate more informative results, and discover new patterns in data that drive new science.
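The post does not spell out the Bayesian machinery, but to make the idea of combining human and machine labels concrete, here is a minimal, hypothetical sketch of one standard approach: treat each labeler (crowd worker or classifier) as a noisy channel with a known accuracy, and fold their binary votes into a posterior via a naive-Bayes log-odds update. The function name and accuracy figures are illustrative, not taken from the actual system.

```python
from math import exp, log

def fuse_labels(labels, accuracies, prior=0.5):
    """Fuse independent binary labels (1 = 'boma present') into a posterior
    probability, treating each labeler as a noisy channel that reports the
    true class with the given accuracy."""
    # Start from the prior log-odds of a boma being present.
    log_odds = log(prior / (1 - prior))
    for vote, acc in zip(labels, accuracies):
        # A positive vote multiplies the odds by acc / (1 - acc);
        # a negative vote divides them by the same factor.
        llr = log(acc / (1 - acc))
        log_odds += llr if vote == 1 else -llr
    return 1 / (1 + exp(-log_odds))

# Three crowd workers (80% accurate) say 'boma'; a classifier (70%) disagrees.
posterior = fuse_labels([1, 1, 1, 0], [0.8, 0.8, 0.8, 0.7])
```

Because everything stays in probability space, the same update can absorb labels from people and models alike, which is one reason interpretable Bayesian fusion sits comfortably alongside the neural networks.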

Co-creation is not just about product delivery. When results are not critical, the best part of machine learning is when it doesn’t work. We’re always up for a research challenge: understanding why existing methods fail, finding novel solutions, and discovering new connections and theoretical relationships is our bread and butter. Working with industry and NGOs provides an opportunity to identify and frame these research challenges, and ensures the research is worthwhile.

We are keen to work with organisations to co-create machine learning solutions to environmental and disaster management problems. However, we are not constrained to working with satellite imagery. Our recent work, in response to COVID-19, has not used satellite imagery at all. Working with local health authorities, we have been co-creating software to predict local surges in COVID-19 cases from mobile phone tracking data. No, this is not the ‘track and trace’ everyone has heard of. This is a machine learning algorithm that learns the relationship between case numbers and local population movement and then predicts when case numbers will rise.
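The post does not describe the actual model, but the core idea of learning a lagged relationship between movement and case numbers can be sketched as a simple least-squares regression on lagged mobility. All function names, the 7-day lag, and the synthetic numbers below are illustrative assumptions, not details of the real system.

```python
def fit_lagged(mobility, cases, lag):
    """Ordinary least squares fit of cases[t] ~ a * mobility[t - lag] + b,
    i.e. learn how population movement `lag` days ago relates to today's
    case count."""
    xs = mobility[:-lag]   # movement readings, shifted back by `lag` days
    ys = cases[lag:]       # case counts observed `lag` days later
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

def predict(mobility_today, a, b):
    # Forecast the case count `lag` days from now given today's movement.
    return a * mobility_today + b

# Synthetic data: cases track mobility with a 7-day delay.
mobility = list(range(30))
cases = [10] * 7 + [2 * (t - 7) + 10 for t in range(7, 30)]
a, b = fit_lagged(mobility, cases, lag=7)
```

A rising prediction from today's mobility would then serve as an early-warning signal, days before the surge shows up in the case counts themselves; a production model would of course use a richer feature set than a single lagged value.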

I am in the enviable position of running a machine learning research group at the Engineering Science department at Oxford University. We are always on the lookout for non-academic partners who can help deliver high impact solutions to problems in society and the environment. Please get in touch if you would like to discuss collaboration opportunities: reece@robots.ox.ac.uk

Personal web page at the Alan Turing Institute

Personal web page at Oxford University


Research in Residence Q&A Session – Friday 11 September

If you would like to find out more about all our Research in Residence projects, ask questions, and connect with our academics, join us on Friday 11 September, 10:30am – 12:00pm, when we will be hosting a live Q&A session.

Click here to register


Steven Reece
Researcher in Residence
Steven Reece is a Departmental Research Fellow in machine learning in the Engineering Science Department, Oxford University, a Turing Fellow at the Alan Turing Institute, and an EPSRC funded Researcher in Residence at the Satellite Applications Catapult.
