
Creating a data platform capable of monitoring the world's oceans in near real-time, to prevent illegal fishing & human trafficking.

OceanMind are a not-for-profit who provide actionable insights to law enforcement to prevent illegal fishing, human trafficking, and illegal salvage operations on war graves and heritage sites.

OceanMind were a Microsoft AI for Earth grantee in 2019, and have been featured in multiple keynotes over the past year, including Future Decoded 2019 and the 2019 Web Summit.

Using vessel telemetry from around the world, they employ complex geospatial analysis and Machine Learning to detect these illegal activities.

The challenge with fisheries, particularly on the global scale, is the sheer amount of data. This...really has been a game changer for our organization. In the past we’ve been very batch oriented, so we’ve only had a limited scale that we’ve been able to apply to the problem. Now we can get more results, more quickly, and save our analysts time.

Being a not-for-profit with a small team, OceanMind needed a cost-effective way to spend less time on system maintenance and more time on improving their analytical techniques and maximising their impact. Alongside this, they wanted to move away from overnight batch processing towards more real-time analysis, decreasing the time taken from activity, to detection, to action. To achieve this, they decided to migrate their on-premises systems to the cloud.

OceanMind is using the power of AI and harnessing the world's data about vessels on the seas, analyzing them in real time, and sharing the results with the world's governments to improve the sustainability of the planet's oceans. It shows what technology can do. It shows that we each have a role to play, if we're going to address the great problems that the world puts in front of us.

This was where we came in.

Over the course of three months we designed and built a cloud-native serverless architecture. We used Azure Functions and Durable Functions orchestration to find ships which had been in contact for a certain amount of time. These events were then fed into the Machine Learning algorithms that detect illegal fishing and other activities.
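The production pipeline was built in C# on Azure Functions, but the core idea can be sketched independently of the platform: compute the great-circle (haversine) distance between every pair of vessel positions at each timestamp, and raise an event for pairs that stay close for long enough. The snippet below is an illustrative sketch only, not OceanMind's actual implementation; the function names, data shapes, and thresholds (`max_km`, `min_samples`) are invented for the example.

```python
from collections import defaultdict
from itertools import combinations
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def proximity_events(positions, max_km=0.5, min_samples=3):
    """positions: {timestamp: {vessel_id: (lat, lon)}}.

    Returns the set of vessel pairs observed within max_km of each
    other in at least min_samples position reports (a hypothetical
    stand-in for 'in contact for a certain amount of time')."""
    close_counts = defaultdict(int)
    for ts in sorted(positions):
        fixes = positions[ts]
        # Check every pair of vessels reporting at this timestamp.
        for a, b in combinations(sorted(fixes), 2):
            if haversine_km(*fixes[a], *fixes[b]) <= max_km:
                close_counts[(a, b)] += 1
    return {pair for pair, n in close_counts.items() if n >= min_samples}
```

In a Durable Functions version of this, the pairwise distance checks would typically run as fan-out activity functions, with an orchestrator accumulating the per-pair durations before emitting proximity events to the downstream Machine Learning stage.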

We introduced secure data structures which protected and segregated the data according to government and compliance regulations. We also employed high-performance compute techniques to increase throughput and reduce cost. Using our benchmarking tools we analysed the solution and found that, extrapolating our extensible architecture to support their remaining workloads, the estimated compute cost for the solution came to less than £10 / month.

I love hearing about the way a really small organisation is using technology to scale well beyond the size of the group of people they have. When people look at the work of OceanMind they think they are looking at the work of a large multinational organisation, but instead it's a small team of dedicated, hardworking individuals.

As a result, OceanMind are now able to carry out their analysis in close to real time. Due to the reliability introduced by the cloud, far less time is spent on fire-fighting. This allows more time for innovation and exploration of the huge wealth of Microsoft Machine Learning technologies. They are also able to operate well within their budget constraints due to the low running cost of the serverless solution we designed. In this way endjin took a complex problem and constructed a solution optimised for the customer's specific needs.

OceanMind used our High Performance blueprint, designed for your most demanding data workloads.

Azure High Performance blueprint
Building a proximity detection pipeline

Carmel Eve

At endjin, our approach focuses on using scientific experimental method to support fully proven and tested decision making, and on using scientific research to support our work. This post runs through how we applied that process to the creation of a pipeline to detect vessel proximity.

This example is based on the project we recently worked on with OceanMind, in which we helped them build a #serverless architecture that could detect vessel proximity in close to real time. The vessel proximity events we detected were then fed into machine learning algorithms in order to detect illegal fishing!

Carmel also runs through some of the actual calculations we used to detect proximity, how we used #data projections to efficiently process large quantities of incoming data, and the use of #durablefunctions to orchestrate the processing.
Wardley Maps - Explaining how OceanMind use Microsoft Azure & AI to combat Illegal Fishing

Jess Panni

Wardley Maps are a fantastic tool for building situational awareness, helping you make better decisions. We use Wardley Maps to help our customers think about the various benefits and trade-offs involved in migrating to the Cloud. In this blog post, Jess Panni demonstrates how we used Wardley Maps to plan the migration of OceanMind to Microsoft Azure, how the maps highlighted where the core value of their platform was, and how PaaS and serverless services offered the most value for money for the organisation.
Optimising C# for a serverless environment

Carmel Eve

In our recent project with OceanMind we used #AzureFunctions to process marine vessel telemetry from around the world. This involved processing huge quantities of data in close to real time. We optimised our processing for a #serverless environment, the outcome being that the compute would cost less than £10 / month!

This post summarises some of the techniques we used, including some concrete examples of optimisations we made.

#bigdata #dataprocessing #dataanalysis #bigcompute
High-performance C#: a test pattern for ref structs

Ian Griffiths

C# 7.2 introduced ref structs, a new kind of type (Span<T> is a ref struct) designed to support certain high-performance scenarios. There are constraints around their use, and when writing unit tests for our Ais.Net parser this caused some challenges. This blog post describes the technique we used to work around those constraints.

We help organizations of all sizes from start-ups to global enterprises across financial services, media & comms, retail & consumer goods, and professional services.