Let our previous work inspire you to take your next step. Transforming business through data is what we do, so how will we do this for you?

Natural Language Processing

67% time saved with Active Learning

Our client needs to screen research papers and select the ones that are relevant to a specific subject. Typically, out of thousands of papers, only a few hundred are actually useful. Screening them manually takes a lot of time and is tedious, so the client asked us:
Can AI help reduce the workload?
This is a hard, intricate problem. Reducing the number of papers the researcher screens by hand, while still retrieving all relevant ones, requires the AI to have some notion of 'certainty' in its judgement of relevance.

The end user should also review neither too many nor too few of these papers by hand. Figuring out how to determine what share of the review should be done manually made this a very interesting and difficult challenge.

So how did we do this?

For this we used a technique called Active Learning: this type of AI learns from human input and improves itself continuously. The researcher only has to review a fraction of the thousands of papers, which gives the AI enough information to retrieve all relevant ones.

This way, the AI learns what makes a paper relevant and helps reduce the researcher's workload by 67%!
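The core of such a system can be sketched as a pool-based active learning loop with uncertainty sampling: the model repeatedly asks the researcher to label the paper it is least certain about. This is a minimal illustration on synthetic data, not the actual system; the real stopping criterion for "all relevant papers found" is not shown.

```python
# Minimal sketch of pool-based active learning with uncertainty sampling.
# Data, features, and the query budget here are all illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "papers": 2-D feature vectors, label 1 = relevant.
X = rng.normal(size=(1000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Seed set: a few labelled examples of each class to start from.
labelled = list(rng.choice(np.where(y == 0)[0], 5, replace=False)) \
         + list(rng.choice(np.where(y == 1)[0], 5, replace=False))
pool = [i for i in range(len(X)) if i not in labelled]

model = LogisticRegression()
for _ in range(20):                      # 20 rounds of human feedback
    model.fit(X[labelled], y[labelled])
    probs = model.predict_proba(X[pool])[:, 1]
    # Query the paper the model is least certain about (prob closest to 0.5).
    query = pool[int(np.argmin(np.abs(probs - 0.5)))]
    labelled.append(query)               # the researcher labels this paper
    pool.remove(query)

accuracy = model.score(X, y)
print(f"labelled {len(labelled)} of {len(X)} papers, accuracy {accuracy:.2f}")
```

After only 30 labelled papers out of 1,000, the model classifies the whole pool accurately, which is the mechanism behind the workload reduction described above.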

Our client is using this solution, and we are working with them on making this technology available as a commercial product.


Labeling hundreds of thousands of medical case reports automatically

Hospitals in the Netherlands are required to assign universal codes to diagnosed patients, in order to report to the Health Authority (NZa) and Statistical Bureau (CBS). The hospitals hire specially trained staff to do so, making this process expensive and unfortunately prone to errors. Our client DHD wanted to find a way to support hospitals and labelers in this laborious task.

What did we do?

  • Trained an AI model that assigns the right label to the right case
  • Determined how this model can be implemented to actually assist the manual coding effort

Our AI model was able to assign many of the codes correctly on its own. When limited to cases where its certainty was over 90% – a requirement from the end users – the algorithm could assign over 30% of the codes by itself, a very promising result for an initial study.

For cases where the AI's certainty falls below this high threshold, the algorithm suggests the five most likely codes for the case, still drastically reducing the time required to assign a label manually.
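The decision rule described here is straightforward to sketch: auto-assign only when the model's top probability clears the 90% threshold, otherwise hand the coder a ranked shortlist. The function name, codes, and probabilities below are invented for illustration.

```python
# Illustrative sketch of the certainty-threshold workflow:
# auto-assign a code when the model is at least 90% certain,
# otherwise suggest the five most likely codes to the human coder.
import numpy as np

def assign_or_suggest(probabilities, codes, threshold=0.90, top_k=5):
    """probabilities: one model-output probability per code, summing to 1."""
    probabilities = np.asarray(probabilities)
    best = int(np.argmax(probabilities))
    if probabilities[best] >= threshold:
        return ("auto", codes[best])               # assigned automatically
    top = np.argsort(probabilities)[::-1][:top_k]  # ranked top-5 shortlist
    return ("suggest", [codes[i] for i in top])

codes = ["A01", "B02", "C03", "D04", "E05", "F06"]
print(assign_or_suggest([0.95, 0.02, 0.01, 0.01, 0.005, 0.005], codes))
# confident case -> ('auto', 'A01')
print(assign_or_suggest([0.40, 0.30, 0.15, 0.10, 0.03, 0.02], codes))
# uncertain case -> ('suggest', ['A01', 'B02', 'C03', 'D04', 'E05'])
```

Raising the threshold trades automation rate for precision, which is exactly the knob the end users constrained to 90%.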

Together with the hospitals and us, the client will now move forward to improve this AI model and implement it wherever possible.


Predicting the visitors of a zoo

How full will the park be? How many guests can we expect this week? The client wanted to be able to predict the busy and slow hours for the coming weeks.

What did we do?

  • Combined their historical data with datasets like the weather, holidays, events around town, etc.
  • Examined which factors had the largest impact on visitor numbers
  • Built and trained an AI model that predicts the hourly number of visitors
  • Represented the historical data, the predictions and the influencing factors in a sleek dashboard
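The first two steps – enriching historical counts with external signals and checking which factors matter most – can be sketched as follows. The column names, data, and model choice here are invented; the real pipeline and its two-week forecasting horizon are not shown.

```python
# Sketch: combine historical hourly visitor counts with external features
# (weather, holidays), train a model, and inspect feature importances.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
hours = pd.date_range("2023-01-01", periods=24 * 60, freq="h")
df = pd.DataFrame({
    "hour": hours.hour,
    "weekday": hours.weekday,
    "temperature": rng.normal(15, 8, len(hours)),  # external weather feed
    "is_holiday": rng.integers(0, 2, len(hours)),  # external holiday calendar
})
# Synthetic target: busier on warm holidays, with a daily rhythm.
df["visitors"] = (
    50 + 10 * df["is_holiday"] + 2 * df["temperature"]
    + 5 * np.sin(df["hour"] / 24 * 2 * np.pi)
    + rng.normal(0, 5, len(df))
).clip(lower=0)

features = ["hour", "weekday", "temperature", "is_holiday"]
model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(df[features], df["visitors"])

# "Which factors had the largest impact?" – feature importances:
importance = dict(zip(features, model.feature_importances_))
print(importance)
```

In this toy setup the weather dominates by construction; on real data this importance ranking is what tells the client which external datasets are worth keeping in the pipeline.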

This AI makes hourly predictions of how many visitors the park will receive, two weeks in advance. These numbers are split by guest type, such as card holders – who enter for free – tourists from abroad, and groups of schoolchildren.

The client can now inform its guests much more effectively about expected crowds, adopt a dynamic pricing model, optimise procurement, and improve staffing and rostering.

Strategy Case Study

Only save the data you really need

Airborne Oil & Gas, a pipeline manufacturer, wanted to equip their factory line with sensors and cameras, to gather data about their production process. This data can be used to model a digital twin, a virtual copy of the factory, and to train AI models to do various tasks.

But before that could happen, they first needed to store the vast amounts of data these sensors would generate. They asked us to determine their storage needs, decide which data to acquire, and optimise the amount of data stored.


Sensor data in clinical research

How can we use wearables in clinical research? Our client frequently conducts medical research on test subjects, wants to use sensor data in clinical studies, and needs to be able to process the huge amounts of data these sensors generate.

What did we do?

  • Examined sensor data for usability
  • Helped design a study for clinical validation of sensor-based technology
  • Gave strategic advice on setting up a Big Data environment and tooling

Based on our results, a clinical trial was started whose principal objective is to automatically distinguish patients with a certain disease from healthy controls.

Together with this client we are developing AI models that can directly diagnose patients from sensor data.


All rooftops in the Netherlands measured and categorised

How high is a building? What type of roof does it have? The client, a data aggregator, wanted a dataset containing the heights and roof profiles of all buildings in the Netherlands.

What did we do?

  • Coupled satellite and open map data and determined the height of every Dutch building
  • Gathered and combined tens of thousands of 'labelled examples' of roof types from various sources
  • Trained an AI model to recognize the most common types of roofs

Our model constructed the required dataset, assigning a roof type to each and every building in the Netherlands. With this new data the client can improve their own models, which they use, for instance, to predict how likely someone is to move house in the near future.
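The final step – training on labelled examples, then assigning a roof type to every building – can be sketched like this. The features, labels, and model here are invented toy stand-ins; the real system worked on satellite imagery and open map data rather than two hand-picked numbers.

```python
# Hypothetical sketch: fit a classifier on labelled roof examples, then
# predict a roof type for every building, as in the delivered dataset.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
n = 500
# Toy per-building features: ridge/eaves height ratio and roof slope.
ratio = rng.uniform(1.0, 2.0, n)
slope = rng.uniform(0, 45, n)
# Toy ground truth: "flat" below 10 degrees of slope, else "gabled".
labels = np.where(slope < 10, "flat", "gabled")

model = DecisionTreeClassifier(max_depth=3).fit(
    np.column_stack([ratio, slope]), labels
)
# Assign a roof type to every building in the (toy) dataset.
predicted = model.predict(np.column_stack([ratio, slope]))
accuracy = (predicted == labels).mean()
print(f"roof types assigned, training accuracy {accuracy:.2f}")
```

The labelled examples gathered from various sources play the role of `labels` here: without them there is nothing for the model to generalise from to the millions of unlabelled buildings.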