A complete guide to displaying and normalizing location data in React Native

This is part two of a two-part series. In Part 1 we looked at the options available for recording a user’s location. We introduced React Native’s pure approach, Expo’s API, and the react-native-background-geolocation package as three approaches for gathering a user’s location data.

Part 2 - Displaying a user’s location on the map and normalizing location data

Raise your hand if you’ve ever gotten into an argument with your significant other over the route you both should take to get somewhere faster. You had a route that was shorter than theirs, but they just didn’t want to believe you? Well, now you can finally win that argument. Stay with me.

In Part 1 I created an app that records my own location data. Now I want the app to display that data on a map and get rid of any noise in it. Fortunately, I can use Expo and React Native, tools we use at OK GROW! that provide the libraries needed for an app like this without having to write any native code.

In this post, we will discuss displaying a user’s location on a map in a React Native app, as well as strategies for reducing noise in the recorded data to make the lines on the map look smoother. We will explore a naive filtering approach, Google’s Snap to Road API, and the Kalman filter as potential solutions. Next time you find yourself in an argument about the shortest route, you will be able to show your significant other (or even your pet dog) the best route clearly displayed on the map.


All of the code for this blog post can be found at https://snack.expo.io/@arrygoo/locations-blogpost

Displaying the location on the map

You can use Expo’s Map API, which wraps Airbnb’s react-native-maps, to display your location data on the map. We’re not going to explore the details of implementing a map view, but you can check out these awesome posts for implementation details: Kirsten Swanson’s React Native Google Map with react-native-maps and Ali Oğuzhan Yıldız’s React Native Maps with Google Directions API.
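As a small illustration, one detail you do need when plotting a recorded track is an initial region that fits the whole path. The helper below is a sketch (the function name and padding factor are my own); the returned object matches the shape of the `initialRegion` prop that react-native-maps’ MapView expects, and the same coordinate array can be passed to a Polyline to draw the path.

```javascript
// Compute a MapView region that fits a whole recorded track.
// `points` is an array of { latitude, longitude } objects, as recorded in Part 1.
const regionForCoordinates = (points, padding = 1.2) => {
  const lats = points.map((p) => p.latitude);
  const lngs = points.map((p) => p.longitude);
  const minLat = Math.min(...lats);
  const maxLat = Math.max(...lats);
  const minLng = Math.min(...lngs);
  const maxLng = Math.max(...lngs);
  return {
    // Center of the bounding box of all recorded points
    latitude: (minLat + maxLat) / 2,
    longitude: (minLng + maxLng) / 2,
    // Pad the deltas slightly so the path is not flush with the screen edge,
    // with a small minimum so a short track is still visible
    latitudeDelta: Math.max((maxLat - minLat) * padding, 0.01),
    longitudeDelta: Math.max((maxLng - minLng) * padding, 0.01),
  };
};
```

You would then render something like `<MapView initialRegion={regionForCoordinates(locations)}>` with a `<MapView.Polyline coordinates={locations} />` child.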

Why is our data noisy?

If the data that you are displaying on the map was collected from a user’s phone, it is going to be noisy. This is fine if, for example, your app lists the nearest restaurants: a small error in the user’s reported location can shuffle the sort order of the closest restaurants, but it’s not a deal breaker.

On the other hand, if you try to draw a user’s travelled path on the map, a small amount of noise can cause obvious zigzags on the map rather than a smooth line.

Consider the image below. As an example, I used the data from a walk I took starting from my house, stopping at a coffee shop, and grabbing food at the Roti House. As you can see, the data is extra noisy because I used react-native-background-geolocation on its lowest accuracy setting (highest battery efficiency). If I had set the desired accuracy to maximum, I probably wouldn’t have as much noise in my data (though I’d still have some).


Even on the highest accuracy setting, you’ll get noise in your data every once in a while. Your phone uses several sources to determine your location; in order of highest to lowest accuracy, these are GPS, Wi-Fi, and cellular data. All of these sources make mistakes. While the noise might be negligible depending on your use case, if you are plotting the received coordinates on a map, noisy data can really confuse your user.


Google’s Snap to Road API

Google provides an API that takes your noisy data, and matches all of its location points to roads and sidewalks. This is great if you’re making an app that is used mostly by bikers and drivers. However, if some of your users pass through parks or mountains, this API will not be very effective.


Pros:

  • It’s made and field-tested by Google, which means it works very reliably

Cons:

  • Does not take the time elapsed between points into consideration
  • Only works on roads; it won’t normalize your location when you’re walking your dog in a park or jet skiing across the Pacific Ocean
  • Only normalizes 100 coordinates at a time
  • It costs $0.50 for every 1,000 extra requests after the first 2,000 requests
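To make the 100-coordinate limit concrete, here is a sketch of building Snap to Roads requests for a longer track by splitting it into chunks. The endpoint URL, the `path`, `interpolate`, and `key` query parameters, and the `snappedPoints` response field are Google’s; the function name and chunking logic are my own, and `YOUR_API_KEY` is a placeholder for a real Google Maps Platform key.

```javascript
// Google's Snap to Roads endpoint accepts at most 100 points per request,
// so a longer track has to be split into chunks of 100.
const SNAP_URL = 'https://roads.googleapis.com/v1/snapToRoads';

const buildSnapRequests = (points, apiKey, chunkSize = 100) => {
  const urls = [];
  for (let i = 0; i < points.length; i += chunkSize) {
    // Encode each chunk as "lat,lng|lat,lng|..." per the API's path format
    const path = points
      .slice(i, i + chunkSize)
      .map((p) => `${p.latitude},${p.longitude}`)
      .join('|');
    // interpolate=true asks the API to fill in extra points along the road
    urls.push(`${SNAP_URL}?interpolate=true&key=${apiKey}&path=${path}`);
  }
  return urls;
};

// Each URL can then be fetched and the snapped coordinates read from the
// `snappedPoints` field of the JSON response:
// const { snappedPoints } = await (await fetch(urls[0])).json();
```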

Naively removing anomalies

The user’s phone reports each location along with an estimated accuracy in meters. This means we can reduce the noise by discarding all the locations whose accuracy value is above a certain threshold (a larger accuracy radius means a less precise fix). Also, since we know the time elapsed between reported locations, we can use the linear least squares method to get rid of anomalies in terms of calculated speed and acceleration.

This method is simple to implement, but it may not work well if your data is very noisy, or if your app is set up in a way that requires zooming in on the map a lot. A more powerful approach is the Kalman filter.


Pros:

  • Easy to understand and implement

Cons:

  • Not as effective as the other two methods
  • Only removes invalid points, still leaving the path jagged rather than smooth
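A minimal sketch of this naive pass might look like the following. The function name, thresholds, and the injected `distanceMeters` helper (e.g. a haversine implementation) are assumptions for illustration; each location is assumed to carry `accuracy` (meters) and `timestamp` (milliseconds) fields as reported by the phone.

```javascript
// Naive anomaly removal: drop points whose reported accuracy radius is too
// large, then drop points that imply an impossible speed since the last
// point we kept. Thresholds are illustrative and should be tuned per app.
const MAX_ACCURACY_M = 50; // discard fixes less precise than 50 m
const MAX_SPEED_MPS = 15;  // discard jumps faster than ~54 km/h for a walk

const removeAnomalies = (locations, distanceMeters) => {
  const kept = [];
  for (const loc of locations) {
    if (loc.accuracy > MAX_ACCURACY_M) continue; // too imprecise, skip
    const last = kept[kept.length - 1];
    if (last) {
      const dt = (loc.timestamp - last.timestamp) / 1000; // seconds elapsed
      // Implied speed between consecutive kept points
      if (dt > 0 && distanceMeters(last, loc) / dt > MAX_SPEED_MPS) continue;
    }
    kept.push(loc);
  }
  return kept;
};
```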


The Kalman filter

When trying to smooth any kind of noisy signal, such as GPS data, the Kalman filter is the standard approach. Kalman filters are frequently used to smooth noisy data in economics and engineering. They are fast, light on memory, and provide you with a lot of knobs to turn to fine-tune your final result.


Pros:

  • A powerful algorithm that works for any noisy data
  • Fast and lightweight
  • Various modifiable constants to control how the algorithm works

Cons:

  • Relatively harder to understand

How do Kalman filters work?

Consider an array of locations that has been reported from the user’s phone. Kalman is a recursive algorithm that does the following:

  1. Looks at the next element of the array

  2. Predicts where the next element should be based on a weighted average

  3. Corrects the point based on the prediction

  4. Updates the weighted average.

Notice that the algorithm remembers something about the previous points in the array, but not the entire array. This is what makes it fast and lightweight: it only considers two points at a time rather than the whole array. Before reading the rest of the post, take a moment to think about what could go wrong when you only consider two points at a time rather than all of them.


As you can see, if the first point is incorrect, the Kalman filter does not fix it. One solution is to combine this approach with the naive one and filter out low-accuracy points before running the data through the Kalman filter. Keep in mind that there are other implementations that don’t have this limitation.

The math behind the Kalman algorithm might be intimidating at first, but using it is actually fun and simple. Most implementations let you fiddle with constants that make the filter more or less forgiving. These constants correspond to process noise and measurement noise. Process noise is harder to conceptualize and is often ignored; in our example, measurement noise is the error the phone’s GPS makes. To understand how Kalman filters work and what constants are available, check out Tim Babb’s really good explanation of the concepts, and see Wouter Bulten’s blog post for a good implementation example of a lightweight JavaScript library for noise filtering using Kalman filters.
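To make the moving parts concrete before looking at the real code, here is a one-dimensional sketch of the predict/correct loop described above (the function name and the specific noise parameterization are my own simplifications, not the library’s API):

```javascript
// Minimal 1-D Kalman filter over a stream of noisy scalar readings.
// measurementNoise: how much error we expect in each reading.
// processNoise: how much the true value may drift between readings.
function createKalman1D(measurementNoise, processNoise) {
  let estimate = null; // current filtered value
  let variance = null; // current uncertainty of the estimate
  return (measurement) => {
    if (estimate === null) {
      // First reading: trust it, seed uncertainty from the measurement noise
      estimate = measurement;
      variance = measurementNoise * measurementNoise;
    } else {
      variance += processNoise; // predict: uncertainty grows over time
      // Kalman gain: how much to trust the new reading vs. the estimate
      const k = variance / (variance + measurementNoise * measurementNoise);
      estimate += k * (measurement - estimate); // correct toward the reading
      variance *= 1 - k; // update: we are now more confident
    }
    return estimate;
  };
}
```

A larger `measurementNoise` makes the filter more skeptical of each new reading (smoother output); a larger `processNoise` makes it more responsive.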

Make sure to play around with different constants to find the sweet spot for your specific data. You can also generate the constant’s value dynamically based on your data. The gif below demonstrates how changing the constant affects the result.


At OK GROW!, my brilliant co-worker Mohamad Mohebifar implemented it in JavaScript as follows, inspired by this Java implementation:

// `_` is lodash; `_calculateGreatCircleDistance` is a helper defined elsewhere
// in the project that returns the distance between two points in meters.
const refineLocation = (location, lastLocation, measurementNoise) => {
  const accuracy = Math.max(location.accuracy, 1);
  // Start from the previous filtered state (if any): its latitude, longitude,
  // and variance are carried over, then nudged toward the new measurement.
  const result = { ...location, ...lastLocation };

  if (!lastLocation) {
    // First point: trust it, and seed the variance from its reported accuracy
    result.variance = accuracy * accuracy;
  } else {
    const timestampInc =
      location.timestamp.getTime() - lastLocation.timestamp.getTime();

    if (timestampInc > 0) {
      // We can tune the velocity and particularly the coefficient at the end
      const velocity =
        (_calculateGreatCircleDistance(location, lastLocation) / timestampInc) *
        measurementNoise;
      // Uncertainty grows with the time elapsed since the last point
      result.variance += (timestampInc * velocity * velocity) / 1000;
    }

    // Kalman gain: how much to trust the new measurement over the estimate
    const k = result.variance / (result.variance + accuracy * accuracy);
    result.latitude += k * (location.latitude - lastLocation.latitude);
    result.longitude += k * (location.longitude - lastLocation.longitude);
    result.variance = (1 - k) * result.variance;
  }

  return {
    ...location,
    ..._.pick(result, ['latitude', 'longitude', 'variance']),
  };
};
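To run a refine function such as refineLocation over an entire recorded track, you can thread the last filtered point through a loop. This driver is a sketch of my own; it takes the refine function as a parameter and assumes it returns refined `latitude`, `longitude`, and `variance` fields, with the measurement-noise value left as something to tune.

```javascript
// Apply a point-by-point refine function (e.g. refineLocation) to a whole
// track, feeding each refined point back in as the "last" state.
const smoothTrack = (locations, refine, measurementNoise = 10) => {
  let last = null;
  return locations.map((loc) => {
    const refined = refine(loc, last, measurementNoise);
    // Carry the refined state (including variance) into the next iteration
    last = { ...loc, ...refined };
    return refined;
  });
};
```

The resulting array can be passed straight to a Polyline to draw the smoothed path.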

So, which approach should I pick?

If you are displaying data gathered from a sensor, it is going to be noisy. If your data is someone’s location as they travelled along roads and streets, Google’s Snap to Road API is the way to go. Otherwise, a Kalman filter can be used to normalize any kind of data, including location.

You can find the complete code for the Kalman approach and the other approaches here: https://snack.expo.io/@arrygoo/locations-blogpost

