Interactive Spatial Data Operations in Fabric SQL

· 8 min read
James Dales
Co-founder of Tekantis

With SQL Server now in public preview inside Microsoft Fabric, I was keen to investigate its spatial data functions. Whilst the Event House has provided spatial data processing inside Fabric since the outset with the KQL database, I was excited when the SQL engine arrived, as it's a more familiar environment for those working with spatial data. It also opens the door to real-time processing of that data based on interactions in a Power BI report.

Fabric SQL has strong support for geospatial data types and functions, opening up a wide range of operations such as nearest-neighbour searches, distance calculations and boundary intersections.
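As a quick flavour of the syntax (a minimal sketch with made-up coordinates, not taken from this example), measuring the distance between two points looks like this:

```sql
-- Two arbitrary points in Buckinghamshire (latitude, longitude, SRID 4326)
DECLARE @postbox geography = geography::Point(51.8156, -0.8084, 4326);
DECLARE @house   geography = geography::Point(51.8098, -0.8122, 4326);

-- STDistance on the geography type returns the distance in meters
SELECT @postbox.STDistance(@house) AS distance_in_meters;
```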

In this example I wanted to keep things simple and show properties within a selected distance of selected postboxes. The postboxes would be chosen with a Power BI slicer, as would the search distance. On making these selections, SQL Server would then find the properties within those search radii.


First of all I needed some data, so I extracted from OpenStreetMap the locations of all the postboxes in Buckinghamshire in the UK using the QuickOSM plugin for QGIS. I did the same for all the buildings in Buckinghamshire. There's only partial data for building outlines in Buckinghamshire; a better data source for these would probably have been Ordnance Survey data. I saved these two separate extracts in CSV format with WKT geometry.

With my data ready, I headed over to Microsoft Fabric, where I was quickly and easily able to create a SQL database in just two clicks. Importing the data was also straightforward: I used Dataflows Gen2 to upload the CSV files to OneDrive and import them into two tables in the database using the familiar PowerQuery interface. The only downside here is that PowerQuery can't create fields of type geography, so the WKT data arrives as a text field - but this is quickly remedied once the data is loaded.

I also added spatial indexes on the columns containing the spatial data to improve performance.
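As a sketch of those two steps (the text column name wkt and the index name are assumptions - use whatever names your import produced), converting the imported WKT into a geography column and indexing it looks something like:

```sql
-- Add a real geography column alongside the imported WKT text column
ALTER TABLE dbo.buildings ADD geom geography;
GO

-- Populate it from the WKT text (SRID 4326 = WGS84 longitude/latitude)
UPDATE dbo.buildings
SET geom = geography::STGeomFromText(wkt, 4326);
GO

-- A spatial index speeds up STDistance-style filters
CREATE SPATIAL INDEX six_buildings_geom ON dbo.buildings (geom);
```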


With my data loaded I was now ready to create a function to perform the spatial searches when triggered from Power BI. I opted for a table-valued function, as I can then parameterize the search criteria using PowerQuery parameters - I've not been successful in achieving this with stored procedures.

From Power BI, I pass in a comma-separated list of postbox references and the distance in meters to search. The first thing we need to do is split that list of postboxes:

WITH RefList AS (
    SELECT TRIM(value) AS ref FROM STRING_SPLIT(@Ref, ',')
)

With that done we can use the STDistance spatial function to calculate the distance between the buildings and the selected postboxes, and return those within the specified distance:

-- Return buildings where the distance from the postboxes is less than @Distance
SELECT
    b.full_id,
    b.geom.STAsText() AS WKT,
    b.building,
    'building' AS itemtype
FROM dbo.buildings AS b
INNER JOIN dbo.postboxes AS p ON p.ref IN (SELECT ref FROM RefList)
WHERE b.geom.STDistance(p.geom) <= @Distance

As well as showing the matched properties in the results, I also wanted to show the locations of the postboxes, so we use a UNION ALL to add the postbox locations into the results:


UNION ALL

-- Return points for the postbox locations
SELECT
    p2.full_id,
    p2.geom.STAsText() AS WKT,
    '' AS building,
    'origin' AS itemtype
FROM dbo.postboxes AS p2
WHERE p2.ref IN (SELECT ref FROM RefList)

And then finally I also wanted to return the search radius to show on the map:

UNION ALL

-- Return the perimeter of the search areas as polygons
SELECT
    p3.ref + 'perimeter' AS full_id,
    p3.geom.STBuffer(@Distance).STAsText() AS WKT,
    '' AS building,
    'search_perimeter' AS itemtype
FROM dbo.postboxes AS p3
WHERE p3.ref IN (SELECT ref FROM RefList)

Here's the complete function:


ALTER FUNCTION dbo.fn_GetBuildingsWithinDistance
(
    @Ref nvarchar(4000),
    @Distance FLOAT
)
RETURNS TABLE
AS
RETURN
(
    WITH RefList AS (
        SELECT TRIM(value) AS ref FROM STRING_SPLIT(@Ref, ',')
    )

    -- Return buildings where the distance from the postboxes is less than @Distance
    SELECT
        b.full_id,
        b.geom.STAsText() AS WKT,
        b.building,
        'building' AS itemtype
    FROM dbo.buildings AS b
    INNER JOIN dbo.postboxes AS p ON p.ref IN (SELECT ref FROM RefList)
    WHERE b.geom.STDistance(p.geom) <= @Distance

    UNION ALL

    -- Return points for the postbox locations
    SELECT
        p2.full_id,
        p2.geom.STAsText() AS WKT,
        '' AS building,
        'origin' AS itemtype
    FROM dbo.postboxes AS p2
    WHERE p2.ref IN (SELECT ref FROM RefList)

    UNION ALL

    -- Return the perimeter of the search areas as polygons
    SELECT
        p3.ref + 'perimeter' AS full_id,
        p3.geom.STBuffer(@Distance).STAsText() AS WKT,
        '' AS building,
        'search_perimeter' AS itemtype
    FROM dbo.postboxes AS p3
    WHERE p3.ref IN (SELECT ref FROM RefList)
);
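To sanity-check the function from the SQL editor before wiring up Power BI, it can be called directly - the postbox references below are made up, and @Distance is in meters:

```sql
-- Hypothetical postbox references; returns buildings, origins and perimeters
SELECT full_id, WKT, building, itemtype
FROM dbo.fn_GetBuildingsWithinDistance(N'123456, 654321', 250);
```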

This is all the setup required within Fabric SQL, so now over to Power BI Desktop.

I was able to connect to Fabric SQL in the same way as I'd connect to any SQL database, so that was easy, and I simply imported the postbox table into my Power BI model in import mode. I wasn't worried about any geometry here, just the reference numbers - although in real life I'd have imported additional reference data for improved searching and selection.

Now to bring in the building search results. I next made two parameters in PowerQuery, used to pass the selected postboxes and the search distance.


Now we're ready to create the connection to the table function. As we want the searching to be done interactively, we need to ensure the connection to the table function is made using Direct Query. This means that the SQL query is executed and new results returned every time we interact with the slicers (or other visuals) in the report.

After making the initial connection, I edited the query in the Advanced Editor, because we need to add a little code to handle the multi-selection of postboxes. There's likely a better approach for this, but it seems to work well: first handle a single value vs multiple values, then create a comma-separated text string from the selected values:


Note we're using the parameters postbox_ref and range that I configured earlier.

Here's the M code:

let
    // Normalize postbox_ref to a list if it's a single string
    normalizedList = if Value.Is(postbox_ref, type list) then postbox_ref else {postbox_ref},

    // Combine into a single comma-separated string
    commaSeparated = Text.Combine(normalizedList, ","),

    Source = Sql.Database("yourdatabaseservergoeshere.database.fabric.microsoft.com", "spatialsql-314d84ad-21da-438c-9caf-76f51b7259d9"),
    dbo_fn_GetBuildingsWithinDistance = Source{[Schema="dbo",Item="fn_GetBuildingsWithinDistance"]}[Data],
    #"Invoked Functiondbo_fn_GetBuildingsWithinDistance1" = dbo_fn_GetBuildingsWithinDistance(commaSeparated, range)
in
    #"Invoked Functiondbo_fn_GetBuildingsWithinDistance1"

That's the data connections sorted. Now over to Power BI's report editor. The queries should load in all the postbox references into a table, and the table function should be called using the default values defined in the PowerQuery parameters.

We now need to set up the interactivity part of the report, so I added a slicer based on the postbox reference.


But this won't currently change how the table function is called - we need to link the slicer up to the PowerQuery parameter. We do this in the Power BI Model view.

Selecting the postbox reference field on the right shows the properties panel. Expand the Advanced section and, under "Bind to parameter", select our PowerQuery postbox_ref parameter. Then also enable Multi-select.


This binds the field to the PowerQuery parameter. When we select an item in the slicer now, that value is passed into the SQL query.

We also need to do something similar for the distance selection. For this I created a numeric range parameter:


This provides us with a table of numbers and a slicer, but we need to bind it to the PowerQuery parameter before we can use it. So back in the Model view, bind it to the range parameter as we did for the postbox_ref parameter, but this time don't enable Multi-select:


And now we're ready to display the results, which of course I used Icon Map Pro for.

The field configuration is straightforward: I add the full_id field into the ID field, and the WKT field into the Image / WKT field:


Then in formatting settings for Icon Map Pro, in the Data Layers I just enabled WKT / GeoJson (from data):


And set up some conditional formatting for the buildings vs the postboxes and search radius:



Here's a video showing it in action:

Guide to creating PMTiles

· 8 min read
James Dales
Co-founder of Tekantis

Vector tiles unlock all sorts of possibilities in Icon Map Pro, as they allow us to place complex lines and polygons on the map without having to load large files into the report. This means we're not as constrained by the amount of memory available to a Power BI custom visual, and can therefore load more data into the visual - we only need to send two columns of data, plus the formatting options.

For example this report contains the details of all 11 million properties in the Netherlands, and we can reference nearly half a million of them at a time on the map, setting the formatting options based on data in the Power BI dataset.


The challenge with vector tiles is that they need to be generated and hosted somewhere - usually requiring a GIS server such as ArcGIS or GeoServer, or a SaaS service such as Mapbox, which may charge by transaction or require setup and hosting. PMTiles, however, is a cloud-native geospatial format that holds the tiles in a single file, which can be hosted in simple cloud storage such as Azure Blob Storage or an AWS S3 bucket. These are generally cheap storage options and have no compute cost associated with them.

So how do we go about creating PMTiles? In the coming months, at Tekantis, we'll launch an Azure and/or Microsoft Fabric service to generate them, but in the meantime there are open-source tools we can use.

This guide goes through the process of setting up and using the tools for the first time.

The source data

We're going to use the UK's local authority boundaries as our source data. These are obtained from the Office for National Statistics' Geoportal. You'll see me use these files a lot in demos, and there are a number of options for each file.


Often I'm loading them as Esri shapefiles into the visual, so I'll choose the "generalised clipped" version, which Icon Map Pro will go on to simplify further. The problem there is that by simplifying the boundaries we lose some of the detail. For example, where I live, the local authority boundary runs down the centre of the road, and depending on which side of the road you live, children are eligible for free school transport - or not - as the different local authorities have different eligibility criteria. Therefore for this example I'm going to download the full clipped version in GeoJSON format - which will already be using longitude and latitude coordinates rather than British National Grid coordinates. This file is over 190mb, far too large to load into the visual without using vector tiles, but it has the most detail - ensuring that those precise boundaries are preserved.


Install Tippecanoe

I've found the best tool for creating vector tiles is Tippecanoe, originally created by Mapbox. More recently, Felt created a fork of Tippecanoe that natively supports PMTiles generation.

For Power BI developers, who will typically be using a PC, Tippecanoe poses an issue: it doesn't run on Windows, only on macOS or Linux.

It is possible to run Linux alongside Windows using the Windows Subsystem for Linux (WSL). Microsoft have a comprehensive guide to setup, but here's the short version:

Run cmd or PowerShell as administrator (press the Windows key, type cmd, then right-click Command Prompt and select 'Run as administrator').

Then within the command shell, type

wsl --install

This will install Linux using the Ubuntu distribution.

Once installed, start Linux and we're ready to install Tippecanoe.

First though, we need to install the build tools and libraries that Tippecanoe needs to compile:

sudo apt-get install gcc g++ make libsqlite3-dev zlib1g-dev



With that done, we can now download and build Tippecanoe:

git clone https://github.com/felt/tippecanoe.git
cd tippecanoe
make -j
make install

The good news is, you don't need to do these steps again!

Generate the PMTiles file

Now we have all the prerequisites installed, we can create our tiles. First, copy the GeoJSON file from the Downloads folder into the Linux file system. In Windows Explorer (Windows key and E), enter \\wsl$.


This will show the Linux file system.


Navigate to the home directory, then the folder for your user name, and then tippecanoe.

Copy your file here.

Back in Linux, we're ready to run Tippecanoe. We're going to create a file called localauthorities.pmtiles with a layer named lad from our source file Local_Authority_Districts_December_2022_UK_BFC_V2.geojson.

Run the following command:

# -zg                    let Tippecanoe pick an appropriate maximum zoom level
# --projection=EPSG:4326 the input coordinates are longitude/latitude (WGS84)
# -o                     output file; the .pmtiles extension selects PMTiles output
# -l lad                 name the layer "lad"
tippecanoe -zg --projection=EPSG:4326 -o localauthorities.pmtiles -l lad Local_Authority_Districts_December_2022_UK_BFC_V2.geojson

Tippecanoe will now process the file and generate the PMTiles output file.


Back in Windows Explorer you should be able to see the newly generated file. You may need to press F5 to refresh:


You'll notice that the generated file is around 15mb, 175mb smaller than the original.

Hosting the file

Now we've generated the PMTiles file we need to host it somewhere. I'm going to use Azure Storage.

You can create a new Storage Account by clicking the new button in the Azure portal:


Provide a name for the storage account; the remaining options can be left at their defaults.

Once created we need to configure a couple of bits.

Firstly, enable anonymous access to blobs. (It is also possible to configure secured access via SAS tokens.)


And then configure CORS:


Create a row with Allowed origins set to * and the Allowed Methods set to "Get" and "Head".
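If you prefer to script it, the same rule expressed in the Blob service properties CORS schema used by the Storage REST API looks roughly like this (the MaxAgeInSeconds value here is an arbitrary choice):

```xml
<Cors>
  <CorsRule>
    <AllowedOrigins>*</AllowedOrigins>
    <AllowedMethods>GET,HEAD</AllowedMethods>
    <AllowedHeaders>*</AllowedHeaders>
    <ExposedHeaders>*</ExposedHeaders>
    <MaxAgeInSeconds>3600</MaxAgeInSeconds>
  </CorsRule>
</Cors>
```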

Save these settings.

Now you can create a container to store the PMTiles file:


Click on the container to enter it.

Click the upload button to upload the PMTiles file. Remember you can type \\wsl$ to navigate to the Linux file system; your file will be in home\username\tippecanoe.

Once uploaded, click on it and copy the URL:


In my case the URL is:

https://iconmapproexamples.blob.core.windows.net/examples/localauthorities.pmtiles

Check the layer has worked

We can use the PMTiles Viewer tool to check that it has worked.

Navigate to https://pmtiles.io/ in your browser


and paste in your URL and hit Load URL.

You should now see your layer displayed:


Enabling "show attributes" lets you see the properties related to each shape as you mouse over it.


Use the PMTiles layer in Power BI

Now in Power BI Desktop, add Icon Map Pro to your report canvas.

I have a dataset with the local authority codes and names in it - these match those from the original GeoJSON file.

I've dragged the code field into both the ID and Feature Reference fields in Icon Map Pro:


This will be used to create the join between the Power BI data and the properties in the vector tile layer.

Now in the settings, firstly enable WebGL Rendering. PMTiles are only available with this enabled.


Enable Vector Tiles in Data Layers:


In Vector Tiles - Setup, enable 'Use PMTiles' and 'Auto Detect Bounds'


Paste in the URL of the PMTiles layer:

https://iconmapproexamples.blob.core.windows.net/examples/localauthorities.pmtiles


Your report should now show the layer. Shapes that have matched the Power BI data will be shown in blue. Those that have not been matched to Power BI data will show in grey:


Icon Map Pro will attempt to join the Power BI data using the data in the Feature Reference field. It tries to match the values here against the values in each of the properties in the vector tile layer's shapes. This can sometimes generate false positives, so you may want to specify a specific property to match against:


I've also chosen here to remove features that haven't been matched, so the grey shapes are not shown.


You can now go ahead and apply conditional formatting as normal to format the shapes.

Additional notes

You may find that small shapes aren't shown correctly at low zoom levels. There is guidance on how to deal with this within the documentation.

I thoroughly recommend reading more on PMTiles concepts and background.

Kudos and full credit to Brandon Liu for developing the PMTiles format.

Geocoding with PowerQuery

· 3 min read
James Dales
Co-founder of Tekantis

Unlike the Azure Maps and ArcGIS Power BI visuals, Icon Map Pro doesn't have the ability to geocode data - to take a postal address or place name and turn it into a longitude and latitude to display on the map. This was an intentional decision for a number of reasons:

  1. Doing this on the fly creates a performance hit before displaying the data, and it has to be done for each user browsing the map.
  2. For security - many of our customers don't want the visual to be able to process data remotely.

A better approach is to geocode the data up front. You can then control where this process happens, and do it once as part of the data ingestion, meaning there is no delay or workload duplication.

Here's a guide to using the Azure Maps geocoding API in PowerQuery to convert addresses to longitude and latitude coordinates.

In this example I have a table of London hospitals and their addresses.


To add the geocoding functionality into PowerQuery I'm going to add a function.

Add a New Source as a Blank Query:


then click the Advanced Editor button on the toolbar.


Replace the default contents with this M code:

let
    // Function to call the Azure Maps Geocoding API
    CallAzureMapsGeocodeApi = (ApiKey as text, Address as text) as record =>
    let
        // Define the API endpoint
        ApiUrl = "https://atlas.microsoft.com/search/address/json?api-version=1.0&subscription-key=" & ApiKey & "&query=" & Uri.EscapeDataString(Address),

        // Make the GET request
        Response = Web.Contents(ApiUrl),

        // Parse the JSON response
        JsonResponse = Json.Document(Response),

        // Extract the first result
        FirstResult = if List.Count(JsonResponse[results]) > 0 then JsonResponse[results]{0} else null,

        // Extract the latitude and longitude
        Coordinates = if FirstResult <> null then FirstResult[position] else null,

        // Return a record with latitude and longitude
        Result = if Coordinates <> null then [Longitude = Coordinates[lon], Latitude = Coordinates[lat]] else [Longitude = null, Latitude = null]
    in
        Result
in
    CallAzureMapsGeocodeApi

Rename your query to Geocoder
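Before invoking it against the whole table, you can sanity-check the function from another blank query - the key and address here are placeholders:

```
let
    // Placeholder key and address - substitute your own
    Test = Geocoder("your-azure-maps-key", "St Thomas' Hospital, Westminster Bridge Road, London")
in
    Test
```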


Then click back on your query containing the address data


On the toolbar, in the "Add Column" section, click the "Invoke Custom Function" button:


Enter Location as the column name. Select Geocoder as the Function query. Then under ApiKey click the button and select 'Text'.


Enter your Azure Maps API Key. This is shown as the "Primary Key" in the "Authentication" section within the "Settings" section of your Azure Maps account.


Then under Address select Column Name and select the address column.


Click OK.

The function will now be called for the rows in your table:


Click the expand button in the new Location column header:


Untick the "Use original column name as prefix" checkbox


And click ok.

You should now have Longitude and Latitude columns in your table.

Highlight the two columns and change the Data Type to Decimal Number.


Click the Close & Apply button to load the data into your Power BI dataset.


You can now configure Icon Map Pro with your geocoded data:


Updated Clustering Capabilities

· One min read
James Dales
Co-founder of Tekantis

Version 1.0.9.2 of Icon Map Pro is rolling out via Microsoft AppSource this week. Whilst we quietly rolled it out in January in preview, this release pushes one of our most requested features, clustering of circles and images, into general availability.

Clustering Styles

Whilst we already supported the clustering of circles, the options were limited. This enhanced version uses our new WebGL rendering capabilities to cluster both circles and images. It also introduces the ability to cluster based on a category field. This category field is then used to set the colour for the donut and pie chart segments, and the bars on the column and bar chart options. You can even apply the colour values to categories for images so that these can be included in the donut segments.

Tooltips on Clusters

We've also introduced tooltips on the clusters so you can see the breakdown. As you'd expect from Icon Map Pro, there are lots of customisation options for both formatting and behaviour.

This functionality requires WebGL Rendering Mode to be enabled.

PMTiles arrive in Icon Map Pro

· 4 min read
James Dales
Co-founder of Tekantis

We’re submitting the next version of Icon Map Pro to Microsoft today for publishing on AppSource in a couple of weeks. I’m quite excited about this release for a number of reasons, but one of the main ones is our initial support for PMTiles.

I’ve recently been working with a customer who needed to create a choropleth (filled) map for Canada. Canada is a challenge as it has the largest coastline of any country, but really compact population centres, meaning that to represent census boundaries, we need to be able to plot very large polygons with a detailed coastline, whilst at the same time, really small polygons that might only cover a few blocks in a city centre.


In terms of doing this in Power BI, we’re unable to use a GeoJSON or shape file as they’re either too large – or once simplified, lose too much definition. As an example, the official GeoJSON file for Canada’s dissemination areas is over 1gb – far too large to import into a Power BI report by some orders of magnitude.

My normal recommendation in these cases is to use Well-Known Text (WKT) shapes loaded into the map via the Power BI data model. But Power BI has a limit on the maximum number of characters in a field that, even with the usual workarounds, is not enough to accommodate some of Canada's largest territories and provinces.

This doesn't mean it's not possible to achieve already with Icon Map Pro though. We can use vector tiles to retrieve only the shapes that need to be displayed on the part of the map that we're looking at, and at a resolution appropriate for the current zoom level. The issue with vector tiles is that they need to be either pre-processed and/or served using a GIS server. This could be an enterprise GIS solution such as Esri's ArcGIS, an online provider such as Mapbox, or an open-source server such as GeoServer. All provide workable solutions, but it's an additional provider, and sometimes a significant cost, to add to the mix.

Enter PMTiles. PMTiles provides a new cloud-native way to serve vector (or raster) tiles. Designed to use a highly efficient compression algorithm, the resulting PMTiles file is a single file that sits, usually in cloud storage. It requires a server that supports HTTP range requests – supported by Azure Blob Storage and Amazon S3 buckets. Rather than the whole file being downloaded each time, HTTP range requests are used to download just the segment of the file for that tile. And the file sizes are much smaller than the original – meaning storage costs are minimal – and there’s no compute costs. No GIS server required. Canada’s 1gb dissemination areas file is less than 100mb as a PMTiles file.

This Power BI report uses four levels of drilldown to present Canada's census data from Territory/Province level, through Census Divisions and Census Subdivisions, to Dissemination Areas. As well as simply displaying the file, in Power BI we use the embedded properties to link it back to the data in the Power BI data model, and use conditional formatting to create our filled choropleth map on the fly. We're able to either drill into each area in turn, or simply view all 75,000 dissemination areas at the same time.

And that's not our only use for PMTiles: we've also added some additional background maps thanks to Protomaps.

To create the PMTiles files used in this report, I used the open-source tool, Tippecanoe, but we’re working on building a Microsoft product range that will incorporate this capability, to make it easy to generate your own PMTiles in future. Watch this space - there's so much more coming!

WebGL Mode – what’s it all about?

· 4 min read
James Dales
Co-founder of Tekantis

In our February release of Icon Map Pro, you’ll notice a new settings option, “Rendering”, so I thought I’d explain in more detail what’s behind this option, and why it’s there.

Let's start by explaining a bit more about the code libraries that Icon Map Pro uses to draw the map. If you go back to when I first created the original Icon Map visual nearly 10 years ago, there were two main open-source libraries for creating web maps: OpenLayers and a more lightweight library, Leaflet.

The original Icon Map was built on top of Leaflet, which provided great support for most of the features required for mapping in Power BI. It also had a wide range of plugins available which added some of the more advanced capabilities.

At Tekantis, when we started to write Icon Map Pro last year, we didn’t want to provide fewer capabilities than were available in the older Icon Map visual, so again we chose Leaflet to ensure we were able to provide the full range of features.

However, Icon Map Pro is capable of drawing more rows of data than the original visual, which was limited to 20,000, then later 30,000, rows. When drawing large numbers of complex objects or images on the map, Leaflet starts to suffer from performance issues. We therefore started to look at how to overcome this.

A lot has happened in the web mapping world in the last 10 years. Volodymyr Agafonkin, who created the original Leaflet library, started to work for Mapbox and created a new map library that was built using WebGL rendering, using the power of the GPU for better performance. Whilst this library was initially open source, at some point Mapbox forked the library to create version 3, after which a Mapbox license key was required to use it.

The version 2 library became the Maplibre project and development of this library has continued separately to the Mapbox library. It’s this Maplibre library that is the foundation for many other mapping libraries such as the Azure Maps JavaScript API.

In Icon Map Pro we've been using a plugin that runs Maplibre on top of Leaflet to provide some features, such as our vector-tiles-based background mapping. We've also provided some experimental layers using WebGL to give better performance when rendering large GeoJSON layers, for example. However, this plugin has limitations: each WebGL layer created a separate instantiation of Maplibre and WebGL, so we started to hit limits on the number of WebGL instances possible, and on memory.

In the February release of Icon Map Pro, we've removed the experimental WebGL options (for new maps) and replaced them with a new WebGL rendering option. This uses a single Maplibre instance to draw all the overlay and data-bound layers, which means we don't hit the same limitations. It has also enabled some additional functionality where items overlap on the map, such as combined tooltips.

To enable this, we’ve made it a single choice between using WebGL via Maplibre vs traditional rendering via Leaflet. In this initial release, not all overlay and data-bound layer types are available, but we’re planning to enable all of them going forward.

Currently we’re still using the Maplibre plugin for Leaflet, so all the current controls are available. The downside of this however, is that additional features such as the ability to rotate and tilt the map are not available, as these aren’t supported by Leaflet.

In a future release, we’re aiming to provide an additional option that will only use Maplibre, and we can therefore add these additional capabilities.

User Driven Format Rules

· 2 min read
James Dales
Co-founder of Tekantis

We had an interesting support request come through yesterday from a customer. They were using Power BI's rule-based conditional formatting to specify line colours based on values associated with those lines:


However, the report viewers - the end users - had requested to be able to not only specify their own colours, but also the value ranges at which they take effect.

I was keen therefore to see if we could build that kind of capability using standard Power BI visuals. To achieve it, I created a set of tables containing colour values for each of the ranges, and a set of corresponding numeric range parameter tables. The result is that users can press a Power BI button to show the controls: three slider slicers allow them to enter the numeric ranges at which the colours take effect, and I used the new card slicer visual to allow them to pick from a set of colours.

A DAX measure is then used to return the appropriate colour based on the slicer selections and the value associated with the line:

Track Colour =
VAR _TrackValue = AVERAGE(Track[Track Value]) * 100
VAR _Colour =
    IF (
        ISBLANK(_TrackValue), "#000000",
        IF (_TrackValue < 'Range 1'[Range 1 Value], MAX('Colours 1'[Selected Colour]),
        IF (_TrackValue < Range2[Range2 Value], MAX('Colours 2'[Selected Colour]),
        IF (_TrackValue < Range3[Range3 Value], MAX('Colours 3'[Selected Colour]),
            MAX('Colours 4'[Selected Colour])
    ))))
RETURN _Colour

Here's the final report. Feel free to download the pbix file and deconstruct it.

Drawing routes along roads using the Azure Maps API

· 6 min read
James Dales
Co-founder of Tekantis

Yesterday I published a blog about drawing drive time isochrones on the map using the Azure Maps API called from PowerQuery, all inside of Power BI. Today I've built a similar report, but this time I'm calling the Azure Maps routing API to draw the routes by road between Heathrow Airport and a number of the UK's other major airports.

alt text

Whilst Icon Map Pro has built-in options for drawing lines (straight, curved or geodesic), as well as linestrings from GeoJSON or well-known text (WKT), it can't currently draw lines following the best path along roads. This isn't something that can be handled in the presentation layer, especially if traffic, low bridges, width restrictions and the like are to be taken into account.

On this basis, the Azure Maps API seems like a great way to handle this. I've started with a simple table of source and destination locations, from which I call the API.


To do this I created a PowerQuery function:

let
    GetRouteDirections = (apiKey as text, sourceLon as number, sourceLat as number, destLon as number, destLat as number) =>
    let
        // Define the API endpoint
        url = "https://atlas.microsoft.com/route/directions?api-version=2024-07-01-preview&subscription-key=" & apiKey,

        // Construct the request body (Json.FromValue already returns binary)
        requestBody = Json.FromValue([
            type = "FeatureCollection",
            features = {
                [
                    type = "Feature",
                    geometry = [
                        coordinates = {sourceLon, sourceLat},
                        type = "Point"
                    ],
                    properties = [
                        pointIndex = 0,
                        pointType = "waypoint"
                    ]
                ],
                [
                    type = "Feature",
                    geometry = [
                        coordinates = {destLon, destLat},
                        type = "Point"
                    ],
                    properties = [
                        pointIndex = 1,
                        pointType = "waypoint"
                    ]
                ]
            },
            optimizeRoute = "fastestWithTraffic",
            routeOutputOptions = {"routePath"},
            maxRouteCount = 1,
            travelMode = "driving"
        ]),

        // Make the API call
        response = Web.Contents(url, [
            Headers = [
                #"Content-Type" = "application/json"
            ],
            Content = requestBody
        ]),

        // Read the JSON response as text
        json = Text.FromBinary(response)
    in
        json
in
    GetRouteDirections
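For readers outside Power Query, the same request body can be sketched in Python; this mirrors the FeatureCollection built above (the coordinates are illustrative, and the actual HTTP call is left commented out so no key is needed):

```python
import json

def route_directions_body(source_lon, source_lat, dest_lon, dest_lat):
    """Build the GeoJSON FeatureCollection body for the route/directions call."""
    def waypoint(lon, lat, index):
        return {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"pointIndex": index, "pointType": "waypoint"},
        }
    return {
        "type": "FeatureCollection",
        "features": [waypoint(source_lon, source_lat, 0),
                     waypoint(dest_lon, dest_lat, 1)],
        "optimizeRoute": "fastestWithTraffic",
        "routeOutputOptions": ["routePath"],
        "maxRouteCount": 1,
        "travelMode": "driving",
    }

# Illustrative coordinates (roughly Heathrow to another airport)
body = route_directions_body(-0.4543, 51.4700, -1.3644, 52.4539)
payload = json.dumps(body)
# requests.post(url, headers={"Content-Type": "application/json"}, data=payload)
```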

This returns a lot of information, so I created a function to extract the lines making up the route, as well as the length and durations.

let
    ExtractMultiLineString = (responseText as text) =>
    let
        // Parse the response text into a JSON object
        jsonResponse = Json.Document(responseText),

        // Access the 'features' array in the JSON response
        features = jsonResponse[features],

        // Filter the features to find the one with geometry.type = "MultiLineString"
        multiLineStringFeature = List.First(List.Select(features, each _[geometry][type] = "MultiLineString"), null),

        // Extract additional properties
        distanceInMeters = if multiLineStringFeature = null then null else multiLineStringFeature[properties][distanceInMeters],
        durationInSeconds = if multiLineStringFeature = null then null else multiLineStringFeature[properties][durationInSeconds],
        durationTrafficInSeconds = if multiLineStringFeature = null then null else multiLineStringFeature[properties][durationTrafficInSeconds],

        // Convert duration to minutes
        durationInMinutes = if durationInSeconds = null then null else Number.Round(durationInSeconds / 60, 2),
        durationTrafficInMinutes = if durationTrafficInSeconds = null then null else Number.Round(durationTrafficInSeconds / 60, 2),

        distanceInKm = if distanceInMeters = null then null else Number.Round(distanceInMeters / 1000, 2),

        // Convert the MultiLineString feature to text
        multiLineStringText = if multiLineStringFeature = null then null else Text.FromBinary(Json.FromValue(multiLineStringFeature)),

        // Construct the output object
        result = [
            multiLineStringFeatureAsText = multiLineStringText,
            distanceInKm = distanceInKm,
            durationInMinutes = durationInMinutes,
            durationTrafficInMinutes = durationTrafficInMinutes
        ]
    in
        result
in
    ExtractMultiLineString
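The same extraction logic, sketched in Python against a made-up response so it can be checked end to end (the field names follow the M function above; the sample response is synthetic, not a real API payload):

```python
import json

def extract_multilinestring(response_text):
    """Pull the MultiLineString feature plus distance/duration summaries."""
    features = json.loads(response_text)["features"]
    feature = next((f for f in features
                    if f["geometry"]["type"] == "MultiLineString"), None)
    if feature is None:
        return {"multiLineStringFeatureAsText": None, "distanceInKm": None,
                "durationInMinutes": None, "durationTrafficInMinutes": None}
    props = feature["properties"]
    return {
        "multiLineStringFeatureAsText": json.dumps(feature),
        "distanceInKm": round(props["distanceInMeters"] / 1000, 2),
        "durationInMinutes": round(props["durationInSeconds"] / 60, 2),
        "durationTrafficInMinutes": round(props["durationTrafficInSeconds"] / 60, 2),
    }

sample = json.dumps({"features": [{
    "geometry": {"type": "MultiLineString", "coordinates": [[[0, 0], [1, 1]]]},
    "properties": {"distanceInMeters": 1500,
                   "durationInSeconds": 90, "durationTrafficInSeconds": 120},
}]})
print(extract_multilinestring(sample)["distanceInKm"])  # → 1.5
```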

This is enough to display the routes in Icon Map Pro, but some of the routes I'd requested are long, far exceeding the maximum length of a field in Power BI. Whilst the report handles this by splitting these long routes into multiple rows, I decided to convert the GeoJSON returned by Azure Maps to Well-Known Text format, which is slightly more compact.

Again, to do this I created a function, which also reduces the decimal precision of the coordinates to further reduce the number of characters used:

let
    GeoJSONToWKT = (geoJsonText as text, precision as number) =>
    let
        // Parse the GeoJSON text to a record
        geoJson = Json.Document(geoJsonText),

        // Extract the geometry and its type
        geometry = geoJson[geometry],
        geometryType = Text.Upper(geometry[type]),
        coordinates = geometry[coordinates],

        // Function to format a single coordinate with the specified precision
        FormatCoordinate = (coord as number) =>
            Number.ToText(Number.Round(coord, precision), "F" & Number.ToText(precision)),

        // Function to format a single point (lon, lat)
        FormatPoint = (point as list) =>
            Text.Combine(List.Transform(point, each FormatCoordinate(_)), " "),

        // Function to format a list of points (e.g., for LineString)
        FormatLineString = (line as list) =>
            "(" & Text.Combine(List.Transform(line, each FormatPoint(_)), ", ") & ")",

        // Function to format a list of LineStrings (e.g., for MultiLineString or Polygon)
        FormatMultiLineString = (lines as list) =>
            "(" & Text.Combine(List.Transform(lines, each FormatLineString(_)), ", ") & ")",

        // Function to format a list of Polygons (e.g., for MultiPolygon)
        FormatMultiPolygon = (polygons as list) =>
            "(" & Text.Combine(List.Transform(polygons, each FormatMultiLineString(_)), ", ") & ")",

        // Match geometry type and convert to WKT
        WKT =
            if geometryType = "POINT" then
                "POINT (" & FormatPoint(coordinates) & ")"
            else if geometryType = "MULTIPOINT" then
                "MULTIPOINT " & FormatLineString(coordinates)
            else if geometryType = "LINESTRING" then
                "LINESTRING " & FormatLineString(coordinates)
            else if geometryType = "MULTILINESTRING" then
                "MULTILINESTRING " & FormatMultiLineString(coordinates)
            else if geometryType = "POLYGON" then
                "POLYGON " & FormatMultiLineString(coordinates)
            else if geometryType = "MULTIPOLYGON" then
                "MULTIPOLYGON " & FormatMultiPolygon(coordinates)
            else
                error "Unsupported geometry type: " & geometryType
    in
        WKT
in
    GeoJSONToWKT
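As a sanity check of the conversion rules, here is the same GeoJSON-to-WKT mapping sketched in Python, covering the same geometry types and the same coordinate rounding:

```python
import json

def geojson_to_wkt(geojson_text, precision=5):
    """Convert a GeoJSON feature's geometry to WKT, rounding coordinates."""
    geometry = json.loads(geojson_text)["geometry"]
    gtype = geometry["type"].upper()
    coords = geometry["coordinates"]

    fmt = lambda n: f"{round(n, precision):.{precision}f}"
    point = lambda p: " ".join(fmt(c) for c in p)
    line = lambda l: "(" + ", ".join(point(p) for p in l) + ")"
    multi = lambda ls: "(" + ", ".join(line(l) for l in ls) + ")"
    multipoly = lambda ps: "(" + ", ".join(multi(p) for p in ps) + ")"

    if gtype == "POINT":
        return f"POINT ({point(coords)})"
    if gtype in ("MULTIPOINT", "LINESTRING"):
        return f"{gtype} {line(coords)}"
    if gtype in ("MULTILINESTRING", "POLYGON"):
        return f"{gtype} {multi(coords)}"
    if gtype == "MULTIPOLYGON":
        return f"MULTIPOLYGON {multipoly(coords)}"
    raise ValueError(f"Unsupported geometry type: {gtype}")

feature = json.dumps({"geometry": {
    "type": "LineString",
    "coordinates": [[-0.45432, 51.47002], [-1.36441, 52.45391]]}})
print(geojson_to_wkt(feature, 3))  # → LINESTRING (-0.454 51.470, -1.364 52.454)
```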

And then finally, I created one further function to generate a random colour to assign to each route:

let
    GenerateDarkHexColorFromSeed = (seed as text) =>
    let
        // Convert the seed text into a list of character codes
        charCodes = List.Transform(Text.ToList(seed), each Character.ToNumber(_)),

        // Sum the character codes to create a simple numeric seed
        numericSeed = List.Sum(charCodes),

        // Generate pseudo-random RGB values using the numeric seed
        red = Number.Mod(numericSeed * 37, 256),
        green = Number.Mod(numericSeed * 59, 256),
        blue = Number.Mod(numericSeed * 73, 256),

        // Adjust RGB values to ensure they are dark enough
        minBrightness = 100, // Minimum brightness for each channel
        darkRed = if red > 255 - minBrightness then 255 - minBrightness else red,
        darkGreen = if green > 255 - minBrightness then 255 - minBrightness else green,
        darkBlue = if blue > 255 - minBrightness then 255 - minBrightness else blue,

        // Convert each component to a two-digit hexadecimal string
        redHex = Text.PadStart(Number.ToText(Number.RoundDown(darkRed), "X"), 2, "0"),
        greenHex = Text.PadStart(Number.ToText(Number.RoundDown(darkGreen), "X"), 2, "0"),
        blueHex = Text.PadStart(Number.ToText(Number.RoundDown(darkBlue), "X"), 2, "0"),

        // Combine the components into a hex color string
        hexColor = "#" & redHex & greenHex & blueHex
    in
        hexColor
in
    GenerateDarkHexColorFromSeed
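Because the colour is derived deterministically from the seed text, a given route always gets the same colour between refreshes. The same scheme sketched in Python for illustration:

```python
def dark_hex_from_seed(seed: str) -> str:
    """Derive a deterministic, darkish hex colour from a text seed."""
    numeric_seed = sum(ord(ch) for ch in seed)
    # Spread the seed across channels with different multipliers
    red, green, blue = (numeric_seed * m % 256 for m in (37, 59, 73))
    # Cap each channel so the colour stays dark enough
    min_brightness = 100
    cap = 255 - min_brightness
    channels = (min(c, cap) for c in (red, green, blue))
    return "#" + "".join(f"{c:02X}" for c in channels)

print(dark_hex_from_seed("ABC"))  # → #9B9B76
```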

I use this to create an SVG image that I show at the end of the lines, and also in the table. The image is created as a DAX measure, so I can assign the colour dynamically:

Destination Image = "data:image/svg+xml,<svg xmlns=""http://www.w3.org/2000/svg"" width=""20"" height=""20"" viewBox=""0 0 20 20""><circle cx=""10"" cy=""10"" r=""10"" fill=""" & SUBSTITUTE( MAX (Routes[Color]), "#", "%23") & """/></svg>"
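The one subtlety here is that `#` must be percent-encoded as `%23` inside a data URI, which is what the SUBSTITUTE call handles. The same string construction sketched in Python:

```python
def destination_image(colour: str) -> str:
    """Build an SVG circle as a data URI, encoding '#' for URI safety."""
    fill = colour.replace("#", "%23")
    return ('data:image/svg+xml,<svg xmlns="http://www.w3.org/2000/svg" '
            'width="20" height="20" viewBox="0 0 20 20">'
            f'<circle cx="10" cy="10" r="10" fill="{fill}"/></svg>')

uri = destination_image("#9B9B76")
```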

Here's the final report. Feel free to download the pbix file and deconstruct it. There's hopefully some useful PowerQuery functions inside that can be put to use in other reports. You will need to use your own Azure Maps API Key - set in a PowerQuery parameter.

Fire Station Drive Times with Azure Maps API

· 4 min read
James Dales
Co-founder of Tekantis

Yesterday Microsoft published a blog post about Isochrones in the Azure Maps Route Range API. Isochrones are a great way to represent the amount of time it takes to travel from a point. We've included an isochrone example in our sample Power BI report file since we launched Icon Map Pro six months or so ago.

With Microsoft now supporting isochrones in GeoJSON format as part of their API, I was keen to see if I could incorporate this into Power BI's data load processes using Power Query, without needing to use a Python notebook.

To test this out I've built a report that shows the extent that all the fire stations in West Sussex can reach within a 10 minute drive time.

alt text

I started with a list of fire stations, their addresses and the corresponding longitudes and latitudes:

alt text

The plan was to use the coordinates as the start points for the drive time call for the Azure Maps API.

I created a Power Query function to call the API, which I then call for each fire station:

let
    CallAzureMapsRangeApi = (ApiKey as text, Longitude as number, Latitude as number, TimeBudget as number, TravelMode as text, Height as number, Length as number, Width as number) as text =>
    let
        // Define the API endpoint
        ApiUrl = "https://atlas.microsoft.com/route/range?api-version=2024-07-01-preview&subscription-key=" & ApiKey,

        // Create the request body
        RequestBody = Text.FromBinary(Json.FromValue([
            type = "Feature",
            geometry = [
                type = "Point",
                coordinates = {Longitude, Latitude}
            ],
            properties = [
                timeBudgetInSec = TimeBudget,
                travelMode = TravelMode,
                vehicleSpec = [
                    height = Height,
                    length = Length,
                    width = Width
                ]
            ]
        ])),

        // Define the headers
        Headers = [
            #"Content-Type" = "application/json"
        ],

        // Make the POST request
        Response = Web.Contents(ApiUrl, [
            Headers = Headers,
            Content = Text.ToBinary(RequestBody)
        ]),

        // Treat the response as plain text
        ResponseText = Text.FromBinary(Response)
    in
        ResponseText
in
    CallAzureMapsRangeApi
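The main thing to get right is the shape of the request body. Here it is sketched in Python for reference (the field names mirror the M function above; the sample coordinates and vehicle dimensions are illustrative):

```python
import json

def range_request_body(longitude, latitude, time_budget_sec,
                       travel_mode="driving",
                       height=None, length=None, width=None):
    """Build the GeoJSON Feature body for the route/range isochrone call."""
    properties = {"timeBudgetInSec": time_budget_sec, "travelMode": travel_mode}
    # Only include the vehicle spec when dimensions are supplied
    vehicle = {k: v for k, v in
               {"height": height, "length": length, "width": width}.items()
               if v is not None}
    if vehicle:
        properties["vehicleSpec"] = vehicle
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [longitude, latitude]},
        "properties": properties,
    }

# Illustrative values: a 10-minute drive-time budget for a fire appliance
body = range_request_body(-0.573, 50.836, 600, "driving", 3.5, 8.0, 2.3)
payload = json.dumps(body)
```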

The function takes your Azure Maps API key, the coordinates, the amount of time, the vehicle type and the dimensions of the vehicle as parameters.

Calling this returns a GeoJSON file for each fire station:

alt text

Whilst Icon Map Pro can display GeoJSON files, it can't display them when the whole file is stored in a field. It can, however, display GeoJSON features stored in a row of your dataset, so I created another Power Query function to extract the isochrone polygon and its centroid from the GeoJSON:

let
    ExtractFeature = (GeoJsonText as text) as table =>
    let
        // Parse the input JSON text
        ParsedJson = Json.Document(GeoJsonText),

        // Extract the "features" array
        Features = ParsedJson[features],

        // Extract the "geometry" object for the "Polygon" type
        PolygonGeometry = List.First(
            List.Select(Features, each _[geometry][type] = "Polygon")
        )[geometry],

        // Extract coordinates for the "Point" type
        PointFeature = List.First(
            List.Select(Features, each _[geometry][type] = "Point")
        ),

        PointCoordinates = PointFeature[geometry][coordinates],
        Longitude = PointCoordinates{0},
        Latitude = PointCoordinates{1},

        // Convert the Polygon object to JSON text
        PolygonText = Text.FromBinary(Json.FromValue([
            type = PolygonGeometry[type],
            coordinates = PolygonGeometry[coordinates]
        ])),

        // Prepare the output table
        Result = Table.FromRecords({
            [
                Longitude = Longitude,
                Latitude = Latitude,
                Polygon = PolygonText
            ]
        })
    in
        Result
in
    ExtractFeature
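The same split, sketched in Python against a synthetic response (the output field names follow the M function above; the sample polygon is made up):

```python
import json

def extract_feature(geojson_text):
    """Separate the isochrone Polygon and its centroid Point from a response."""
    features = json.loads(geojson_text)["features"]
    polygon = next(f["geometry"] for f in features
                   if f["geometry"]["type"] == "Polygon")
    point = next(f["geometry"]["coordinates"] for f in features
                 if f["geometry"]["type"] == "Point")
    return {
        "Longitude": point[0],
        "Latitude": point[1],
        "Polygon": json.dumps({"type": polygon["type"],
                               "coordinates": polygon["coordinates"]}),
    }

sample = json.dumps({"features": [
    {"geometry": {"type": "Polygon",
                  "coordinates": [[[0, 0], [0, 1], [1, 1], [0, 0]]]}},
    {"geometry": {"type": "Point", "coordinates": [-0.573, 50.836]}},
]})
row = extract_feature(sample)
print(row["Longitude"])  # → -0.573
```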

From this we now have just the Polygon feature for each isochrone, rather than the whole file.

alt text

This is enough to display the isochrones on the map, but I also wanted to add a circle showing the centroid, and add a label with the fire station name. To do this I added another set of rows for each fire station with the longitude and latitude, and appended it to the fire stations table. Now I have a set of rows for each isochrone, and a set of rows for each centroid.

alt text

Then in Icon Map Pro, I've enabled the circle layer, and WKT / GeoJSON (from data) layer and dragged in the corresponding fields. I also set up the label to show below the circles.

alt text

Finally I added in a reference layer with the UK's local authorities - but with West Sussex deleted. This meant that I could highlight the West Sussex area on the map.

The last touch was to add tooltips, and a Power BI slicer to be able to zoom in to specific fire stations.

alt text

Here's the final report using Icon Map Pro, and you can download the Power BI file to see how it was constructed. You will need to add your own Azure Maps API key (obtained from the Azure admin portal) as a Power Query parameter.


Source of local authority boundaries: Office for National Statistics licensed under the Open Government Licence v.3.0. Contains OS data © Crown copyright and database right 2024

Bus Routes

· 2 min read
James Dales
Co-founder of Tekantis

During the recent 30 Day Map Challenge I published a report with many of Europe's railway routes, which gained a lot of interest. I thought I'd see whether the same thing was possible for bus routes.

I'd sourced the data for the railway networks from OpenStreetMap, but it seems OpenStreetMap doesn't have sufficient data for bus routes - it's either inconsistent or partially populated.

However, I found that the Department for Transport has a Bus Open Data Site. This seems a great source of data, and I've only had a chance to scratch the surface. I've started with timetable data. There's a lot of it! For starters, buses stop far more often than trains, so there are many more stops, and the coordinates between each stop along the routes are more detailed, since roads have many more turns (and roundabouts!) than railway lines.

The data downloads are large, so I've focussed on just one operator for now, the Go-Ahead Group, and downloaded their timetable data for the North East. This report represents all of their routes. A Power BI slicer enables you to pick a route, or alternatively you can search for a specific stop and all the routes that service that stop will be displayed. The chosen stop is highlighted in red.

alt text

Hovering over a stop with the mouse cursor will display the timetable information for all the buses that service that stop.

alt text

There are complexities, and nuances in the data that I haven't represented in the report, but it was a good proof of concept to see what could be achieved with bus data in a few hours. I see there's an API to retrieve the realtime locations of buses, so I'm looking forward to an excuse to explore that!