
3 posts tagged with "powerquery"


Overture Maps in Power BI using DuckDB & Icon Map Pro

· 15 min read
Brynn Borton
Co-founder of Tekantis

We recently spoke at SQLBits about how geospatial analytics is being transformed by modern cloud-native formats. Our talk, "Geospatial Meets Big Data: Transforming Location Intelligence," provided insights into the emerging technologies reshaping location analytics.

During our presentation, we demonstrated how DuckDB could be embedded within Power BI to unlock advanced geospatial capabilities. This blog expands on our SQLBits talk, offering a more detailed exploration and practical examples.

Introducing DuckDB: A Lightweight Powerhouse

DuckDB is an open-source, lightweight SQL database designed specifically for analytical workloads. Unlike traditional databases, DuckDB can run completely in-memory, requiring no physical database setup. Its simplicity, speed, and minimal resource usage make it ideal for cloud-based geospatial tasks, particularly within tools like Power BI.

This lightweight capability prompted us to explore its potential in enhancing Power Query's geospatial processing capabilities.

Integrating DuckDB with Power Query

Our first step involved setting up the necessary drivers. We downloaded the DuckDB ODBC driver directly from the official DuckDB site and installed it locally. While exploring connectivity options, we discovered MotherDuck's custom DuckDB connector for Power BI. Initially promising, it soon revealed its limitations: it provided basic table access without essential capabilities such as installing geospatial extensions or executing custom SQL. Thus, we decided to handle connectivity manually using Power Query M scripts.

After a few minutes of tinkering, we realised something remarkable: we could spin up a fully fledged SQL database that lives only in RAM, meaning all the power of DuckDB with zero files, zero server process and zero install footprint. We instantiated the in-memory DuckDB database with the following one-line connection string:

ConnStr = "Driver=DuckDB Driver;" &
          "Database=:memory:;" &
          "access_mode=read_write;" &
          "custom_user_agent=PowerBI;"

Direct Access to Overture Data with DuckDB

Our initial practical use case was to dive straight into Overture Maps, a community-driven, open-source basemap project created collaboratively by Amazon, Microsoft, Meta, and TomTom, and hosted under the governance of the Linux Foundation. Overture provides comprehensive global map data, including detailed administrative boundaries, building footprints, transportation networks, points of interest, and more. This data is made available through regular monthly releases, which users can access directly from cloud storage platforms such as AWS S3 and Azure Blob. The diagram below provides an overview of Overture Maps' datasets and structure.

Overture Maps Overview Diagram

Using DuckDB within Power Query, we developed a reusable function that executes DuckDB SQL and loads the necessary geospatial extensions (spatial and httpfs) for accessing data directly from AWS S3 storage. (Note: there is also a copy of Overture Maps in Azure, but we found S3 worked straight away, so we stuck with that.)

Here's our reusable Power Query function:

//////////////////////////////////////////////////////////////////////
// DuckDBQuery – execute arbitrary DuckDB SQL in an in-memory DB
// ---------------------------------------------------------------
// Parameters:
//   sqlText  (text, required)  – your SELECT / DDL / DML script
//   s3Region (text, optional)  – defaults to "us-west-2"
//////////////////////////////////////////////////////////////////////
let
    DuckDBQuery =
        (sqlText as text, optional s3Region as text) as table =>
        let
            // ── 1. choose region (default us-west-2) ─────────────────
            region = if s3Region = null then "us-west-2" else s3Region,

            // ── 2. one ODBC connection string to an in-memory DB ─────
            ConnStr = "Driver=DuckDB Driver;" &
                      "Database=:memory:;" &
                      "access_mode=read_write;" &
                      "custom_user_agent=PowerBI;",

            // ── 3. build a single SQL batch: install, load, set, run ─
            BatchSQL =
                Text.Combine(
                    {
                        "INSTALL spatial;",
                        "INSTALL httpfs;",
                        "LOAD spatial;",
                        "LOAD httpfs;",
                        "SET autoinstall_known_extensions=1;",
                        "SET autoload_known_extensions=1;",
                        "SET s3_region = '" & region & "';",
                        sqlText
                    },
                    "#(lf)"   // line-feed separator
                ),

            // ── 4. execute and return the result table ───────────────
            Result = Odbc.Query(ConnStr, BatchSQL)
        in
            Result
in
    DuckDBQuery

Example: Extracting US County Boundaries

Using our custom function, we started by directly accessing the area boundaries from the Overture Divisions theme. Overture data is stored in cloud storage as GeoParquet files, a geospatially enhanced version of the widely used Apache Parquet format that powers big data frameworks like Apache Spark and Trino. GeoParquet seamlessly blends geospatial processing capabilities with the high-performance analytics advantages of columnar data storage.

DuckDB is specifically optimised for this format and natively interacts with GeoParquet files directly in cloud environments using its built-in read_parquet function, thus enabling efficient queries against massive datasets without additional data transformations or transfers.
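
As a minimal illustration of read_parquet (a sketch, assuming the spatial and httpfs extensions are loaded and s3_region is set, exactly as our function below does), a query like this counts the division rows in place on S3 without downloading anything:

SELECT count(*) AS division_count
FROM read_parquet(
    's3://overturemaps-us-west-2/release/2025-01-22.0/theme=divisions/type=division_area/*',
    hive_partitioning = 1
);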

Specifically, we extracted county boundaries within Pennsylvania, USA. We also leveraged DuckDB's ST_AsText function to convert binary geometries (WKB) into readable WKT format that is fully compatible with Power BI and Icon Map Pro.

let
    Source = #"DuckDB Overture SQL Script"("
        SELECT id,
               division_id,
               names.primary,
               ST_AsText(geometry) AS geometry_wkt
        FROM read_parquet(
            's3://overturemaps-us-west-2/release/2025-01-22.0/theme=divisions/type=division_area/*',
            hive_partitioning = 1
        )
        WHERE subtype = 'county'
          AND country = 'US'
          AND region = 'US-PA';", null)
in
    Source

Here's the data returned:

Overture Divisions Data

The query executed in around 25 seconds, retrieving all 67 counties. This is impressive given the dataset contains over 5.5 million divisions within a 6.4GB GeoParquet file, accessed in-place from AWS cloud storage.

Addressing Column Truncation Issues

Initially, our geometry data was truncated to 200 characters due to metadata handling limitations between Power Query and the DuckDB ODBC driver. When the ODBC driver does not expose column length information, Power BI defaults to a maximum of 200 characters for string columns. To resolve this, we explicitly cast our geometry data to a wider column type.

CAST(ST_AsText(geometry) AS VARCHAR(30000))

This ensured that most geometry polygons were retrieved successfully, but a few were still over the 30,000-character limit.

Simplifying the Polygons

To simplify the polygons, DuckDB has another handy function, ST_SimplifyPreserveTopology.
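
As a rough sketch of what that looks like in DuckDB SQL (using the 0.0005-degree tolerance, roughly 55 m at the equator, that we settle on as the default below), the earlier county query becomes:

SELECT id,
       division_id,
       names.primary,
       -- simplify first, then convert to WKT and cast to stay under the limit
       CAST(ST_AsText(ST_SimplifyPreserveTopology(geometry, 0.0005)) AS VARCHAR(30000)) AS geometry_wkt
FROM read_parquet(
    's3://overturemaps-us-west-2/release/2025-01-22.0/theme=divisions/type=division_area/*',
    hive_partitioning = 1
)
WHERE subtype = 'county'
  AND country = 'US'
  AND region = 'US-PA';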

To improve usability, especially for analysts unfamiliar with SQL, we also extended our PowerQuery function, allowing users to query Overture themes, specify column selections and conditions, and simplify polygons directly via function parameters.

//////////////////////////////////////////////////////////////////////
// OvertureQuery – default boundary simplification                   //
//////////////////////////////////////////////////////////////////////
let
    OvertureQuery =
        (
            theme as text,                               // Overture Maps theme
            ctype as text,                               // Overture Maps dataset type
            optional columnsTxt as nullable text,        // list of columns, defaults to all
            optional includeGeomWkt as nullable logical, // include the geometry as WKT
            optional whereTxt as nullable text,          // WHERE clause to filter the data
            optional s3Region as nullable text,          // optional S3 region, defaults to us-west-2
            optional simplifyTolDeg as nullable number   // degrees tolerance, default 0.0005, 0 = none
        ) as table =>
        let
            // ── defaults ────────────────────────────────────────────────
            region        = if s3Region = null then "us-west-2" else s3Region,
            colsRaw       = if columnsTxt = null or columnsTxt = "" then "*" else columnsTxt,
            addWkt        = if includeGeomWkt = null then true else includeGeomWkt,
            tolDegDefault = 0.0005,   // ≈ 55 m at the equator
            tolDeg        = if simplifyTolDeg = null then tolDegDefault else simplifyTolDeg,

            // ── geometry expression ────────────────────────────────────
            geomExpr = if tolDeg > 0
                       then "ST_SimplifyPreserveTopology(geometry, " &
                            Number.ToText(tolDeg, "0.############") & ")"
                       else "geometry",

            // ── SELECT list ────────────────────────────────────────────
            selectList = if addWkt
                         then Text.Combine(
                                  { colsRaw,
                                    "CAST(ST_AsText(" & geomExpr & ") AS VARCHAR(30000)) AS geometry_wkt" },
                                  ", ")
                         else colsRaw,

            // ── WHERE clause (optional) ────────────────────────────────
            whereClause = if whereTxt = null or Text.Trim(whereTxt) = ""
                          then ""
                          else "WHERE " & whereTxt,

            // ── parquet URL ─────────────────────────────────────────────
            parquetUrl =
                "s3://overturemaps-" & region & "/" &
                "release/2025-01-22.0/" &
                "theme=" & theme & "/type=" & ctype & "/*",

            // ── ODBC connection string (in-memory DuckDB) ──────────────
            ConnStr =
                "Driver=DuckDB Driver;" &
                "Database=:memory:;" &
                "access_mode=read_write;" &
                "custom_user_agent=PowerBI;",

            // ── SQL batch ───────────────────────────────────────────────
            SqlBatch = Text.Combine(
                {
                    "INSTALL spatial;",
                    "INSTALL httpfs;",
                    "LOAD spatial;",
                    "LOAD httpfs;",
                    "SET autoinstall_known_extensions = 1;",
                    "SET autoload_known_extensions = 1;",
                    "SET s3_region = '" & region & "';",
                    "",
                    "SELECT " & selectList,
                    "FROM read_parquet('" & parquetUrl & "', hive_partitioning = 1)",
                    whereClause & ";"
                },
                "#(lf)"
            ),

            // ── run and return ──────────────────────────────────────────
            OutTable = Odbc.Query(ConnStr, SqlBatch)
        in
            OutTable
in
    OvertureQuery

Now we can easily invoke queries without writing any SQL and also simplify the polygons:

Source = #"DuckDB Overture Query with Simplify"(
    "divisions",
    "division_area",
    "id, division_id, names.primary",
    true,
    "subtype = 'county' AND country = 'US' AND region = 'US-PA'",
    null,
    null)

We now have all polygons with a WKT string under the 30K limit, so we can easily visualise them in Icon Map Pro.

Overture Divisions Data

Pushing the Limits: Querying 1.65 Million Buildings in NYC

Encouraged by initial success, we challenged DuckDB further using the significantly larger buildings dataset. Previously, querying these datasets using Apache Spark in Databricks resulted in poor performance. DuckDB, however, excelled even on our local laptop.

Now, we wanted to efficiently query buildings within a specific area, and the most effective way to do this is by using a bounding box. A bounding box defines a rectangular region with minimum and maximum coordinates, providing a quick way to filter geographic data. Ideally, we wanted to avoid manually calculating these coordinates. We noticed during our earlier exploration of the Overture divisions theme that each area already included a bbox column containing these coordinates. This gave us the idea to leverage these existing bounding boxes to easily geocode a city or area name. Once we had the bounding box, we could efficiently retrieve all buildings within that region by simply filtering based on these numeric values, eliminating the need for complex spatial queries.
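
To give a feel for how cheap this filter is, here's the shape of the purely numeric predicate it produces (these particular values are the New York City bounds we use in the spatial join later in this post):

WHERE bbox.xmin > -74.3 AND bbox.xmax < -73.5
  AND bbox.ymin > 40.4 AND bbox.ymax < 41.0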

The following is a Power Query function we created to easily retrieve a bounding box based on a city or area name and its standard 2-character ISO country code:

//////////////////////////////////////////////////////////////////////
// GetCityBBox – returns "lat1,lon1,lat2,lon2" for a settlement      //
//////////////////////////////////////////////////////////////////////
let
    GetCityBBox =
        (
            cityName as text,                   // e.g. "London"
            isoCountry as text,                 // ISO-3166 alpha-2, e.g. "GB"
            optional s3Region as nullable text
        ) as text =>
        let
            // ── query the locality polygon ─────────────────────────────
            rows =
                OvertureQuery(
                    "divisions",
                    "division_area",
                    "bbox.xmin, bbox.ymin, bbox.xmax, bbox.ymax",
                    false,                      // no geometry_wkt
                    "subtype = 'locality' and " &
                    "country = '" & isoCountry & "' and " &
                    "names.primary ILIKE '" & cityName & "%'",
                    null,                       // no extra bounding box
                    s3Region,                   // defaults to us-west-2
                    0                           // no simplification
                ),

            // ── pick the first match (refine filter if ambiguous) ──────
            firstRow =
                if Table.IsEmpty(rows)
                then error "City not found in divisions data"
                else Table.First(rows),

            // ── assemble lat/lon string ────────────────────────────────
            // (dotted column names need M's #"…" quoted-identifier syntax)
            bboxTxt =
                Number.ToText(firstRow[#"bbox.ymin"]) & "," &
                Number.ToText(firstRow[#"bbox.xmin"]) & "," &
                Number.ToText(firstRow[#"bbox.ymax"]) & "," &
                Number.ToText(firstRow[#"bbox.xmax"])
        in
            bboxTxt
in
    GetCityBBox

Filtering with Bounding Boxes

We then modified our primary query function to include bounding box filtering capability:

//////////////////////////////////////////////////////////////////////
// OvertureQuery – with optional bounding box                        //
//////////////////////////////////////////////////////////////////////
let
    OvertureQuery =
        (
            theme as text,
            ctype as text,
            optional columnsTxt as nullable text,
            optional includeGeomWkt as nullable logical,
            optional whereTxt as nullable text,
            optional boundingBoxTxt as nullable text,   // lat1,lon1,lat2,lon2
            optional s3Region as nullable text,
            optional simplifyTolDeg as nullable number  // tolerance in degrees, 0 ⇒ none
        ) as table =>
        let
            // ── defaults ────────────────────────────────────────────────
            region        = if s3Region = null then "us-west-2" else s3Region,
            colsRaw       = if columnsTxt = null or columnsTxt = "" then "*" else columnsTxt,
            addWkt        = if includeGeomWkt = null then true else includeGeomWkt,
            tolDegDefault = 0.0005,   // ≈ 55 m at the equator
            tolDeg        = if simplifyTolDeg = null then tolDegDefault else simplifyTolDeg,

            // ── geometry expression ────────────────────────────────────
            geomExpr = if tolDeg > 0
                       then "ST_SimplifyPreserveTopology(geometry, " &
                            Number.ToText(tolDeg, "0.############") & ")"
                       else "geometry",

            // ── SELECT list ────────────────────────────────────────────
            selectList = if addWkt
                         then Text.Combine(
                                  { colsRaw,
                                    "CAST(ST_AsText(" & geomExpr & ") AS VARCHAR(30000)) AS geometry_wkt" },
                                  ", ")
                         else colsRaw,

            // ── optional bounding-box condition ────────────────────────
            // (the four-value check lives in the result expression so it
            // is evaluated; M never evaluates an unreferenced let binding)
            bboxCondRaw =
                if boundingBoxTxt = null or Text.Trim(boundingBoxTxt) = "" then ""
                else
                    let
                        nums = List.Transform(
                                   Text.Split(boundingBoxTxt, ","),
                                   each Number.From(Text.Trim(_))),
                        lat1 = nums{0},
                        lon1 = nums{1},
                        lat2 = nums{2},
                        lon2 = nums{3},
                        minLat = if lat1 < lat2 then lat1 else lat2,
                        maxLat = if lat1 > lat2 then lat1 else lat2,
                        minLon = if lon1 < lon2 then lon1 else lon2,
                        maxLon = if lon1 > lon2 then lon1 else lon2
                    in
                        if List.Count(nums) <> 4
                        then error "Bounding box must have four numeric values"
                        else
                            "bbox.xmin >= " & Number.ToText(minLon, "0.######") &
                            " AND bbox.xmax <= " & Number.ToText(maxLon, "0.######") &
                            " AND bbox.ymin >= " & Number.ToText(minLat, "0.######") &
                            " AND bbox.ymax <= " & Number.ToText(maxLat, "0.######"),

            // ── combine WHERE pieces ───────────────────────────────────
            whereTxtTrim = if whereTxt = null then "" else Text.Trim(whereTxt),
            wherePieces  = List.Select({ whereTxtTrim, bboxCondRaw }, each _ <> ""),
            whereClause  = if List.Count(wherePieces) = 0
                           then ""
                           else "WHERE " & Text.Combine(wherePieces, " AND "),

            // ── parquet URL ─────────────────────────────────────────────
            parquetUrl =
                "s3://overturemaps-" & region & "/" &
                "release/2025-01-22.0/" &
                "theme=" & theme & "/type=" & ctype & "/*",

            // ── ODBC connection string (in-memory DuckDB) ──────────────
            ConnStr =
                "Driver=DuckDB Driver;" &
                "Database=:memory:;" &
                "access_mode=read_write;" &
                "custom_user_agent=PowerBI;",

            // ── SQL batch ───────────────────────────────────────────────
            SqlBatch = Text.Combine(
                {
                    "INSTALL spatial;",
                    "INSTALL httpfs;",
                    "LOAD spatial;",
                    "LOAD httpfs;",
                    "SET autoinstall_known_extensions = 1;",
                    "SET autoload_known_extensions = 1;",
                    "SET s3_region = '" & region & "';",
                    "",
                    "SELECT " & selectList,
                    "FROM read_parquet('" & parquetUrl & "', hive_partitioning = 1)",
                    whereClause & ";"
                },
                "#(lf)"
            ),

            // ── run and return ──────────────────────────────────────────
            OutTable = Odbc.Query(ConnStr, SqlBatch)
        in
            OutTable
in
    OvertureQuery

Now we were ready to query the building outlines, along with a bunch of additional attributes that might be useful for further analysis. So let’s pull it all together and really put it to the test by attempting to extract all the buildings in New York City. We used our bounding box function to locate the spatial extent of the city, then passed this to our main query function.

let
    NewYorkBBox = GetCityBBox("City Of New York", "US"),
    NewYorkBuildings =
        OvertureQuery(
            "buildings",
            "building",
            "id, names.primary, class, height, subtype, num_floors, is_underground, facade_color, facade_material, roof_material, roof_shape, roof_color, roof_height",
            true,
            null,
            NewYorkBBox,
            null,
            null
        )
in
    NewYorkBuildings

Amazingly, within around 5 minutes, we had extracted all 1.65 million buildings in New York City, complete with building outlines and a wealth of rich attributes. Even more impressive was that Power BI's resource usage remained minimal.

Spatial Joins: Assigning Buildings to Neighbourhoods

That is pretty cool, but even though Icon Map Pro can display up to half a million polygons on a map, 1.65 million buildings is just too many to visualise effectively. To manage this, we needed a way to filter buildings by area. Unfortunately, the Overture data does not provide a direct foreign key between buildings and divisions such as neighbourhoods, so we needed to perform a spatial join. No problem, however, because we can do exactly that using DuckDB's spatial functions.

This required switching back to SQL, as the query is more complex. We first selected the New York neighbourhoods from the division_area layer, then retrieved all the buildings within the city’s bounding box, and finally performed a spatial join using ST_Intersects to associate each building with its corresponding area.

let
    Source = #"DuckDB Overture SQL Script"("
        -- 1. Pull the New York City divisions you care about
        WITH areas AS (
            SELECT
                id            AS area_id,
                names.primary AS area_name,
                subtype,
                geometry      AS geom
            FROM read_parquet(
                's3://overturemaps-us-west-2/release/2025-01-22.0/theme=divisions/type=division_area/*',
                hive_partitioning = 1
            )
            WHERE country = 'US'
              AND region = 'US-NY'
              AND subtype IN ('neighborhood')
        ),

        -- 2. Load buildings inside the city’s overall bounding box first
        bldg AS (
            SELECT
                id            AS building_id,
                names.primary AS building_name,
                class,
                height,
                subtype,
                num_floors,
                is_underground,
                facade_color,
                facade_material,
                roof_material,
                roof_shape,
                roof_color,
                roof_height,
                geometry      AS geom
            FROM read_parquet(
                's3://overturemaps-us-west-2/release/2025-01-22.0/theme=buildings/type=building/*',
                hive_partitioning = 1
            )
            WHERE bbox.xmin > -74.3 AND bbox.xmax < -73.5
              AND bbox.ymin > 40.4 AND bbox.ymax < 41.0
        )

        -- 3. Spatial join: which building sits in which area?
        SELECT
            a.area_name,
            a.subtype,
            b.building_id,
            b.building_name,
            b.class,
            b.height,
            b.subtype,
            b.num_floors,
            b.is_underground,
            b.facade_color,
            b.facade_material,
            b.roof_material,
            b.roof_shape,
            b.roof_color,
            b.roof_height,
            ST_AsText(b.geom) AS geometry_wkt
        FROM bldg b
        JOIN areas a
          ON ST_Intersects(a.geom, b.geom);
    ", null)
in
    Source

Incredibly, we managed to join all 1.65 million rows in memory within Power BI on a standard laptop. Even more impressive, when monitoring system performance, Power BI Desktop was only using about 10% of CPU and under 4GB of memory, barely 1GB more than its normal baseline usage. Definitely some voodoo magic going on somewhere!

Visualising Results in Icon Map Pro

Finally, we visualised the results using Icon Map Pro within Power BI, colouring the buildings by type, and cross-filtering buildings by neighbourhood, type, height, and material properties.

Overture Maps Buildings Icon Map Pro

So, What's the Catch?

Currently, the only limitation is that DuckDB's ODBC driver isn't included by default in the Power BI Service. Hence, as with other third-party ODBC drivers, a Power BI gateway server is required. Thankfully, setting this up is straightforward: simply spin up an Azure VM, install the DuckDB driver, and configure the VM as your gateway. You'll then be able to fully leverage DuckDB's capabilities in your Power BI cloud environment and refresh on a schedule like any other data source.

Explore the Possibilities

We encourage you to explore and innovate with this approach, leveraging Overture Maps or your own spatial datasets. Experiment freely and share your exciting findings with us!

You can download a copy of the PBIX file here, which includes all the Power Query functions and some other goodies, like using DuckDB to generate H3 hexagons - another blog post coming soon on that one!

Interactive Spatial Data Operations in Fabric SQL

· 8 min read
James Dales
Co-founder of Tekantis

With SQL Server now in public preview inside Microsoft Fabric, I was keen to investigate its spatial data functions. Whilst the Eventhouse has provided spatial data processing inside Fabric since the outset with its KQL database, I was excited when the SQL engine arrived, as it's a more familiar environment for those working with spatial data. It also opens the door to real-time processing of that data based on interactions in a Power BI report.

Fabric SQL has strong support for geospatial datatypes and functions, opening up a wide range of operations such as nearest-neighbour searches, distance calculations, and boundary intersections.

In this example I wanted to keep things simple and show properties within a selected distance of selected postboxes. The postboxes would be selected by a Power BI slicer, as would the distance required. On making these selections, SQL Server would then find the properties within those search radii.


First of all I needed some data to use, so I extracted from OpenStreetMap the locations of all the postboxes in Buckinghamshire in the UK using the QuickOSM plugin for QGIS. I also did the same for all the buildings in Buckinghamshire. There's only partial data for building outlines in Buckinghamshire; a better data source for these would probably have been Ordnance Survey data. I saved these two separate data extracts in CSV format with WKT geometry.

So with my data ready, I headed over to Microsoft Fabric, and was quickly and easily able to create a SQL database in just two clicks. Importing the data was also straightforward, using Dataflows Gen2 to upload the CSV files into OneDrive and import them into two tables within the database using the familiar PowerQuery interface. The only downside here is that PowerQuery can't create fields of type geography, so the WKT data is added as a text field. But this is quickly remedied once the data is loaded.
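
As a sketch of that remedy (a T-SQL pass run after the load; the wkt column name here is illustrative, SRID 4326 assumes WGS84 longitude/latitude, and polygons must follow geography ring-orientation rules):

-- Add a real geography column and populate it from the imported WKT text
-- (wkt is a placeholder for whatever the dataflow named the text column)
ALTER TABLE dbo.buildings ADD geom geography;
GO  -- run the UPDATE as a separate batch so it can see the new column

UPDATE dbo.buildings
SET geom = geography::STGeomFromText(wkt, 4326);
-- repeat the same steps for dbo.postboxes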

I also added spatial indexes on the columns containing the spatial data to improve performance.
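
Something like the following (the index name is illustrative, and SQL Server requires a clustered primary key on the table before a spatial index can be created):

-- Spatial index on the geography column to speed up STDistance / STBuffer
CREATE SPATIAL INDEX six_buildings_geom
ON dbo.buildings (geom);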


With my data loaded I was now ready to create a function to perform the spatial searches when triggered from Power BI. I opted to use a table function, as I can then parameterize the search criteria using PowerQuery parameters - I've not been successful in achieving this with stored procedures.

From Power BI, I pass in a list of comma separated postbox references and the distance in meters to search. The first thing we need to do is process that list of postboxes:

WITH RefList AS (
    SELECT TRIM(value) AS ref FROM STRING_SPLIT(@Ref, ',')
)

With that done we can use the STDistance spatial function to calculate the distance between the buildings and the selected postboxes, and return those within the specified distance:

-- Return buildings where the distance from the postboxes is less than @Distance
SELECT
    b.full_id,
    b.geom.STAsText() AS WKT,
    b.building,
    'building' AS itemtype
FROM dbo.buildings AS b
INNER JOIN dbo.postboxes AS p
    ON p.ref IN (SELECT ref FROM RefList)
WHERE b.geom.STDistance(p.geom) <= @Distance

As well as showing the matched properties in the results, I also wanted to show the locations of the postboxes, so we use a UNION ALL to add the postbox locations into the results:


UNION ALL

-- Return points for the postbox locations
SELECT
    full_id,
    p2.geom.STAsText() AS WKT,
    '' AS building,
    'origin' AS itemtype
FROM dbo.postboxes AS p2
WHERE p2.ref IN (SELECT ref FROM RefList)

And then finally I also wanted to return the search radius to show on the map:

UNION ALL

-- Return the perimeter of the search areas as polygons
SELECT
    p3.ref + 'perimeter' AS full_id,
    p3.geom.STBuffer(@Distance).STAsText() AS WKT,
    '' AS building,
    'search_perimeter' AS itemtype
FROM dbo.postboxes AS p3
WHERE p3.ref IN (SELECT ref FROM RefList)

Here's the complete function:


ALTER FUNCTION dbo.fn_GetBuildingsWithinDistance
(
    @Ref nvarchar(4000),
    @Distance FLOAT
)
RETURNS TABLE
AS
RETURN
(
    WITH RefList AS (
        SELECT TRIM(value) AS ref FROM STRING_SPLIT(@Ref, ',')
    )

    -- Return buildings where the distance from the postboxes is less than @Distance
    SELECT
        b.full_id,
        b.geom.STAsText() AS WKT,
        b.building,
        'building' AS itemtype
    FROM dbo.buildings AS b
    INNER JOIN dbo.postboxes AS p
        ON p.ref IN (SELECT ref FROM RefList)
    WHERE b.geom.STDistance(p.geom) <= @Distance

    UNION ALL

    -- Return points for the postbox locations
    SELECT
        full_id,
        p2.geom.STAsText() AS WKT,
        '' AS building,
        'origin' AS itemtype
    FROM dbo.postboxes AS p2
    WHERE p2.ref IN (SELECT ref FROM RefList)

    UNION ALL

    -- Return the perimeter of the search areas as polygons
    SELECT
        p3.ref + 'perimeter' AS full_id,
        p3.geom.STBuffer(@Distance).STAsText() AS WKT,
        '' AS building,
        'search_perimeter' AS itemtype
    FROM dbo.postboxes AS p3
    WHERE p3.ref IN (SELECT ref FROM RefList)
);

This is all the setup required within Fabric SQL, so now over to Power BI Desktop.

I was able to connect to Fabric SQL in the same way as I'd connect to any SQL DB, so that was easy, and I simply imported the postbox table in import mode into my Power BI model. I wasn't worried about any geometry here, just the reference numbers, although in real life I'd have imported additional reference data for improved searching and selection.

Now to bring in the building search results. I next made two parameters in PowerQuery, used to pass the selected postboxes and the search distance.


Now we're ready to create the connection to the table function. As we want the searching to be done interactively, we need to ensure the connection to the table function is made using Direct Query. This means that the SQL query is executed and new results returned every time we interact with the slicers (or other visuals) in the report.

After making the initial connection, I then edited the query in the Advanced Editor. This is because we need to add a little code to handle the multi-selection of postboxes. There's likely a better approach for this, but it seems to work well. First of all I handle a single value vs multiple values, and then create a comma-separated text string with the selected values:


Note we're using the parameters postbox_ref and range that I configured earlier.

Here's the M code:

let
    // Normalize postbox_ref to a list if it's a single string
    normalizedList = if Value.Is(postbox_ref, type list) then postbox_ref else {postbox_ref},

    // Combine into a single comma-separated string
    commaSeparated = Text.Combine(normalizedList, ","),

    Source = Sql.Database("yourdatabaseservergoeshere.database.fabric.microsoft.com", "spatialsql-314d84ad-21da-438c-9caf-76f51b7259d9"),
    dbo_fn_GetBuildingsWithinDistance = Source{[Schema="dbo",Item="fn_GetBuildingsWithinDistance"]}[Data],
    #"Invoked Functiondbo_fn_GetBuildingsWithinDistance1" = dbo_fn_GetBuildingsWithinDistance(commaSeparated, range)
in
    #"Invoked Functiondbo_fn_GetBuildingsWithinDistance1"

That's the data connections sorted. Now over to Power BI's report editor. The queries should load in all the postbox references into a table, and the table function should be called using the default values defined in the PowerQuery parameters.

We now need to set up the interactivity part of the report, so I added a slicer based on the postbox reference.


But this won't currently change how the table function is called - we need to link this up to the PowerQuery parameter. We do this in the Power BI Model view.

Selecting the postbox reference field on the right shows the properties panel. We need to expand the Advanced section and, in the "Bind to parameter" section, select our PowerQuery postbox_ref parameter, and then also enable Multi-select.


This then binds this field to the PowerQuery parameter. When we select an item in the slicer now, that value is passed into the SQL query.

We also need to do something similar for the distance selection. For this I created a numeric range parameter:


This then provides us with a table of numbers and a slicer, but we need to bind it to the PowerQuery parameter before we can use it. So back in the Model view, bind it to the range parameter as we did for the postbox_ref parameter, but this time don't enable Multi-select:


And now we're ready to display the results, which of course I used Icon Map Pro for.

The field configuration is straightforward: I add the full_id field into the ID field, and the geom field into the Image / WKT field:


Then, in the formatting settings for Icon Map Pro, in the Data Layers section I just enabled WKT / GeoJson (from data):


And set up some conditional formatting for the buildings vs the postboxes and search radius:


Here's a video showing it in action:

Geocoding with PowerQuery

· 3 min read
James Dales
Co-founder of Tekantis

Unlike the Azure Maps and ArcGIS Power BI visuals, Icon Map Pro doesn't have the ability to geocode data - to take a postal address or place name and turn it into a longitude and latitude to display on the map. This was an intentional decision for a number of reasons:

  1. Doing this on the fly creates a performance hit before displaying the data, and it has to be done for each user browsing the map.
  2. For security - many of our customers don't want the visual to be able to process data remotely.

A better approach is to geocode the data up front. You can then control where this process happens, and do it once as part of the data ingestion, meaning there is no delay or workload duplication.

Here's a guide to using the Azure Maps geocoding API in PowerQuery to convert addresses to longitude and latitude coordinates.

In this example I have a table of London hospitals and their addresses.


To add the geocoding functionality into PowerQuery, I'm going to add a function.

Add a New Source as a Blank Query:


then click the Advanced Editor button on the toolbar.


Replace the default contents with this M code:

let
    // Function to call the Azure Maps Geocoding API
    CallAzureMapsGeocodeApi = (ApiKey as text, Address as text) as record =>
        let
            // Define the API endpoint
            ApiUrl = "https://atlas.microsoft.com/search/address/json?api-version=1.0&subscription-key=" & ApiKey & "&query=" & Uri.EscapeDataString(Address),

            // Make the GET request
            Response = Web.Contents(ApiUrl),

            // Parse the JSON response
            JsonResponse = Json.Document(Response),

            // Extract the first result
            FirstResult = if List.Count(JsonResponse[results]) > 0 then JsonResponse[results]{0} else null,

            // Extract the latitude and longitude
            Coordinates = if FirstResult <> null then FirstResult[position] else null,

            // Return a record with latitude and longitude
            Result = if Coordinates <> null then [Longitude = Coordinates[lon], Latitude = Coordinates[lat]] else [Longitude = null, Latitude = null]
        in
            Result
in
    CallAzureMapsGeocodeApi

Rename your query to Geocoder


Then click back on your query containing the address data


On the toolbar, in the "Add Column" section, click the "Invoke Custom Function" button:


Enter Location as the column name. Select Geocoder as the Function query. Then under ApiKey click the button and select 'Text'.


Enter your Azure Maps API Key. This is shown as the "Primary Key" in the "Authentication" section within the "Settings" section of your Azure Maps account.


Then under Address select Column Name and select the address column.


Click OK.

The function will now be called for the rows in your table:


Click the expand button in the new Location column header:


Untick the "Use original column name as prefix" checkbox


And click OK.

You should now have Longitude and Latitude columns in your table.

Highlight the two columns and change the Data Type to Decimal Number.


Click the Close & Apply button to load the data into your Power BI dataset.


You can now configure Icon Map Pro with your geocoded data:
