4.1. Create a Peripleo Web Map from Disambiguated Data#
This notebook demonstrates how to transform disambiguated data into a publicly accessible Peripleo web map.
It illustrates the potential of data produced by disambiguation processes and provides a step-by-step pipeline for generating an interactive map interface.
Tip: To avoid conflicts, make a copy of this Colab notebook at the outset and work within that copy:
File > Save a copy in Drive
4.1.1. Input#
The notebook expects a spreadsheet of disambiguated data.
Each row should include:
Place information
Metadata for:
Object ID
Object Title
Associated Wikidata ID
4.1.2. Output#
The notebook produces a URL to a publicly available Peripleo web map, displaying the disambiguated places and associated objects.
4.1.3. Pipeline Overview#
The notebook performs the following steps:
Query Wikidata to obtain coordinate data for each object (row).
Generate GeoJSON in the Linked Places format required by Peripleo.
Publish the data to GitHub Pages within a cloned Peripleo repository.
The resulting web map allows you to explore the disambiguated dataset in an interactive, geographic context.
4.1.4. 1. Create GitHub Secret for Colab#
This step allows the notebook to access your GitHub account via the GitHub API. The publication step below will not work without it.
⚠️ Warning: Keep your token private and do not share it outside this notebook.
4.1.4.1. Prerequisites#
You must have a GitHub account.
4.1.4.2. Steps to Generate a Personal Access Token (Classic)#
In GitHub, go to Settings > Developer settings > Personal access tokens
On the left, select Tokens (classic)
Click Generate new token (classic)
Add a descriptive Note for your reference
Under Select scopes, tick the repo checkbox
Scroll down and click Generate token
Copy the token value displayed — this is your secret
4.1.4.3. Steps to Add the Token in Colab#
In Colab, click the key icon on the left sidebar
Click Add new secret
Set:
Name: GITHUB_TOKEN
Value: paste the token you copied
Toggle Notebook access to allow this notebook to read the secret
#@title 2. Install Dependencies
!pip install folium SPARQLWrapper geojson PyGithub geopandas fiona
#@title 3. Query Wikidata to Retrieve Coordinates
#@markdown Your spreadsheet should include (at least) the following columns:
#@markdown - `entity_label` – the name of the place or object
#@markdown - `entity_text` – additional descriptive text
#@markdown - `wikidata_uri` – the corresponding Wikidata ID
#@markdown - `Unique ID` – the object identifier used to build links in later steps
#@markdown Enter the **URL or file path** to your spreadsheet in the box below:
import pandas as pd
import time  # used to pause between Wikidata queries
URL = "http://145.38.185.232/enriching/disambiguation_annotation.csv" #@param {type:'string'}
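The spreadsheet loaded from this URL should provide the columns listed above; a minimal stand-in can be constructed directly when testing the pipeline without a remote file (the rows and values below are invented examples, not part of the workshop dataset):

```python
import pandas as pd

# A two-row stand-in for the disambiguated spreadsheet; all values are invented.
sample = pd.DataFrame([
    {"entity_label": "LOC", "entity_text": "the city of Leiden",
     "wikidata_uri": "http://www.wikidata.org/entity/Q43631", "Unique ID": "OBJ-0001"},
    {"entity_label": "PER", "entity_text": "Hans Sloane",
     "wikidata_uri": None, "Unique ID": "OBJ-0002"},
])

# Only LOC rows with a Wikidata URI are enriched by the steps below.
loc_rows = sample[(sample["entity_label"] == "LOC") & sample["wikidata_uri"].notna()]
print(len(loc_rows))
```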
#@title Function to fetch from Wikidata
from SPARQLWrapper import SPARQLWrapper, JSON
def query_wikidata(uri, endpoint="https://query.wikidata.org/sparql", cache=dict()):
    # The shared default dict acts as a cache that persists across calls.
    if uri in cache:
        return cache[uri]
    q = """
    SELECT DISTINCT ?uri ?uriLabel ?uriDescription ?latitude ?longitude WHERE {
      ?uri wdt:P31|wdt:P279 [] .
      OPTIONAL {
        ?uri p:P625 ?coordinate.
        ?coordinate ps:P625 ?coord.
        ?coordinate psv:P625 ?coordinate_node.
        ?coordinate_node wikibase:geoLongitude ?longitude.
        ?coordinate_node wikibase:geoLatitude ?latitude.
      }
      VALUES ?uri { <URIHIER> }
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en,nl,de,fr,it,es". }
    }
    """.replace("URIHIER", uri)
    sparql = SPARQLWrapper(endpoint)
    sparql.setQuery(q)
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()
    bindings = results["results"]["bindings"]
    if len(bindings) == 0:
        return "", "", "", ""
    label = bindings[0]["uriLabel"]["value"]
    description = bindings[0].get("uriDescription", {}).get("value")
    latitude = bindings[0].get("latitude", {}).get("value")
    longitude = bindings[0].get("longitude", {}).get("value")
    cache[uri] = label, description, latitude, longitude
    return label, description, latitude, longitude
#Test Wikidata Query
# cache = dict()
# uri = "http://www.wikidata.org/entity/Q43631"
# label, description, latitude, longitude = query_wikidata(uri, cache=cache)
# print(f"Label: {label}")
# print(f"Description: {description}")
# print(f"Latitude: {latitude}")
# print(f"Longitude: {longitude}")
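The `cache=dict()` default argument in `query_wikidata` is a deliberate memoisation trick: the same dictionary object persists across calls, so repeated URIs skip the network round trip. A minimal sketch of the pattern with a stand-in fetch function (`fake_fetch` and `fetch_count` are illustrative names, not part of the notebook):

```python
fetch_count = 0

def fake_fetch(uri, cache=dict()):
    """Return a cached value if present; otherwise 'fetch' it and remember it."""
    global fetch_count
    if uri in cache:
        return cache[uri]
    fetch_count += 1  # stands in for an expensive SPARQL request
    cache[uri] = f"result-for-{uri}"
    return cache[uri]

fake_fetch("Q1")
fake_fetch("Q1")  # the second call is served from the cache
print(fetch_count)
```

Note the usual caveat: a mutable default is shared by every caller, which is exactly what makes it work as a cache here, but it also means the cache is never cleared unless one is passed in explicitly (as `enrich_df` does below).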
def enrich_df(df: pd.DataFrame) -> pd.DataFrame:
    cache = dict()
    for i, row in df.iterrows():
        # Only enrich place entities that have been linked to Wikidata.
        if row.entity_label != "LOC" or pd.isna(row.wikidata_uri):
            continue
        # Normalise 'page' URLs to canonical entity URIs.
        if '/wiki/' in row.wikidata_uri:
            uri = row.wikidata_uri.replace('https://www.wikidata.org/wiki/', 'http://www.wikidata.org/entity/')
            df.loc[i, "wikidata_uri"] = uri
        else:
            uri = row.wikidata_uri
        label, description, latitude, longitude = query_wikidata(uri, cache=cache)
        df.loc[i, "wikidata_label"] = label
        df.loc[i, "wikidata_description"] = description
        df.loc[i, "wikidata_latitude"] = latitude
        df.loc[i, "wikidata_longitude"] = longitude
        time.sleep(0.5)  # small delay to be polite to the Wikidata endpoint
    return df
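The URI normalisation step inside `enrich_df` can be exercised on its own: it rewrites Wikidata 'page' URLs (containing `/wiki/`) to the canonical entity form that the SPARQL `VALUES` clause expects (`normalise_wikidata_uri` is a helper name introduced here for illustration):

```python
def normalise_wikidata_uri(uri: str) -> str:
    """Rewrite a Wikidata page URL to its canonical entity URI, as enrich_df does."""
    if '/wiki/' in uri:
        return uri.replace('https://www.wikidata.org/wiki/', 'http://www.wikidata.org/entity/')
    return uri

print(normalise_wikidata_uri("https://www.wikidata.org/wiki/Q43631"))
# → http://www.wikidata.org/entity/Q43631
```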
df = pd.read_csv(URL)
df.head()
df = enrich_df(df)
df[df['entity_label'] == 'LOC'].head()
#@title 4. Generate GeoJSON in the Linked Places Format (LPF)
#@markdown This step converts your spreadsheet data into **GeoJSON** following the **[Linked Places Format (LPF)](https://github.com/LinkedPasts/linked-places-format)**, which is the standard format used by the Peripleo web map for visualising place-based data.
from geojson import Feature, Point
import json
import pandas as pd
import geopandas as gpd
from shapely.geometry import Point as ShapelyPoint
def get_robust_bounds(geometries):
    """
    Computes a bounding box for a set of Shapely geometries,
    handling cases where geometries cross the antimeridian.
    Returns a bbox in the format [min_lon, min_lat, max_lon, max_lat].
    """
    if not geometries:
        return None
    # Get the simple min/max bounds first
    all_lon = [g.bounds[0] for g in geometries] + [g.bounds[2] for g in geometries]
    all_lat = [g.bounds[1] for g in geometries] + [g.bounds[3] for g in geometries]
    min_lon = min(all_lon)
    min_lat = min(all_lat)
    max_lon = max(all_lon)
    max_lat = max(all_lat)
    # If the longitude span exceeds 180°, the points must cross the antimeridian,
    # so fall back to the full longitude range.
    if (max_lon - min_lon) > 180:
        min_lon = -180
        max_lon = 180
    return [min_lon, min_lat, max_lon, max_lat]
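The antimeridian rule can be checked with plain coordinate pairs, without Shapely; `bbox_from_points` below is a simplified stand-in for `get_robust_bounds` that applies the same heuristic:

```python
def bbox_from_points(points):
    """points: list of (lon, lat) tuples. Returns [min_lon, min_lat, max_lon, max_lat]."""
    lons = [p[0] for p in points]
    lats = [p[1] for p in points]
    min_lon, max_lon = min(lons), max(lons)
    min_lat, max_lat = min(lats), max(lats)
    # A longitude span over 180° implies the points straddle the antimeridian,
    # so fall back to the full longitude range (same heuristic as above).
    if (max_lon - min_lon) > 180:
        min_lon, max_lon = -180, 180
    return [min_lon, min_lat, max_lon, max_lat]

print(bbox_from_points([(170, 10), (-170, 20)]))  # crosses the antimeridian
# → [-180, 10, 180, 20]
```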
def df2lp(df: pd.DataFrame) -> dict:
    features = []
    geometries = []
    # Sort by URI so rows referring to the same place appear together
    df_sorted = df.sort_values(by='wikidata_uri')
    for _, row in df_sorted.iterrows():
        if row.entity_label != "LOC" or pd.isna(row.wikidata_uri):
            continue
        # Skip rows without coordinates (missing values may be NaN or empty strings)
        if (pd.isna(row.wikidata_latitude) or pd.isna(row.wikidata_longitude)
                or not row.wikidata_latitude or not row.wikidata_longitude):
            continue
        # Construct the GeoJSON Feature
        longitude = float(row.wikidata_longitude)
        latitude = float(row.wikidata_latitude)
        point = Point([longitude, latitude])
        uri = f'https://knowledgebase.sloanelab.org/resource/?uri=http%3A%2F%2Fsloanelab.org%2FE73%2Fbm_dataset%2F{row["Unique ID"]}'
        feature = Feature(
            geometry=point,
            properties={
                "title": row['Unique ID'],
                "description": "Description of the object",
                "url": f"https://www.britishmuseum.org/collection/object/{row['Unique ID']}"
            },
            # Simplified links for clarity
            links=[{
                "type": "seeAlso",
                "label": row['Unique ID'],
                "url": uri
            }]
        )
        feature["@id"] = uri
        features.append(feature)
        # Collect Shapely geometries for the bounding box calculation
        geometries.append(ShapelyPoint(longitude, latitude))
    # Compute the bounding box
    bbox = get_robust_bounds(geometries)
    data = {
        "type": "FeatureCollection",
        "@context": "https://raw.githubusercontent.com/LinkedPasts/linked-places/master/linkedplaces-context-v1.1.jsonld",
        "features": features,
        "bbox": bbox
    }
    return data
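What `df2lp` emits per row can be sketched with plain dictionaries. The key detail is that GeoJSON stores coordinates as `[longitude, latitude]`, the reverse of the usual lat/lon convention. The sketch below simplifies the notebook's feature (it reuses the British Museum URL for both `@id` and `url`, whereas `df2lp` uses a Sloane Lab knowledge-base URI for `@id`; the object ID and coordinates are invented):

```python
def make_lpf_feature(object_id: str, longitude: float, latitude: float) -> dict:
    """Build a minimal Linked Places-style feature, simplified from df2lp above."""
    url = f"https://www.britishmuseum.org/collection/object/{object_id}"
    return {
        "type": "Feature",
        "@id": url,
        "geometry": {"type": "Point", "coordinates": [longitude, latitude]},  # [lon, lat]!
        "properties": {"title": object_id, "url": url},
    }

feature = make_lpf_feature("OBJ-0001", 4.49, 52.16)
print(feature["geometry"]["coordinates"])
# → [4.49, 52.16]
```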
# print(json.dumps(df2lp(df), indent=2))
#@title 5. Preliminary Visualisation of Linked Places Format Data
#@markdown This cell provides a **lightweight, interactive map** to preview your data in the **Linked Places Format (LPF)**, without the need to fully build the Peripleo web application.
#@markdown **What it does:**
#@markdown - Converts your DataFrame into a **GeoJSON object** using the `df2lp()` function.
#@markdown - Determines a bounding box (if available) to **fit the map to your data**.
#@markdown - Adds **interactive markers** for each place, including:
#@markdown - Title
#@markdown - Description (if available)
#@markdown - Link to the associated website or resource
#@markdown - Uses **Folium**, a Python mapping library, to render an **interactive map** directly in the notebook.
#@markdown This is ideal for quickly inspecting your dataset visually before publishing it in Peripleo.
# Import the necessary libraries
import folium
import json
# Assuming df2lp(df) is defined and returns a GeoJSON object
# with a bbox property.
geojson_data = df2lp(df)
# Check if a bbox exists in the GeoJSON data
if 'bbox' in geojson_data and geojson_data['bbox']:
    bbox = geojson_data['bbox']
    # Folium's fit_bounds() expects [[min_lat, min_lon], [max_lat, max_lon]]
    folium_bounds = [[bbox[1], bbox[0]], [bbox[3], bbox[2]]]
    # 1. Initialise the map and fit it to the bounding box
    folium_map = folium.Map()
    folium_map.fit_bounds(folium_bounds)
else:
    # Fall back to a default location if no bbox is available
    folium_map = folium.Map(location=[51.509, -0.12], zoom_start=12)
# 2. Loop through each GeoJSON feature to add a marker
for feature in geojson_data['features']:
    # Extract properties and coordinates
    properties = feature['properties']
    coordinates = feature['geometry']['coordinates']
    # GeoJSON coordinates are [longitude, latitude]
    lat = coordinates[1]
    lon = coordinates[0]
    # Use an f-string to embed the title and URL into HTML
    popup_html = f"""
    <h3>{properties['title']}</h3>
    <p>{properties.get('description', 'No description available.')}</p>
    <a href="{properties['url']}" target="_blank">Visit website</a>
    """
    # Create the Folium popup object with the HTML
    popup = folium.Popup(popup_html, max_width=300)
    # Add a marker to the map
    folium.Marker(
        location=[lat, lon],
        popup=popup,
        tooltip=properties['title'],
        icon=folium.Icon(color='blue', icon='info-sign')
    ).add_to(folium_map)
# 3. Display the map
folium_map
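The popup construction in the loop above is plain HTML templating, so it can be checked without rendering a map (`build_popup_html` is a helper introduced here for illustration; the sample properties are invented):

```python
def build_popup_html(properties: dict) -> str:
    """Assemble the popup HTML the same way as the preview loop above."""
    return f"""
    <h3>{properties['title']}</h3>
    <p>{properties.get('description', 'No description available.')}</p>
    <a href="{properties['url']}" target="_blank">Visit website</a>
    """

html = build_popup_html({"title": "Leiden", "url": "https://example.org/leiden"})
print("<h3>Leiden</h3>" in html)
```

Because `.get('description', ...)` supplies a fallback, features without a description still render a sensible popup.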
#@title 6. Make the Peripleo App Available on GitHub Pages
#@markdown Please enter your GitHub username and a name for the new repository:
import os
import requests
import time
from github import Github, GithubException, Auth
import json
from google.colab import userdata
# --- 1. SET YOUR GITHUB REPOSITORY ---
GITHUB_USERNAME = "your-github-username-here" #@param {type:"string"}
REPO_NAME = "test-peripleo-app" #@param {type:"string"}
SOURCE_REPO_OWNER = "britishlibrary"
SOURCE_REPO_NAME = "peripleo"
GEOJSON_FILE_PATH = "docs/data/historical_data.geojson"
CONFIG_FILE_PATH = "docs/peripleo.config.json"
# --- 2. ACCESS YOUR GITHUB TOKEN FROM COLAB SECRETS ---
try:
    github_token = userdata.get('GITHUB_TOKEN')
except userdata.SecretNotFoundError:
    raise ValueError("GITHUB_TOKEN not found. Please add it as a Colab secret and enable notebook access.")
# --- 3. PREPARE THE API REQUESTS ---
# Headers for the template import API call (requires a special media type)
template_headers = {
    "Authorization": f"token {github_token}",
    "Accept": "application/vnd.github.baptiste-preview+json"
}
# Standard headers for all other authenticated API calls
standard_headers = {
    "Authorization": f"token {github_token}",
    "Accept": "application/vnd.github.v3+json"
}
repo_data = {
    "owner": GITHUB_USERNAME,
    "name": REPO_NAME,
    "description": "Peripleo map generated from Colab",
    "private": False
}
template_url = f"https://api.github.com/repos/{SOURCE_REPO_OWNER}/{SOURCE_REPO_NAME}/generate"
# --- 4. AUTHENTICATE WITH PYGITHUB ---
auth = Auth.Token(github_token)
g = Github(auth=auth)
user = g.get_user()
print(f"Authenticated as: {user.login}")
# --- 5. AUTOMATED WORKFLOW ---
print(f"--- Step 1: Importing '{SOURCE_REPO_NAME}' as a new public repository named '{REPO_NAME}' ---")
try:
    new_repo = user.get_repo(REPO_NAME)
    print(f"Repository '{REPO_NAME}' already exists under your account. Skipping import.")
except GithubException as e:
    if e.status == 404:
        print(f"Repository '{REPO_NAME}' not found. Importing from template now...")
        response = requests.post(template_url, headers=template_headers, json=repo_data)
        response.raise_for_status()
        repo_found = False
        while not repo_found:
            try:
                new_repo = user.get_repo(REPO_NAME)
                repo_found = True
            except GithubException:
                print("Waiting for repository creation...", end="\r")
                time.sleep(5)
        print(f"\nNew repository '{REPO_NAME}' created successfully at {new_repo.html_url}")
        # Programmatic check that the template content has finished importing
        print("\nWaiting for template files to be imported...")
        content_imported = False
        while not content_imported:
            try:
                # Check for the main index file as a proxy for completion
                new_repo.get_contents("docs/index.html")
                content_imported = True
            except GithubException:
                print("Waiting for template content...", end="\r")
                time.sleep(5)
        print("\nTemplate files have been successfully imported.")
    else:
        raise e
# --- 6. PREPARE YOUR GEOJSON DATA ---
# This is done in a previous cell
# --- 7. ADD YOUR DATA TO THE REPOSITORY ---
print("\n--- Step 2: Adding your data to the repository ---")
# Create the GeoJSON file content
geojson_content = json.dumps(geojson_data, indent=2)
geojson_commit_message = f"Automated update of {GEOJSON_FILE_PATH}"
try:
    contents = new_repo.get_contents(GEOJSON_FILE_PATH)
    new_repo.update_file(
        path=GEOJSON_FILE_PATH,
        message=geojson_commit_message,
        content=geojson_content,
        sha=contents.sha
    )
    print(f"Successfully updated {GEOJSON_FILE_PATH}.")
except GithubException as e:
    if e.status == 404:
        new_repo.create_file(
            path=GEOJSON_FILE_PATH,
            message=geojson_commit_message,
            content=geojson_content
        )
        print(f"Successfully created {GEOJSON_FILE_PATH}.")
    else:
        raise e
# Create or update the Peripleo config.json file
config_data = {
    "initial_bounds": geojson_data['bbox'],
    "map_style": "./map-style-OSM.json",
    "data": [
        {
            "name": "Disambiguation Test",
            "format": "LINKED_PLACES",
            "src": "./data/historical_data.geojson",
            "attribution": "Leiden Workshop"
        }
    ],
    "facets": [
        "type"
    ],
    "link_icons": [
        {"pattern": "maps.google.com", "img": "./logos/maps.google.com.png", "label": "Google Maps"},
        {"pattern": "www.geograph.org.uk", "img": "./logos/geograph.org.png", "label": "Geograph"},
        {"pattern": "en.wikipedia.org", "img": "./logos/en.wikipedia.org.png", "label": "Wikipedia"},
        {"pattern": "www.wikidata.org", "img": None, "label": "Wikidata"},
        {"pattern": "www.geonames.org", "img": None, "label": "GeoNames"},
        {"pattern": "sws.geonames.org", "img": None, "label": "GeoNames"}
    ]
}
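Because `peripleo.config.json` is generated from a Python dict, it is easy to sanity-check before committing. A sketch that validates the JSON round-trip and the shape of `initial_bounds` (the bbox values and the trimmed-down dict are invented stand-ins for the real `config_data`):

```python
import json

# A trimmed stand-in for config_data with an invented bounding box.
config_sketch = {
    "initial_bounds": [-0.2, 51.4, 0.1, 51.6],  # [min_lon, min_lat, max_lon, max_lat]
    "data": [{"name": "Disambiguation Test", "format": "LINKED_PLACES",
              "src": "./data/historical_data.geojson"}],
}

# Round-trip through JSON, exactly as the commit step serialises it.
restored = json.loads(json.dumps(config_sketch, indent=2))
bounds = restored["initial_bounds"]
print(len(bounds), bounds[0] <= bounds[2], bounds[1] <= bounds[3])
```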
# Convert the Python dictionary to a JSON string with indentation for readability
config_content = json.dumps(config_data, indent=2)
config_commit_message = f"Automated update of {CONFIG_FILE_PATH}"
try:
    contents = new_repo.get_contents(CONFIG_FILE_PATH)
    new_repo.update_file(
        path=CONFIG_FILE_PATH,
        message=config_commit_message,
        content=config_content,
        sha=contents.sha
    )
    print(f"Successfully updated {CONFIG_FILE_PATH}.")
except GithubException as e:
    if e.status == 404:
        new_repo.create_file(
            path=CONFIG_FILE_PATH,
            message=config_commit_message,
            content=config_content
        )
        print(f"Successfully created {CONFIG_FILE_PATH}.")
    else:
        raise e
# --- 8. ENABLE GITHUB PAGES ---
print("\n--- Step 3: Enabling GitHub Pages ---")
pages_url = f"https://api.github.com/repos/{GITHUB_USERNAME}/{REPO_NAME}/pages"
pages_data = {
    "source": {
        "branch": "main",
        "path": "/docs"
    }
}
try:
    response = requests.post(pages_url, headers=standard_headers, json=pages_data)
    response.raise_for_status()
    # Poll until Pages reports a completed build
    pages_active = False
    print(f"You can check the build status here: https://github.com/{GITHUB_USERNAME}/{REPO_NAME}/actions")
    while not pages_active:
        try:
            pages_status_url = f"https://api.github.com/repos/{GITHUB_USERNAME}/{REPO_NAME}/pages"
            pages_status = requests.get(pages_status_url, headers=standard_headers).json()
            if pages_status.get('status') == 'built':
                pages_active = True
                print("\nGitHub Pages enabled successfully.")
                break
        except Exception:
            pass  # keep waiting
        print("Waiting for GitHub Pages to be provisioned...", end="\r")
        time.sleep(10)
    print("\nDeployment process complete!")
    print(f"Your repository is at: https://github.com/{GITHUB_USERNAME}/{REPO_NAME}")
    print(f"Your Peripleo map should be available shortly at: {pages_status.get('html_url')}")
except requests.exceptions.RequestException as e:
    print(f"Error enabling GitHub Pages: {e}. You may need to enable it manually.")
#@title 7. Map Data Download (optional)
#@markdown Run the cell and then click the button which appears below to download a GeoJSON representation of the locations mentioned in your original CSV file.
#@markdown GeoJSON is a standard format for representing geographic data with a simple structure of points, lines, and polygons.
import json
import base64
from IPython.display import HTML, display
# Convert the GeoJSON object to a formatted string
geojson_string = json.dumps(geojson_data, indent=2)
# Encode the GeoJSON string to base64 for a data URI
encoded_geojson = base64.b64encode(geojson_string.encode('utf-8')).decode('utf-8')
# HTML for a styled download button
button_html = f"""
<a href="data:application/json;charset=utf-8;base64,{encoded_geojson}"
   download="historical_data.geojson"
   style="
     background-color: #4CAF50;
     color: white;
     padding: 12px 28px;
     text-align: center;
     text-decoration: none;
     display: inline-block;
     font-size: 16px;
     margin: 6px 2px;
     cursor: pointer;
     border-radius: 8px;
     border: none;
     font-weight: bold;
   ">
  📥 Download Historical Data (GeoJSON)
</a>
"""
# Display the button in Colab
display(HTML(button_html))
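The download button works by embedding the whole GeoJSON document in a base64 `data:` URI, so no server is involved. The encoding is lossless and reversible, which can be verified offline (the tiny feature collection is an invented example):

```python
import base64
import json

# A minimal, invented feature collection standing in for geojson_data.
tiny = {"type": "FeatureCollection", "features": []}

# Encode to base64 and wrap in a data: URI, as the button cell does.
encoded = base64.b64encode(json.dumps(tiny).encode("utf-8")).decode("utf-8")
data_uri = f"data:application/json;charset=utf-8;base64,{encoded}"

# A browser decodes the same bytes when the link is clicked.
decoded = json.loads(base64.b64decode(encoded).decode("utf-8"))
print(decoded == tiny)
```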