UK Fuel Finder

The UK Government has managed to get petrol stations to update their fuel prices centrally. I’ve built this Node-RED flow to grab the data, find the cheapest petrol station within x miles of a lat/long coordinate, and output an HA-friendly sensor message (which can drop into a Node-RED HA companion sensor of your choosing). It has attributes giving the cheapest station for each fuel type, and another attribute listing nearby stations and their details (sorted by distance).

You’ll need a UK OneGov login to read all the documentation, and an OAuth token from here: UK Fuel Finder. Create an application to get your client_id and client_secret, then copy them into the CONFIG section of the first function node in the flow. Add your lat/long (you should have them in zone.home in HA) and how many miles out you want the geofence cast when checking for stations.

The way the API works, you download ALL the stations in the UK in batches (this flow only stores those nearby), then you refresh the price updates (every three hours in this flow, but you can do it every few minutes if you want; note that station updates take 30 minutes to reach the database).
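The batch mechanics can be sketched in Python. This is a minimal illustration, assuming (as the flow does) that the API pages results via a batch-number query parameter and that a batch of fewer than 500 records marks the last page:

```python
# Minimal sketch of the batched download loop, assuming (as in the flow)
# that the API pages results by a "batch-number" parameter and that a
# batch of fewer than 500 records is the final one.

def fetch_all_batches(fetch_batch, page_size=500):
    """Collect every record; fetch_batch(n) returns the list for batch n."""
    records = []
    batch = 1
    while True:
        chunk = fetch_batch(batch)
        records.extend(chunk)
        if len(chunk) < page_size:   # short batch => no more pages
            return records
        batch += 1
```

In the flow, the equivalent of fetch_batch is an authenticated GET to /api/v1/pfs?batch-number=N, with a rate-limit delay between calls.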

Once the initial batch download is done, the price changes are very quick. You only need to do batch downloads when new stations are added or dropped, or when they change names or details. Once a week, overnight, should be fine(!)

The flow takes care of the OAuth token management, so you can install and forget. The batch download takes some time (a minute or so); I’ve put triggers in to run it in the small hours of the night. I’ve also added rate limits to make sure it doesn’t hammer the API excessively.
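The token handling boils down to a freshness check like this sketch; the 30-second skew and the cache field names mirror the first function node, but treat them as illustrative:

```python
# Sketch of the token-reuse check from the first function node: a cached
# token is only reused while it is at least TOKEN_SKEW seconds from expiry.
TOKEN_SKEW = 30  # safety margin in seconds

def token_is_valid(cached, now_epoch, skew=TOKEN_SKEW):
    """cached is a dict like {"access_token": ..., "expires_at": epoch_secs}."""
    if not cached or not cached.get("access_token"):
        return False
    return now_epoch < cached.get("expires_at", 0) - skew
```

When this returns False, the flow POSTs to generate_access_token (or regenerate_access_token, if a refresh_token is cached) before making the API call.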

If you want to play with how the data is formatted for HA, it’s all done in the final function node of the flow. I’ve also stuck a signature on the end of the flow so updates only hit HA when the data changes; you can strip that out if you need HA to get an update every time the price trigger fires, even when nothing has changed.
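The signature gate is equivalent to this sketch: serialise a summary of the payload, compare it to the last one sent, and suppress the message when nothing changed (in the flow, the RBE node does the comparison on msg.signature):

```python
import json

# Sketch of the change gate at the end of the flow: only forward a payload
# when its serialised signature differs from the previous one.
class ChangeGate:
    def __init__(self):
        self._last = None

    def emit(self, payload):
        sig = json.dumps(payload, sort_keys=True)
        if sig == self._last:
            return None       # unchanged: HA gets nothing
        self._last = sig
        return payload        # changed: forward to HA
```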

[
    {
        "id": "ff_inject_prices_3h",
        "type": "inject",
        "z": "ff_tab_01",
        "name": "Poll prices (every 3h)",
        "props": [
            {
                "p": "topic",
                "vt": "str"
            }
        ],
        "repeat": "10800",
        "crontab": "",
        "once": true,
        "onceDelay": 10,
        "topic": "prices",
        "x": 480,
        "y": 280,
        "wires": [
            [
                "ff_fn_start"
            ]
        ]
    },
    {
        "id": "ff_inject_stations_daily",
        "type": "inject",
        "z": "ff_tab_01",
        "name": "Poll stations (daily)",
        "props": [
            {
                "p": "topic",
                "vt": "str"
            }
        ],
        "repeat": "",
        "crontab": "10 3 * * *",
        "once": true,
        "onceDelay": 20,
        "topic": "stations",
        "x": 490,
        "y": 320,
        "wires": [
            [
                "ff_fn_start"
            ]
        ]
    },
    {
        "id": "ff_inject_manual",
        "type": "inject",
        "z": "ff_tab_01",
        "name": "Manual (stations+prices)",
        "props": [
            {
                "p": "topic",
                "vt": "str"
            }
        ],
        "repeat": "",
        "crontab": "",
        "once": false,
        "topic": "manual",
        "x": 470,
        "y": 360,
        "wires": [
            [
                "ff_fn_start"
            ]
        ]
    },
    {
        "id": "ff_fn_start",
        "type": "function",
        "z": "ff_tab_01",
        "name": "Refresh Token or Call API",
        "func": "/**\n * V1.0 UK Fuel Finder API\n * Phil\n * CONFIG - EDIT THESE VALUES\n */\nconst CONFIG = {\n    client_id: \"YOUR_CLIENT_ID\",\n    client_secret: \"YOUR_CLIENT_SECRET\",\n    home_lat: YOUR_LAT,\n    home_lon: YOUR_LON,\n    radius_miles: 10\n};\n\nconst OAUTH_KEY = \"ff:oauth\";\nconst TOKEN_SKEW = 30;\n\nfunction now() { return Math.floor(Date.now() / 1000); }\n\nfunction pad(n) { return String(n).padStart(2, \"0\"); }\n\nfunction effectiveStartFromMaxTs(maxTs, minutesBack = 30) {\n    if (!maxTs) return null;\n    const d = new Date(maxTs);\n    if (isNaN(d.getTime())) return null;\n    d.setMinutes(d.getMinutes() - minutesBack);\n    return `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())} ${pad(d.getHours())}:${pad(d.getMinutes())}:${pad(d.getSeconds())}`;\n}\n\n// Store config in flow for other nodes\nflow.set('ffConfig', CONFIG);\n\nconst topic = msg.topic || \"\";\nlet kind = topic === \"manual\" ? \"stations\" : topic;\n\nif (kind !== \"stations\" && kind !== \"prices\") {\n    node.error(\"Invalid topic\");\n    return null;\n}\n\nmsg.kind = kind;\nmsg.batch = 1;\n\n// Check if we need to get/refresh token\nconst cached = flow.get(OAUTH_KEY);\nif (cached?.access_token && cached?.expires_at && now() < (cached.expires_at - TOKEN_SKEW)) {\n    msg.token = cached.access_token;\n    node.status({ fill: \"green\", shape: \"dot\", text: `creating API request` });\n    return [null, msg]; // Already have valid token, go to API request\n}\n\n// Need to get token\nconst useRefresh = !!cached?.refresh_token;\nmsg.method = \"POST\";\nmsg.url = useRefresh\n    ? \"https://www.fuel-finder.service.gov.uk/api/v1/oauth/regenerate_access_token\"\n    : \"https://www.fuel-finder.service.gov.uk/api/v1/oauth/generate_access_token\";\n\nmsg.headers = { \"content-type\": \"application/json\", \"accept\": \"application/json\" };\nmsg.payload = useRefresh\n    ? { client_id: CONFIG.client_id, refresh_token: cached.refresh_token }\n    : { client_id: CONFIG.client_id, client_secret: CONFIG.client_secret };\n\nnode.status({ fill: \"blue\", shape: \"dot\", text: `refreshing OAuth token` });\n\nreturn [msg, null]; // Go get token",
        "outputs": 2,
        "timeout": "",
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 790,
        "y": 300,
        "wires": [
            [
                "c6fca3d87018f53a"
            ],
            [
                "ff_fn_build_api_request"
            ]
        ]
    },
    {
        "id": "ff_http",
        "type": "http request",
        "z": "ff_tab_01",
        "name": "Make HTTP Call",
        "method": "use",
        "ret": "obj",
        "paytoqs": "ignore",
        "url": "",
        "tls": "",
        "persist": false,
        "proxy": "",
        "insecureHTTPParser": false,
        "authType": "",
        "senderr": false,
        "headers": [],
        "x": 1240,
        "y": 300,
        "wires": [
            [
                "ff_fn_response_handler"
            ]
        ]
    },
    {
        "id": "ff_fn_response_handler",
        "type": "function",
        "z": "ff_tab_01",
        "name": "New Token or API Data",
        "func": "const OAUTH_KEY = \"ff:oauth\";\n\nfunction now() { return Math.floor(Date.now() / 1000); }\n\n// If this is OAuth response\nif (msg.url && msg.url.includes('/oauth/')) {\n    const p = msg.payload;\n    const data = (p.data && typeof p.data === \"object\") ? p.data : p;\n    const access = data.access_token;\n    const expiresIn = Number(data.expires_in || 3600);\n    const refresh = data.refresh_token;\n\n    if (!access) {\n        node.error(\"No access_token in OAuth response\");\n        return null;\n    }\n\n    const cached = flow.get(OAUTH_KEY) || {};\n    flow.set(OAUTH_KEY, {\n        access_token: access,\n        expires_at: now() + Math.max(60, expiresIn),\n        refresh_token: refresh || cached.refresh_token || null\n    });\n\n    msg.token = access;\n    node.status({ fill: \"green\", shape: \"dot\", text: `token ok, initiating API request` });\n    return [msg, null]; // Now build API request\n}\n\n// This is API data response - accumulate batches\nconst batch = Array.isArray(msg.payload) ? msg.payload : [];\nconst kind = msg.kind;\n\n// Get accumulated data\nconst accumulated = flow.get(`ff_accumulated_${kind}`) || [];\naccumulated.push(...batch);\nflow.set(`ff_accumulated_${kind}`, accumulated);\n\nmsg.batchSize = batch.length;\n\nif (batch.length < 500) {\n    // Last batch - send all accumulated data for processing\n    msg.payload = accumulated;\n    flow.set(`ff_accumulated_${kind}`, []); // Clear\n    node.status({ fill: \"green\", shape: \"dot\", text: `batches built, send to process` });\n    return [null, msg];\n} else {\n    // More batches available\n    msg.batch = (msg.batch || 1) + 1;\n    node.status({ fill: \"blue\", shape: \"dot\", text: `requesting batch ${msg.batch}` });\n    return [msg, null];\n}",
        "outputs": 2,
        "timeout": "",
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 1470,
        "y": 300,
        "wires": [
            [
                "ff_delay_between_batches"
            ],
            [
                "ff_fn_process_data"
            ]
        ]
    },
    {
        "id": "ff_delay_between_batches",
        "type": "delay",
        "z": "ff_tab_01",
        "name": "Rate Limit",
        "pauseType": "rate",
        "timeout": "5",
        "timeoutUnits": "seconds",
        "rate": "1",
        "nbRateUnits": "1",
        "rateUnits": "second",
        "randomFirst": "1",
        "randomLast": "5",
        "randomUnits": "seconds",
        "drop": false,
        "allowrate": false,
        "outputs": 1,
        "x": 1680,
        "y": 260,
        "wires": [
            [
                "ff_link_to_api_builder"
            ]
        ]
    },
    {
        "id": "ff_link_to_api_builder",
        "type": "link out",
        "z": "ff_tab_01",
        "name": "To FF API builder",
        "mode": "link",
        "links": [
            "ff_link_from_delay"
        ],
        "x": 1795,
        "y": 260,
        "wires": []
    },
    {
        "id": "ff_link_from_delay",
        "type": "link in",
        "z": "ff_tab_01",
        "name": "From FF Delay Next Request",
        "links": [
            "ff_link_to_api_builder"
        ],
        "x": 885,
        "y": 360,
        "wires": [
            [
                "ff_fn_build_api_request"
            ]
        ]
    },
    {
        "id": "ff_fn_build_api_request",
        "type": "function",
        "z": "ff_tab_01",
        "name": "Build API request",
        "func": "const token = msg.token;\nif (!token) {\n    node.error(\"No token available\");\n    return null;\n}\n\nfunction pad(n) { return String(n).padStart(2, \"0\"); }\n\nfunction effectiveStartFromMaxTs(maxTs, minutesBack = 30) {\n    if (!maxTs) return null;\n    const d = new Date(maxTs);\n    if (isNaN(d.getTime())) return null;\n    d.setMinutes(d.getMinutes() - minutesBack);\n    return `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())} ${pad(d.getHours())}:${pad(d.getMinutes())}:${pad(d.getSeconds())}`;\n}\n\nconst kind = msg.kind;\nconst batch = msg.batch || 1;\n\nlet path, qs;\n\nif (kind === \"stations\") {\n    path = \"/api/v1/pfs\";\n    qs = { \"batch-number\": batch };\n    \n    // Clear caches on first batch\n    if (batch === 1) {\n        flow.set('ffNearbyStationsById', {});\n        flow.set('ffNearbyIds', []);\n        flow.set('ff_accumulated_stations', []);\n    }\n} else if (kind === \"prices\") {\n    path = \"/api/v1/pfs/fuel-prices\";\n    const sinceTs = flow.get(\"ffPricesMaxTs\") || null;\n    const eff = effectiveStartFromMaxTs(sinceTs, 30);\n    qs = { \"batch-number\": batch };\n    if (eff) qs[\"effective-start-timestamp\"] = eff;\n    \n    // Clear cache on first batch\n    if (batch === 1) {\n        flow.set('ff_accumulated_prices', []);\n    }\n}\n\nconst url = new URL(`https://www.fuel-finder.service.gov.uk${path}`);\nfor (const [k, v] of Object.entries(qs)) {\n    if (v !== undefined && v !== null && String(v).length) {\n        url.searchParams.set(k, String(v));\n    }\n}\n\nmsg.method = \"GET\";\nmsg.url = url.toString();\nmsg.headers = { accept: \"application/json\", authorization: `Bearer ${token}` };\ndelete msg.payload;\n\nnode.status({ fill: \"blue\", shape: \"dot\", text: `${kind} batch ${batch}` });\n\nreturn msg;",
        "outputs": 1,
        "timeout": "",
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 1030,
        "y": 320,
        "wires": [
            [
                "ff_http"
            ]
        ]
    },
    {
        "id": "ff_fn_process_data",
        "type": "function",
        "z": "ff_tab_01",
        "name": "Process Stations and Prices Data",
        "func": "const CONFIG = flow.get('ffConfig') || {};\nconst kind = msg.kind;\nconst batch = Array.isArray(msg.payload) ? msg.payload : [];\n\nfunction weekdayLondonLower() {\n    const d = new Date(new Date().toLocaleString('en-GB', { timeZone: 'Europe/London' }));\n    return ['sunday', 'monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday'][d.getDay()];\n}\n\nfunction openingToday(station) {\n    const day = weekdayLondonLower();\n    const d = station?.opening_times?.usual_days?.[day];\n    if (!d) return null;\n    if (d.is_24_hours) return '24h';\n    const open = String(d.open || '').slice(0, 5);\n    const close = String(d.close || '').slice(0, 5);\n    if (!open || !close || (open === '00:00' && close === '00:00')) return null;\n    return `${open}-${close}`;\n}\n\nfunction toRad(d) { return d * Math.PI / 180; }\n\nfunction havKm(aLat, aLon, bLat, bLon) {\n    const R = 6371;\n    const dLat = toRad(bLat - aLat);\n    const dLon = toRad(bLon - aLon);\n    const x = Math.sin(dLat / 2) ** 2 + Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * (Math.sin(dLon / 2) ** 2);\n    return 2 * R * Math.asin(Math.min(1, Math.sqrt(x)));\n}\n\nif (kind === \"stations\") {\n    const radiusKm = CONFIG.radius_miles * 1.609344;\n    const byId = flow.get('ffNearbyStationsById') || {};\n    const ids = new Set(flow.get('ffNearbyIds') || []);\n\n    for (const s of batch) {\n        const id = s?.node_id;\n        if (!id) continue;\n\n        if (s.temporary_closure === true || s.permanent_closure === true) {\n            ids.delete(id);\n            delete byId[id];\n            continue;\n        }\n\n        const lat = Number(s?.location?.latitude);\n        const lon = Number(s?.location?.longitude);\n        if (!Number.isFinite(lat) || !Number.isFinite(lon)) continue;\n\n        const km = havKm(CONFIG.home_lat, CONFIG.home_lon, lat, lon);\n        if (km > radiusKm) {\n            ids.delete(id);\n            delete byId[id];\n            continue;\n        }\n\n        byId[id] = {\n            id,\n            name: s.trading_name || s.brand_name || '',\n            brand: s.brand_name || '',\n            postcode: s?.location?.postcode || '',\n            lat, lon,\n            miles: Number((km / 1.609344).toFixed(2)),\n            open_today: openingToday(s),\n            is_mss: !!s.is_motorway_service_station,\n            is_supermarket: !!s.is_supermarket_service_station,\n            fuel_types: Array.isArray(s.fuel_types) ? s.fuel_types : [],\n            amenities: Array.isArray(s.amenities) ? s.amenities : [],\n        };\n        ids.add(id);\n    }\n\n    flow.set('ffNearbyStationsById', byId);\n    flow.set('ffNearbyIds', Array.from(ids));\n\n    // Clean up accumulation array\n    flow.set('ff_accumulated_stations', undefined);\n\n    node.status({ fill: \"green\", shape: \"dot\", text: `${ids.size} stations` });\n\n    // Trigger prices refresh\n    return { topic: \"prices\", payload: Date.now() };\n}\n\nif (kind === \"prices\") {\n    const nearbyIds = new Set(flow.get('ffNearbyIds') || []);\n    if (!nearbyIds.size) {\n        node.warn(\"No stations loaded yet, skipping prices\");\n        // Clean up accumulation array even on error\n        flow.set('ff_accumulated_prices', undefined);\n        return null;\n    }\n\n    const byId = flow.get('ffNearbyPricesById') || {};\n    let maxTs = flow.get('ffPricesMaxTs') || null;\n\n    for (const rec of batch) {\n        const id = rec?.node_id;\n        if (!id || !nearbyIds.has(id)) continue;\n\n        const fuels = Array.isArray(rec.fuel_prices) ? rec.fuel_prices : [];\n        const cur = byId[id] || {};\n\n        for (const f of fuels) {\n            const ft = f?.fuel_type;\n            if (!ft) continue;\n\n            const price = (f.price === null || f.price === undefined || f.price === '') ? null : Number(f.price);\n            const ts = f.price_last_updated || null;\n\n            cur[ft] = { price, ts };\n            if (ts && (!maxTs || ts > maxTs)) maxTs = ts;\n        }\n\n        byId[id] = cur;\n    }\n\n    flow.set('ffNearbyPricesById', byId);\n    if (maxTs) flow.set('ffPricesMaxTs', maxTs);\n\n    // Clean up accumulation array\n    flow.set('ff_accumulated_prices', undefined);\n\n    node.status({ fill: \"green\", shape: \"dot\", text: `prices updated` });\n\n    // Trigger HA build\n    return { topic: \"build_ha\", payload: Date.now() };\n}\n\nreturn null;",
        "outputs": 1,
        "timeout": "",
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 1760,
        "y": 340,
        "wires": [
            [
                "ff_fn_router"
            ]
        ]
    },
    {
        "id": "ff_fn_router",
        "type": "switch",
        "z": "ff_tab_01",
        "name": "Route: prices or HA build",
        "property": "topic",
        "propertyType": "msg",
        "rules": [
            {
                "t": "eq",
                "v": "prices",
                "vt": "str"
            },
            {
                "t": "eq",
                "v": "build_ha",
                "vt": "str"
            }
        ],
        "checkall": "true",
        "repair": false,
        "outputs": 2,
        "x": 2050,
        "y": 340,
        "wires": [
            [
                "ff_link_prices_trigger"
            ],
            [
                "ff_fn_build_ha_payload"
            ]
        ]
    },
    {
        "id": "ff_link_prices_trigger",
        "type": "link out",
        "z": "ff_tab_01",
        "name": "Trigger prices",
        "mode": "link",
        "links": [
            "ff_link_to_start_inject"
        ],
        "x": 2215,
        "y": 320,
        "wires": []
    },
    {
        "id": "ff_link_to_start_inject",
        "type": "link in",
        "z": "ff_tab_01",
        "name": "To start node",
        "links": [
            "ff_link_prices_trigger"
        ],
        "x": 565,
        "y": 240,
        "wires": [
            [
                "ff_fn_start"
            ]
        ]
    },
    {
        "id": "ff_fn_build_ha_payload",
        "type": "function",
        "z": "ff_tab_01",
        "name": "Build HA payload + signature",
        "func": "const stationsById = flow.get('ffNearbyStationsById') || {};\nconst pricesById = flow.get('ffNearbyPricesById') || {};\nconst ids = Object.keys(stationsById);\n\nconst joined = [];\nfor (const id of ids) {\n    const s = stationsById[id];\n    const p = pricesById[id] || {};\n\n    // Skip stations with no price data at all\n    const hasAnyPrice = p.E10?.price || p.B7_STANDARD?.price;\n    if (!hasAnyPrice) continue;\n\n    joined.push({\n        id,\n        name: s.name,\n        postcode: s.postcode,\n        miles: s.miles,\n        open_today: s.open_today,\n        prices: p,\n    });\n}\n\n// Phil's smart to Title Case\nfunction toTitleCase(str) {\n    if (!str || typeof str !== \"string\") return \"\";\n\n    // Extract HTML entities\n    const entities = [];\n    const placeholder = \"\\uFFF0\";\n    str = str.replace(/&[#a-zA-Z0-9]+;/g, (m) => {\n        entities.push(m);\n        return `${placeholder}${entities.length - 1}${placeholder}`;\n    });\n\n    const smallWordsStr = \"a|an|and|as|at|but|by|en|for|if|in|of|on|or|the|to|v\\\\.?|via|vs\\\\.?|de|di\";\n    const smallWordsRegex = new RegExp(`\\\\b(${smallWordsStr})\\\\b`, \"gi\");\n    const punctuation = /([!\\\"#$%&'()*+,./:;<=>?@[\\\\\\]^_`{|}~-]*)/;\n    const sentenceBoundaries = /[:.;?!] |(?: |^)[\"']/g;\n\n    const parts = [];\n    let lastIndex = 0;\n    let match;\n\n    while ((match = sentenceBoundaries.exec(str)) !== null) {\n        parts.push(processPart(str.substring(lastIndex, match.index)));\n        parts.push(match[0]);\n        lastIndex = sentenceBoundaries.lastIndex;\n    }\n    parts.push(processPart(str.substring(lastIndex)));\n\n    let result = parts\n        .join(\"\")\n        .replace(/ V(s?)\\. /gi, \" v$1. \")\n        .replace(/(['’])[Ss]\\b/g, \"$1s\")\n        .replace(\n            /\\b(Q&A|F1|F2|F3|TV|USA|UK|US|CEO|CFO|CTO|IT|AI|API|URL|HTML|CSS|JS|PDF|XML|JSON)\\b/gi,\n            (m) => m.toUpperCase()\n        );\n\n    // Restore entities\n    result = result.replace(new RegExp(`${placeholder}(\\\\d+)${placeholder}`, \"g\"), (_, i) => entities[i]);\n\n    return result;\n\n    function processPart(part) {\n        if (!part) return \"\";\n\n        return part\n            .replace(/\\b([a-z][a-z.']*)\\b/gi, (w) => (/[a-z]\\.[a-z]/i.test(w) ? w : capWord(w)))\n            .replace(smallWordsRegex, (m, word, offset, string) => {\n                const isFirst = offset === 0;\n                const isLast = offset + m.length === string.length;\n                return (isFirst || isLast) ? capWord(word) : String(word).toLowerCase();\n            })\n            .replace(new RegExp(`^${punctuation.source}\\\\b(${smallWordsStr})\\\\b`, \"gi\"), (all, punct, word) => punct + capWord(word))\n            .replace(new RegExp(`\\\\b(${smallWordsStr})\\\\b${punctuation.source}$`, \"gi\"), (all, word, punct) => capWord(word) + punct);\n    }\n\n    function capWord(word) {\n        word = String(word);\n        return word.charAt(0).toUpperCase() + word.slice(1).toLowerCase();\n    }\n}\n\n// Sort by distance (closest first)\njoined.sort((a, b) => (a.miles || 999) - (b.miles || 999));\n\nfunction cheapest(ft) {\n    let best = null;\n    for (const r of joined) {\n        const v = r.prices?.[ft]?.price;\n        if (typeof v !== 'number' || !isFinite(v)) continue;\n        if (!best || v < best.price) {\n            best = {\n                id: r.id,\n                name: toTitleCase(r.name),\n                postcode: r.postcode,\n                miles: r.miles,\n                price: Number(v.toFixed(1)),\n                ts: r.prices?.[ft]?.ts || null\n            };\n        }\n    }\n    return best;\n}\n\nconst bestE10 = cheapest('E10');\nconst bestB7 = cheapest('B7_STANDARD');\n\n// Build stations array with all relevant data\nconst stationsArray = joined.map(r => ({\n    id: r.id,\n    name: toTitleCase(r.name),\n    postcode: r.postcode,\n    miles: r.miles,\n    open_today: r.open_today,\n    e10_price: r.prices?.E10?.price ? Number(r.prices.E10.price.toFixed(1)) : null,\n    e10_updated: r.prices?.E10?.ts || null,\n    b7_price: r.prices?.B7_STANDARD?.price ? Number(r.prices.B7_STANDARD.price.toFixed(1)) : null,\n    b7_updated: r.prices?.B7_STANDARD?.ts || null\n}));\n\n// Simple signature without crypto\nconst model = {\n    count: joined.length,\n    best_e10_price: bestE10?.price || null,\n    best_b7_price: bestB7?.price || null,\n    station_ids: joined.map(r => r.id).join(',')\n};\n\nmsg.signature = JSON.stringify(model);\n\nmsg.payload = {\n    state: joined.length,\n    attributes: {\n        icon: 'mdi:gas-station',\n        best_e10: bestE10,\n        best_b7: bestB7,\n        stations: stationsArray,\n        prices_max_ts: flow.get('ffPricesMaxTs') || null\n    }\n};\nnode.status({ fill: \"green\", shape: \"dot\", text: `payload completed` });\n\nreturn msg;",
        "outputs": 1,
        "timeout": "",
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 2320,
        "y": 360,
        "wires": [
            [
                "ff_rbe_signature"
            ]
        ]
    },
    {
        "id": "ff_rbe_signature",
        "type": "rbe",
        "z": "ff_tab_01",
        "name": "RBE (signature)",
        "func": "rbe",
        "gap": "",
        "start": "",
        "inout": "out",
        "property": "signature",
        "x": 2560,
        "y": 360,
        "wires": [
            [
                "ff_debug_output"
            ]
        ]
    },
    {
        "id": "ff_debug_output",
        "type": "debug",
        "z": "ff_tab_01",
        "name": "Output (wire to HA sensor)",
        "active": true,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "payload",
        "targetType": "msg",
        "x": 2800,
        "y": 360,
        "wires": []
    },
    {
        "id": "c6fca3d87018f53a",
        "type": "fan",
        "z": "ff_tab_01",
        "name": "Refresh OA Token",
        "x": 1030,
        "y": 280,
        "wires": [
            [
                "ff_http"
            ]
        ]
    },
    {
        "id": "04684680a8264b7a",
        "type": "global-config",
        "env": [],
        "modules": {
            "node-red-contrib-fan": "1.0.2"
        }
    }
]

Here is a Python version that can run as a command_line sensor. It has command-line options to override the config and to force a refresh of the local stations, and it can also read its config from a JSON file, so you can adapt it to however you want to secure your API credentials.

There is a more recent version of this script on GitHub - see below.

UK Fuel Finder - Home Assistant Sensor

Fetch nearby petrol station prices from the UK Government’s Fuel Finder API.

Features

  • Automatic OAuth token management with refresh
  • Finds stations within configurable radius
  • Incremental price updates (only fetches changes since last run)
  • Shows cheapest E10 and B7 diesel nearby
  • Station details including opening hours, distance, postcode
  • Runs as Home Assistant command_line sensor
  • Use --refresh-stations to force a stations refresh.
  • Supports overriding config via --config (JSON) and/or command-line flags.

Requirements

  • UK Government Fuel Finder API credentials (Sign up here)
  • Python 3.7+

Installation

  1. Get API Credentials

  2. Install the Script
    Copy uk_fuel_finder.py to your Home Assistant config directory or scripts directory (/config/scripts, /config/python, or wherever you choose)

  3. Configure the Script
    Edit uk_fuel_finder.py and update the CONFIG section:

    DEFAULT_CONFIG = {
        "client_id": "YOUR_CLIENT_ID",
        "client_secret": "YOUR_CLIENT_SECRET",
        "home_lat": YOUR_LAT,
        "home_lon": YOUR_LON,
        "radius_miles": 10,
        "token_file": "/config/.storage/uk_fuel_finder/token.json",
        "state_file": "/config/.storage/uk_fuel_finder/state.json",
    }
    
  4. Add to Home Assistant

    Add to configuration.yaml:

    command_line:
      - sensor:
          name: "Fuel Finder"
          command: "python3 /config/scripts/uk_fuel_finder.py"
          scan_interval: 10800  # 3 hours
          value_template: "{{ value_json.state }}"
          json_attributes:
            - icon
            - best_e10
            - best_b7
            - stations
            - last_update
    
  5. Restart Home Assistant

Usage

The sensor provides:

  • State: Number of nearby stations with prices
  • Attributes:
    • best_e10: Cheapest E10 station (name, price, miles)
    • best_b7: Cheapest B7 diesel station (name, price, miles)
    • stations: Array of all nearby stations sorted by distance
    • last_update: Timestamp of last update

Station Data Structure

Each station in the stations array contains:

{
  "id": "station_id",
  "name": "Station Name",
  "postcode": "AB12 3CD",
  "miles": 2.3,
  "open_today": "06:00-22:00",
  "e10_price": 142.9,
  "e10_updated": "2025-02-03T10:30:00Z",
  "b7_price": 149.9,
  "b7_updated": "2025-02-03T10:30:00Z"
}

Prices are in pence per litre (e.g., 142.9 = £1.429/litre).
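As a quick sanity check on the units, a fill-up cost works out like this:

```python
# Prices are pence per litre, so a fill-up cost in pounds is
# price * litres / 100.
def fill_cost_gbp(pence_per_litre, litres):
    return round(pence_per_litre * litres / 100, 2)
```

For example, 40 litres at 142.9 p/litre costs £57.16.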

Dashboard Examples

See ha_config_example.yaml for Lovelace dashboard card templates.

How It Works

  1. OAuth Management: Automatically obtains and refreshes API tokens, caching them in /config/.storage/uk_fuel_finder/token.json
  2. Station Scanning: Fetches all UK stations, filters by radius, stores nearby ones
  3. Incremental Updates: Only fetches price changes since last run (tracked in /config/.storage/uk_fuel_finder/state.json)
  4. Data Processing: Merges stations and prices, calculates cheapest options
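The radius filter in step 2 is a haversine great-circle calculation; here is a Python equivalent of the flow's havKm helper (Earth radius 6371 km, 1.609344 km per mile), shown as a sketch rather than the script's exact code:

```python
from math import radians, sin, cos, asin, sqrt

# Haversine great-circle distance, matching the flow's havKm helper
# (R = 6371 km).
def haversine_km(lat1, lon1, lat2, lon2):
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(min(1.0, sqrt(a)))

def within_radius(home_lat, home_lon, lat, lon, radius_miles):
    """True if the station is inside the configured geofence."""
    return haversine_km(home_lat, home_lon, lat, lon) <= radius_miles * 1.609344
```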

API Rate Limits

  • Live environment: 120 requests/minute, 10,000 requests/day
  • Script respects limits through batched fetching (500 records per batch)
  • Default scan interval: 3 hours (set in the command_line sensor spec)

Troubleshooting

“No stations loaded yet”

  • First run takes longer as it scans all UK stations
  • Check your lat/lon coordinates are correct
  • Verify radius_miles is reasonable (start with 10)

Authentication errors

  • Verify client_id and client_secret are correct
  • Delete /config/.storage/uk_fuel_finder/token.json to force fresh token

Rebase (force a full refresh)

  • Delete /config/.storage/uk_fuel_finder/state.json to force full refresh

Permission errors

  • Ensure script is executable: chmod +x /config/scripts/uk_fuel_finder.py
  • Check Home Assistant has write access to .storage directory

Credits

Built using the UK Government Fuel Finder API (https://www.fuel-finder.service.gov.uk/)

#!/usr/bin/env python3
"""
UK Fuel Finder - Home Assistant Integration

Fetches nearby fuel stations and prices from the UK Government Fuel Finder API.

Key behaviours:
- Caches the *nearby stations subset* in the state file, so subsequent runs can skip the full stations download.
- Use --refresh-stations to force a stations refresh.
- Supports overriding config via --config (JSON) and/or command-line flags.

Outputs a single JSON object to stdout for Home Assistant command_line sensors.
"""

import argparse
import json
import sys
import requests
from requests.exceptions import ConnectionError, HTTPError, Timeout
import time
from datetime import datetime, timedelta
from pathlib import Path
from math import radians, cos, sin, asin, sqrt

# =============================================================================
# DEFAULT CONFIGURATION
# =============================================================================
DEFAULT_CONFIG = {
    "client_id": "YOUR_CLIENT_ID",
    "client_secret": "YOUR_CLIENT_SECRET",
    "home_lat": YOUR_LAT,
    "home_lon": YOUR_LON,
    "radius_miles": 10,
    "token_file": "/config/.storage/uk_fuel_finder/token.json",
    "state_file": "/config/.storage/uk_fuel_finder/state.json",
}

CONFIG = DEFAULT_CONFIG.copy()

DEFAULT_HTTP = {
    "timeout": 20,          # seconds
    "retries": 5,           # total attempts
    "backoff_base": 1.2,    # exponential base
    "backoff_jitter": 0.3,  # seconds
}

# =============================================================================
# REQUEST WRAPPER
# =============================================================================

def request_with_retry(method, url, *, headers=None, params=None, json_body=None,
                       timeout=DEFAULT_HTTP["timeout"], retries=DEFAULT_HTTP["retries"]):
    """
    Retry on: 429, 500, 502, 503, 504, timeouts, connection resets.
    """
    last_exc = None
    for attempt in range(1, retries + 1):
        try:
            resp = requests.request(
                method,
                url,
                headers=headers,
                params=params,
                json=json_body,
                timeout=timeout,
            )

            # Retryable HTTP status codes
            if resp.status_code in (429, 500, 502, 503, 504):
                # honour Retry-After if present (mainly for 429)
                ra = resp.headers.get("Retry-After")
                if ra:
                    try:
                        sleep_s = float(ra)
                    except ValueError:
                        sleep_s = None
                else:
                    sleep_s = None

                if attempt < retries:
                    if sleep_s is None:
                        sleep_s = (DEFAULT_HTTP["backoff_base"] ** (attempt - 1)) + DEFAULT_HTTP["backoff_jitter"]
                    time.sleep(sleep_s)
                    continue

            resp.raise_for_status()
            return resp

        except (Timeout, ConnectionError, HTTPError) as e:
            last_exc = e
            # If it's an HTTPError, only retry if retryable status
            if isinstance(e, HTTPError):
                status = getattr(e.response, "status_code", None)
                if status not in (429, 500, 502, 503, 504):
                    raise

            if attempt < retries:
                sleep_s = (DEFAULT_HTTP["backoff_base"] ** (attempt - 1)) + DEFAULT_HTTP["backoff_jitter"]
                time.sleep(sleep_s)
                continue
            raise

    # should never reach
    raise last_exc

# =============================================================================
# CONFIG LOADING / CLI
# =============================================================================

def load_config_file(path_str: str) -> dict:
    """Load JSON config file."""
    p = Path(path_str)
    with p.open("r", encoding="utf-8") as f:
        data = json.load(f)
    if not isinstance(data, dict):
        raise ValueError("Config file must contain a JSON object")
    return data


def apply_overrides(cfg: dict, overrides: dict) -> dict:
    """Return cfg with non-None overrides applied."""
    out = cfg.copy()
    for k, v in overrides.items():
        if v is not None:
            out[k] = v
    return out


def parse_args(argv):
    parser = argparse.ArgumentParser(
        description="UK Fuel Finder - fetch nearby fuel station prices (with cached nearby stations)."
    )
    parser.add_argument("--config", help="Path to JSON config file (overrides defaults).")
    parser.add_argument(
        "--refresh-stations",
        action="store_true",
        help="Force refresh of stations list (ignore cached nearby stations).",
    )

    # Config overrides
    parser.add_argument("--client-id", dest="client_id")
    parser.add_argument("--client-secret", dest="client_secret")
    parser.add_argument("--home-lat", dest="home_lat", type=float)
    parser.add_argument("--home-lon", dest="home_lon", type=float)
    parser.add_argument("--radius-miles", dest="radius_miles", type=float)
    parser.add_argument("--token-file", dest="token_file")
    parser.add_argument("--state-file", dest="state_file")

    return parser.parse_args(argv)


def normalise_config(cfg: dict) -> dict:
    """Normalise config types."""
    out = cfg.copy()
    for k in ("home_lat", "home_lon", "radius_miles"):
        if k in out and isinstance(out[k], str):
            try:
                out[k] = float(out[k])
            except ValueError:
                pass
    return out


# =============================================================================
# OAUTH TOKEN MANAGEMENT
# =============================================================================

def load_token():
    """Load cached OAuth token from file"""
    token_file = Path(CONFIG["token_file"])
    if token_file.exists():
        try:
            with token_file.open("r", encoding="utf-8") as f:
                return json.load(f)
        except Exception:
            return None
    return None


def save_token(token_data):
    """Save OAuth token to file"""
    token_file = Path(CONFIG["token_file"])
    token_file.parent.mkdir(parents=True, exist_ok=True)
    with token_file.open("w", encoding="utf-8") as f:
        json.dump(token_data, f)


def get_token():
    """Get valid OAuth token, refreshing if necessary"""
    cached = load_token()

    # Check if cached token is still valid
    if cached and cached.get("access_token") and cached.get("expires_at"):
        expires_at = datetime.fromisoformat(cached["expires_at"])
        if datetime.now() < expires_at - timedelta(seconds=30):
            return cached["access_token"]

    # Need to get new token
    use_refresh = cached and cached.get("refresh_token")

    if use_refresh:
        url = "https://www.fuel-finder.service.gov.uk/api/v1/oauth/regenerate_access_token"
        payload = {
            "client_id": CONFIG["client_id"],
            "refresh_token": cached["refresh_token"],
        }
    else:
        url = "https://www.fuel-finder.service.gov.uk/api/v1/oauth/generate_access_token"
        payload = {
            "client_id": CONFIG["client_id"],
            "client_secret": CONFIG["client_secret"],
        }

    response = requests.post(url, json=payload, headers={"accept": "application/json"})
    response.raise_for_status()

    data = response.json()
    token_data = data.get("data", data)

    access_token = token_data["access_token"]
    expires_in = token_data.get("expires_in", 3600)
    refresh_token = token_data.get("refresh_token", cached.get("refresh_token") if cached else None)

    # Save token
    save_token(
        {
            "access_token": access_token,
            "expires_at": (datetime.now() + timedelta(seconds=expires_in)).isoformat(),
            "refresh_token": refresh_token,
        }
    )

    return access_token


# =============================================================================
# API DATA FETCHING
# =============================================================================

def fetch_all_batches(token, path, params=None):
    """Fetch all paginated batches from API"""
    base_url = "https://www.fuel-finder.service.gov.uk"
    headers = {
        "accept": "application/json",
        "authorization": f"Bearer {token}",
    }

    all_results = []
    batch = 1

    while True:
        query_params = params.copy() if params else {}
        query_params["batch-number"] = batch

        response = request_with_retry(
            "GET",
            f"{base_url}{path}",
            headers=headers,
            params=query_params,
        )
        # request_with_retry has already raised on any non-2xx status
        batch_data = response.json()
        if not isinstance(batch_data, list):
            batch_data = []

        all_results.extend(batch_data)

        # If batch returned < 500 records, we're done
        if len(batch_data) < 500:
            break

        batch += 1

    return all_results


# =============================================================================
# DISTANCE CALCULATION
# =============================================================================

def haversine_km(lat1, lon1, lat2, lon2):
    """Calculate distance between two points in km"""
    lat1, lon1, lat2, lon2 = map(radians, [lat1, lon1, lat2, lon2])
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    c = 2 * asin(sqrt(a))
    return 6371 * c


# =============================================================================
# DATA PROCESSING
# =============================================================================

def get_opening_today(station):
    """Get today's opening hours"""
    days = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"]
    day = days[datetime.now().weekday()]

    opening_times = station.get("opening_times", {}).get("usual_days", {}).get(day)
    if not opening_times:
        return None

    if opening_times.get("is_24_hours"):
        return "24h"

    open_time = (opening_times.get("open", "") or "")[:5]
    close_time = (opening_times.get("close", "") or "")[:5]

    if not open_time or not close_time or (open_time == "00:00" and close_time == "00:00"):
        return None

    return f"{open_time}-{close_time}"


def process_stations(stations_data):
    """Filter and process stations within radius"""
    radius_km = float(CONFIG["radius_miles"]) * 1.609344
    nearby_stations = {}

    for station in stations_data:
        node_id = station.get("node_id")
        if not node_id:
            continue

        # Skip closed stations
        if station.get("temporary_closure") or station.get("permanent_closure"):
            continue

        location = station.get("location", {})
        lat = location.get("latitude")
        lon = location.get("longitude")

        if lat is None or lon is None:
            continue

        try:
            lat = float(lat)
            lon = float(lon)
        except (ValueError, TypeError):
            continue

        km = haversine_km(float(CONFIG["home_lat"]), float(CONFIG["home_lon"]), lat, lon)
        if km > radius_km:
            continue

        miles = km / 1.609344

        nearby_stations[node_id] = {
            "id": node_id,
            "name": station.get("trading_name") or station.get("brand_name", ""),
            "brand": station.get("brand_name", ""),
            "postcode": location.get("postcode", ""),
            "lat": lat,
            "lon": lon,
            "miles": round(miles, 2),
            "open_today": get_opening_today(station),
            "is_mss": station.get("is_motorway_service_station", False),
            "is_supermarket": station.get("is_supermarket_service_station", False),
            "fuel_types": station.get("fuel_types", []),
            "amenities": station.get("amenities", []),
        }

    return nearby_stations


def load_state():
    """Load previous state for incremental price updates and station caching."""
    state_file = Path(CONFIG["state_file"])
    if state_file.exists():
        try:
            with state_file.open("r", encoding="utf-8") as f:
                data = json.load(f)
            return data if isinstance(data, dict) else {}
        except Exception:
            return {}
    return {}


def save_state(state):
    """Save state for next run"""
    state_file = Path(CONFIG["state_file"])
    state_file.parent.mkdir(parents=True, exist_ok=True)
    with state_file.open("w", encoding="utf-8") as f:
        json.dump(state, f)


def stations_cache_is_usable(state: dict) -> bool:
    """Return True if cached nearby stations look usable for current config."""
    cached = state.get("nearby_stations")
    if not isinstance(cached, dict) or not cached:
        return False

    cfg_sig = {
        "home_lat": float(CONFIG["home_lat"]),
        "home_lon": float(CONFIG["home_lon"]),
        "radius_miles": float(CONFIG["radius_miles"]),
    }
    cached_sig = state.get("stations_config")
    if not isinstance(cached_sig, dict):
        return False

    def close(a, b, tol):
        try:
            return abs(float(a) - float(b)) <= tol
        except Exception:
            return False

    return (
        close(cached_sig.get("home_lat"), cfg_sig["home_lat"], 1e-6)
        and close(cached_sig.get("home_lon"), cfg_sig["home_lon"], 1e-6)
        and close(cached_sig.get("radius_miles"), cfg_sig["radius_miles"], 1e-6)
    )


def process_prices(prices_data, nearby_station_ids):
    """Process price data for nearby stations"""
    prices_by_id = {}
    max_timestamp = None

    for record in prices_data:
        node_id = record.get("node_id")
        if not node_id or node_id not in nearby_station_ids:
            continue

        fuel_prices = record.get("fuel_prices", [])
        station_prices = {}

        for fuel in fuel_prices:
            fuel_type = fuel.get("fuel_type")
            if not fuel_type:
                continue

            price = fuel.get("price")
            if price is None or price == "":
                continue

            try:
                price = float(price)
            except (ValueError, TypeError):
                continue

            timestamp = fuel.get("price_last_updated")

            station_prices[fuel_type] = {"price": price, "timestamp": timestamp}

            if timestamp and (not max_timestamp or timestamp > max_timestamp):
                max_timestamp = timestamp

        if station_prices:
            prices_by_id[node_id] = station_prices

    return prices_by_id, max_timestamp


def build_output(stations, prices):
    """Build final output JSON"""
    result = []

    for station_id, station in stations.items():
        station_prices = prices.get(station_id, {})
        if not station_prices:
            continue

        e10_data = station_prices.get("E10", {})
        b7_data = station_prices.get("B7_STANDARD", {})

        e10_price = e10_data.get("price")
        b7_price = b7_data.get("price")

        result.append(
            {
                "id": station["id"],
                "name": station["name"],
                "postcode": station["postcode"],
                "miles": station["miles"],
                "open_today": station.get("open_today"),
                "e10_price": round(e10_price, 1) if e10_price is not None else None,
                "e10_updated": e10_data.get("timestamp"),
                "b7_price": round(b7_price, 1) if b7_price is not None else None,
                "b7_updated": b7_data.get("timestamp"),
            }
        )

    result.sort(key=lambda x: x["miles"])

    def find_cheapest(fuel_key):
        cheapest = None
        for station in result:
            price = station.get(fuel_key)
            if price is not None:
                if cheapest is None or price < cheapest["price"]:
                    cheapest = {
                        "name": station["name"],
                        "postcode": station["postcode"],
                        "miles": station["miles"],
                        "price": price,
                    }
        return cheapest

    best_e10 = find_cheapest("e10_price")
    best_b7 = find_cheapest("b7_price")

    return {
        "state": len(result),
        "attributes": {
            "icon": "mdi:gas-station",
            "best_e10": best_e10,
            "best_b7": best_b7,
            "stations": result,
            "last_update": datetime.now().isoformat(),
        },
    }


# =============================================================================
# MAIN
# =============================================================================

def main(argv=None):
    argv = argv if argv is not None else sys.argv[1:]
    args = parse_args(argv)

    global CONFIG
    cfg = DEFAULT_CONFIG.copy()

    if args.config:
        cfg = apply_overrides(cfg, load_config_file(args.config))

    # CLI overrides take precedence over config file
    cfg = apply_overrides(
        cfg,
        {
            "client_id": args.client_id,
            "client_secret": args.client_secret,
            "home_lat": args.home_lat,
            "home_lon": args.home_lon,
            "radius_miles": args.radius_miles,
            "token_file": args.token_file,
            "state_file": args.state_file,
        },
    )

    CONFIG = normalise_config(cfg)

    try:
        token = get_token()

        state = load_state()
        cached_prices = state.get("last_prices") or {}
        if not isinstance(cached_prices, dict):
            cached_prices = {}


        # Stations: use cache unless forced refresh or cache doesn't match config
        nearby_stations = None
        if not args.refresh_stations and stations_cache_is_usable(state):
            nearby_stations = state.get("nearby_stations")

        if nearby_stations is None:
            try:
                stations_data = fetch_all_batches(token, "/api/v1/pfs")
                nearby_stations = process_stations(stations_data)

                # Persist cached nearby stations + the config signature they were built with
                state["nearby_stations"] = nearby_stations
                state["stations_config"] = {
                    "home_lat": float(CONFIG["home_lat"]),
                    "home_lon": float(CONFIG["home_lon"]),
                    "radius_miles": float(CONFIG["radius_miles"]),
                }
                state["stations_cached_at"] = datetime.now().isoformat()
            except Exception:
                # If we weren't explicitly forcing a refresh, fall back to whatever cache we have
                cached = state.get("nearby_stations")
                if not args.refresh_stations and isinstance(cached, dict) and cached:
                    nearby_stations = cached
                else:
                    raise

        nearby_ids = set(nearby_stations.keys())

        # Prices: incremental if we have previous timestamp
        last_price_timestamp = state.get("last_price_timestamp")
        params = {}
        if last_price_timestamp:
            dt = datetime.fromisoformat(last_price_timestamp.replace("Z", "+00:00"))
            dt = dt - timedelta(minutes=30)
            params["effective-start-timestamp"] = dt.strftime("%Y-%m-%d %H:%M:%S")

        prices_data = fetch_all_batches(token, "/api/v1/pfs/fuel-prices", params)
        prices, max_timestamp = process_prices(prices_data, nearby_ids)
        # prices is what came back this run (may be empty!)
        for node_id, fuels in prices.items():
            if not isinstance(fuels, dict):
                continue
            existing = cached_prices.get(node_id)
            if not isinstance(existing, dict):
                existing = {}
            existing.update(fuels)          # overwrite only fuels we got updates for
            cached_prices[node_id] = existing

        # Persist merged cache
        state["last_prices"] = cached_prices

        output = build_output(nearby_stations, cached_prices)

        if max_timestamp:
            state["last_price_timestamp"] = max_timestamp

        save_state(state)

        print(json.dumps(output))

    except Exception as e:
        import traceback
        print(
            json.dumps(
                {
                    "state": "error",
                    "attributes": {"error": str(e), "traceback": traceback.format_exc()},
                }
            ),
            file=sys.stderr,
        )
        sys.exit(1)


if __name__ == "__main__":
    main()
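As a standalone illustration of the incremental merge inside `main()` above: a partial price pull only overwrites the fuel types it actually returns, so stale-but-still-valid prices survive in the cache. (The fuel codes below match those used in `build_output`; the station id is made up.)

```python
def merge_prices(cached, fresh):
    """Merge a fresh (possibly partial) price pull into the cached prices.

    Only the fuel types present in `fresh` are overwritten; everything
    else carries over from the cache. The input dicts are not mutated.
    """
    merged = {node_id: dict(fuels) for node_id, fuels in cached.items()}
    for node_id, fuels in fresh.items():
        merged.setdefault(node_id, {}).update(fuels)
    return merged


cached = {"s1": {"E10": {"price": 139.9}, "B7_STANDARD": {"price": 146.9}}}
fresh = {"s1": {"E10": {"price": 137.9}}}   # only E10 changed this run

result = merge_prices(cached, fresh)
# E10 is updated to 137.9; B7_STANDARD survives from the cache at 146.9
```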

I'm super excited, waiting for an integration

There is one already:

I made one without knowing that one existed and mine uses the API as above.


Cool, @Holdestmade was just looking for an integration after I got @Rozak's Node-RED flow working.

The integration from beecho01 uses the old data feeds that are limited to specific companies, rather than the new Fuel Finder API, which will include every fuel retailer.

Excellent! I did it as a Python script so it just runs in the background and updates a fairly simple command_line sensor - but top job on the integration! Love it!!!

The beecho01 version uses the old data sources, so that's why I wrote the Python version.

I’ve added a few more bells and whistles to the Python script in the meantime - now it will:

  • automatically manage the full UK station cache
  • do incremental station and price updates
  • auto-fix obviously incorrect data submitted by stations
  • auto-refresh data when needed
  • handle multiple sensors
  • report multiple fuel types
  • back off and delay API pulls when rate-limited
  • query by station ID, station name and brand (supports regex)
  • run cache health checks
  • offer debug options
  • emit a unified JSON output structure
  • let you force a cache rebuild

You can just set up the query you want to run and the script will manage everything in the background each time it runs - updating whatever it needs to update as it goes. Set-and-forget style.

I’ve put it up on GitHub if anyone wants to play (it doesn’t need Home Assistant to be useful).

I add this line to configuration.yaml:

command_line: !include command_line.yaml

… then this in command_line.yaml in the /config directory:

# Petrol Prices
- sensor:
    name: "Local Petrol Prices"
    unique_id: local_petrol_prices
    scan_interval: 3600
    command_timeout: 900
    command: "python3 /config/scripts/uk_fuel_finder.py --lat 52.2345 --lon -1.82645 --radius-miles 8 --fuel-types e10"
    value_template: "{{ value_json.state }}"
    icon: "{{ value_json.details.icon }}"
    state_class: measurement
    json_attributes:
      - mode
      - found
      - errors
      - details
      - stations
      - last_update

And this markdown card to display prices:

  - type: markdown
    title: Local Fuel Prices
    content: >-
      {%- if states('sensor.local_petrol_prices') | int | default(0, true) >= 1 -%}
        <center>
        <table width="100%">
        {%- for station in
             state_attr('sensor.local_petrol_prices', 'stations')
             | selectattr('e10_price', 'defined')
             | sort(attribute='e10_price')
        -%}
          <tr>
            <td>{{ station.brand.split(' ')[0] | title }}, <font size=2>{{ station.name | title }}</font></td>
            <td><a href="https://waze.com/ul?ll={{ station.lat }}%2C{{ station.lon }}&navigate=yes&zoom=17">{{ station.postcode | upper }}</a></td>
            <td align="right">{{ station.e10_price }}p</td>
          </tr>
          <tr>
            <td colspan="3"><font size=1 color="grey">Open: {{ iif(station.open_today is not none, station.open_today, "NA") }}, Dist: {{ station.distance_miles }} miles, Updated: {{ station.e10_updated | as_timestamp | timestamp_custom('%a %-d %b %Y %H:%M', true, 0)}}</font></td>
          </tr>
        {%- endfor -%}
        </table>
        </center>
        <p><font size=1 color="grey">Updated: {{ state_attr('sensor.local_petrol_prices', 'last_update') | as_timestamp | timestamp_custom('%a %-d %b %Y %H:%M', true, 0)}}</font></p>
      {%- else -%}
        <center>No Fuel Prices Available</center>
      {%- endif -%}
    card_mod:
      style:
        ha-markdown $:
          ha-markdown-element: |
            td {
              border: none !important;
              padding: 0px !important;
            }

… from which you get something like this (where the postcodes are active links to Waze to direct you to the station - just for fun):


I made one as well, GitHub - mretallack/ukfuelfinder-ha: UK Fuel Finder Home Assistant component

It displays the fuel stations on the map, etc. I also created a separate PyPI package that can be used in Python directly: ukfuelfinder · PyPI


Tee hee excellent - a wealth of choice :slight_smile:

I’ve noticed that the feeds are a little unreliable - some stations are submitting prices as 1.319, for example - and the sync between prices and stations is a little off and needs some maintenance depending on the poll window.
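One way to auto-correct the pounds-vs-pence glitch mentioned above is a simple scaling heuristic; the threshold here is my own assumption, not something the Fuel Finder spec defines, and it may not be what other scripts in this thread do:

```python
def fix_price_pence(raw):
    """Normalise a submitted fuel price to pence-per-litre.

    Some stations submit 1.319 (pounds) instead of 131.9 (pence).
    Anything under 10 is assumed to be pounds and scaled up by 100.
    The cutoff of 10 is a heuristic, not part of the feed spec.
    """
    p = float(raw)
    if 0 < p < 10:       # e.g. 1.319 -> 131.9
        p *= 100
    return round(p, 1)
```

Applied to each fuel price as it's parsed, this quietly repairs the obviously wrong submissions while leaving sane values untouched.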

Having to store all UK prices (or at least download them every so often - I re-baseline every 30 days in my scripts) is a pain - I really wish they had a lat/lon endpoint!

But generally it all seems to be working.

Good job everyone on getting this done quickly :slight_smile:

The integration above from @mretallack seems to have lat/lon for the stations - maybe you can look and see how they do it?

Oh, I’ve got lat/lon in the code already - but the gov API doesn’t offer a geofenced query - so you either have to maintain state for the stations you are interested in, filtering as you download all the stations, or just dump everything (you’ve downloaded it anyway) and selectively query it.

If you look at the endpoints provided by the Environment Agency APIs, they allow geofenced queries, so you don’t waste time downloading useless data.
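For comparison, here's a minimal sketch of that style of query. The endpoint and the `lat`/`long`/`dist` parameters are from the Environment Agency flood-monitoring API (to the best of my knowledge - check its docs), shown only to illustrate the geofenced pattern Fuel Finder lacks:

```python
def geofence_params(lat, lon, dist_km):
    """Build the lat/long/dist query string the EA API style accepts."""
    return {"lat": lat, "long": lon, "dist": dist_km}


def ea_stations_nearby(lat, lon, dist_km):
    """One geofenced request replaces 'download everything, then filter'.

    Endpoint and parameter names follow the EA flood-monitoring API;
    this is an illustration of the query style, not Fuel Finder code.
    """
    import requests  # imported here so geofence_params stays dependency-free

    resp = requests.get(
        "https://environment.data.gov.uk/flood-monitoring/id/stations",
        params=geofence_params(lat, lon, dist_km),
        timeout=20,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])
```

If Fuel Finder ever grows an equivalent, the batch-download-and-haversine-filter dance in the scripts above collapses to a single call like this.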