Yes, it is remarkably reliable and just does the job. Except when it breaks.
There was an issue about this time last year when the normal Octopus rate file update did not happen, was very late, or included corrupt data. I request 96 records from the API, enough for two full days at two records per hour. On a couple of occasions I have seen duplicate records in the return, meaning the 96 records contained only about 68 distinct entries. There is a filter at the start of the flow to remove the extras, which can leave a partial data set.
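For reference, the request itself is just a paginated GET against the Octopus products endpoint. Below is a minimal TypeScript sketch of the same call plus a duplicate filter, not the actual Node-RED flow; the product and tariff codes are placeholder examples and should be replaced with your own region/tariff.

```typescript
// Sketch only: fetch 96 half-hour import rates and drop duplicate slots.
const PRODUCT = "AGILE-24-10-01";        // assumed example product code
const TARIFF = "E-1R-AGILE-24-10-01-A";  // assumed example tariff code (region A)

interface Rate {
  value_exc_vat: number;
  value_inc_vat: number;
  valid_from: string;   // ISO 8601, UTC
  valid_to: string;
}

async function fetchImportRates(): Promise<Rate[]> {
  const url =
    `https://api.octopus.energy/v1/products/${PRODUCT}` +
    `/electricity-tariffs/${TARIFF}/standard-unit-rates/?page_size=96`;
  const res = await fetch(url);
  if (res.status !== 200) {
    throw new Error(`Octopus API returned status ${res.status}`);
  }
  const body = (await res.json()) as { count: number; results: Rate[] };

  // Remove duplicate half-hour slots (seen occasionally in the live feed)
  // by keeping only the first record for each valid_from timestamp.
  const seen = new Set<string>();
  const unique = body.results.filter(r => {
    if (seen.has(r.valid_from)) return false;
    seen.add(r.valid_from);
    return true;
  });
  // If duplicates were present, unique.length will be < 96: a partial set.
  return unique;
}
```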
Given the normal data reliability, and that there is not much we can do if the data does not appear or is corrupt, there is no error recovery built in. If it stops working, we can either wait and see if it works tomorrow, or some debugging is required.
First: Is Octopus data OK?
The website https://agileprices.co.uk/ can be used to select your region and tariff. I can see that data is there for today (Friday) but not yet for tomorrow (Saturday), which is expected since we are looking before the 16:00 publication.
If there is no data here, something is wrong at the Octopus end of things!
Second: Is the API call returning OK?
Drop a Debug node into the flow, set it to ‘complete message’, connect it to the Oct Import node, and manually trigger the flow. The http request node should fire, and the output should show in the debug window.
We are looking for
- statusCode 200
- payload.count - should be a big number (this is the total count of half-hour tariff records for the particular tariff going back to when it was first introduced)
- payload.results - should be an array of 96 values
If the statusCode is anything other than 200, there is a problem with the API call. If payload.results is not an array of 96 records, then the call has failed to return the values requested.
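To make those checks concrete, here is a minimal standalone sketch in TypeScript (not the flow's actual code); the statusCode/payload fields mirror the standard Node-RED http request output, and the checkResponse helper name is just for illustration.

```typescript
// Sketch of the health checks described above.
interface OctopusResponse {
  statusCode: number;
  payload: {
    count: number;        // total historic records for this tariff
    results: unknown[];   // should hold the 96 requested records
  };
}

function checkResponse(msg: OctopusResponse): string[] {
  const problems: string[] = [];
  if (msg.statusCode !== 200) {
    problems.push(`API call failed: statusCode ${msg.statusCode}`);
  }
  if (!Array.isArray(msg.payload?.results) || msg.payload.results.length !== 96) {
    problems.push(
      `Expected 96 results, got ${msg.payload?.results?.length ?? "none"}`
    );
  }
  return problems;   // an empty array means the call looks healthy
}
```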
Third: Check the base tariff array in context.
In the debug window, in the Context tab, look for the OctAgileTariff variable and expand it to see array item 95. This should cover the time period 22:30 to 23:00 (BST) for today (or for tomorrow, after the 16:00 update) and have both import cost and export cost figures.
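For orientation, a healthy entry might look something like the sketch below; the field names are assumptions for illustration, not taken from the actual flow.

```typescript
// Illustrative shape only -- the real OctAgileTariff entries may use
// different field names. The point is that item 95 should be the final
// half-hour slot of the day, with both prices present.
interface TariffSlot {
  from: string;        // ISO timestamp for the 22:30 (BST) slot start
  to: string;          // ISO timestamp for 23:00 (BST)
  importCost: number;  // p/kWh
  exportCost: number;  // p/kWh
}

// A healthy context variable is an array of 96 such slots,
// so OctAgileTariff[95] is the 22:30-23:00 period.
```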
If the API call is working but the array is not updating correctly, it is possible that the Export API call has failed. The code simply assumes that both the Import and Export calls will succeed, joins the two returns, and builds the combined array. Should one API call fail, the Join node count will get out of step, and the table will not build correctly, or at all. A redeploy / restart of the flow should reset this.
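To make the failure mode concrete, here is a rough TypeScript sketch of what the combine step amounts to; the field names and the buildCombined helper are illustrative assumptions, not the flow's actual code.

```typescript
// Sketch: pair up import and export rates by position, as the Join node
// effectively does once it has received both messages.
interface Rate { valid_from: string; value_inc_vat: number; }

function buildCombined(imports: Rate[], exports: Rate[]) {
  // If one API call fails, one of these arrays is missing or short,
  // the pairing never lines up, and the combined table is not built.
  if (imports.length !== exports.length) {
    throw new Error("Import/export arrays differ in length - combined table not built");
  }
  return imports.map((imp, i) => ({
    from: imp.valid_from,
    importCost: imp.value_inc_vat,
    exportCost: exports[i].value_inc_vat,
  }));
}
```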
If all of this is OK, then the ‘problem’ lies further down, possibly with the output of the figures to the graph / tables, and will require more debugging.