Since the 1st of October, OpenAI supports automatic prompt caching on most of its current models. Incoming prompts are automatically matched against previously cached prompts to reduce latency and cost. The matching is prefix-based, however: tokens are compared in order from the start of the prompt, so if the first tokens do not match anything in the cache, none of the cached tokens can be reused. Also see the docs: prompt caching.
The current prompt starts with the BASE_PROMPT, which begins with the current time. Since the time changes every second, the start of the prompt is different on every request, so the cache can never be hit.
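To illustrate the effect (a hypothetical sketch, not the actual integration code; `build_prompt` and its layout are assumptions): two consecutive requests differ only in the timestamp, but because it sits at the very front, the shared prefix ends after a handful of characters and essentially nothing can be served from the cache.

```python
from datetime import datetime
from os.path import commonprefix

def build_prompt(now: datetime) -> str:
    # Simplified stand-in for the current BASE_PROMPT layout:
    # the volatile timestamp comes first, the static part after it.
    return f"Current time is {now.isoformat()}. You are a helpful assistant..."

a = build_prompt(datetime(2024, 10, 1, 12, 0, 0))
b = build_prompt(datetime(2024, 10, 1, 12, 0, 1))

# The shared prefix ends where the timestamps diverge, so almost
# none of the previous prompt is reusable as a cached prefix.
print(commonprefix([a, b]))  # "Current time is 2024-10-01T12:00:0"
```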
I was wondering if the prompt could be restructured to better support this new prompt caching feature: at a minimum by moving the time further down in the prompt, and ideally also by sorting the past states in reverse order of last updated, so that the most stable content forms the front of the prompt.
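A possible restructuring could look like the sketch below (only an illustration; `State`, `STATIC_INSTRUCTIONS`, and `build_cache_friendly_prompt` are made-up names, not the integration's real internals): keep the static instructions first, put the least-recently-updated states next, and append the timestamp last, so the longest possible prefix stays identical between requests.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class State:
    entity_id: str
    value: str
    last_updated: datetime

STATIC_INSTRUCTIONS = "You are a helpful assistant..."  # stable part of BASE_PROMPT

def build_cache_friendly_prompt(states: list[State], now: datetime) -> str:
    # Least-recently-updated states first: entities that rarely change
    # form a stable prefix, while frequently changing ones sink toward
    # the end of the prompt, where a mismatch costs the least.
    ordered = sorted(states, key=lambda s: s.last_updated)
    state_lines = "\n".join(f"{s.entity_id}: {s.value}" for s in ordered)
    # The timestamp goes last so it never invalidates the cached prefix.
    return f"{STATIC_INSTRUCTIONS}\n{state_lines}\nCurrent time is {now.isoformat()}."
```

One caveat worth noting from the docs: OpenAI's caching only applies to prompts of 1024 tokens or more and matches in 128-token increments, so the reordering mainly pays off for longer prompts.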