Controlling num_ctx in the ollama integration

My smart home has many entities, and when I asked llama3:8b some questions via the ollama integration the responses were really disappointing. At some point I realized that the default ollama num_ctx is 2k, and after changing that to 8k I started getting sensible responses when running my prompts manually.

Creating a new Modelfile with num_ctx set has solved the problem for me for now, but there is also a way to set num_ctx via the ollama API (see ollama/docs/ at 105186aa179c7ccbac03d6719ab1c58ab87d6477 · ollama/ollama · GitHub), and that would help with experimenting with other models.
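For reference, here is roughly what the Modelfile workaround looks like, and the equivalent per-request API option. The model name `llama3-8k` is just an example; the `num_ctx` parameter and the `options` field of the `/api/generate` endpoint are documented in the ollama docs linked above.

```shell
# Modelfile: derive a variant of llama3:8b with a larger context window
cat > Modelfile <<'EOF'
FROM llama3:8b
PARAMETER num_ctx 8192
EOF

# Build the new model and use it in place of llama3:8b
ollama create llama3-8k -f Modelfile

# Alternatively, set num_ctx per request via the API (no new model needed),
# which makes it easy to experiment with other models:
curl http://localhost:11434/api/generate -d '{
  "model": "llama3:8b",
  "prompt": "Which lights are on?",
  "options": { "num_ctx": 8192 }
}'
```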