Updating all Ollama Language Models with a Single Command, Docker, and a Cronjob
Date: March 5, 2024 | Categories: Docker, Ollama

As a developer, you might need to upgrade multiple AI models to their latest versions using the command-line tool Ollama. However, pulling each model by hand can be time-consuming and error-prone if you have many models installed locally. In this article, we’ll show you how to automate this process using a condensed approach with awk and xargs.

ollama list | tail -n +2 | awk '{print $1}' | xargs -I {} ollama pull {}

The pipeline skips the header line of the ollama list output with tail, extracts the model name from the first column with awk, and feeds each name to ollama pull via xargs.
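
If you prefer something easier to read than a one-liner, the same logic can be written as a small loop. This is only a minimal sketch, assuming ollama is on your PATH and that the first column of the ollama list output is the model name and tag:

#!/usr/bin/env bash
# update-ollama-models.sh - minimal sketch of the same update logic as a loop.
# Assumes `ollama` is on the PATH and that the first column of `ollama list`
# is the model name and tag (e.g. llama2:latest).
set -euo pipefail

ollama list | tail -n +2 | awk '{print $1}' | while read -r model; do
    echo "Updating ${model}..."
    ollama pull "${model}"
done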

If you have Ollama running in Docker, you can use the following:

docker exec ollama ollama list | tail -n +2 | awk '{print $1}' | xargs -I {} docker exec ollama ollama pull {}
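
Note that the command above starts a new docker exec for every model. If you’d rather run the whole pipeline inside the container with a single exec, something along these lines should also work (a sketch, assuming the container image provides sh, tail, awk and xargs):

docker exec ollama sh -c 'ollama list | tail -n +2 | awk "{print \$1}" | xargs -I {} ollama pull {}'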

Here is a crontab line you can add to update all models inside a Docker container named ollama every day at midnight:

@daily docker exec ollama ollama list | tail -n +2 | awk '{print $1}' | xargs -I {} docker exec ollama ollama pull {}
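
Cron jobs run with a minimal environment and PATH, so a slightly more robust setup is to put the pipeline in a small script and log its output. Below is a sketch; the script path, docker path and log file are assumptions you should adapt to your system:

#!/usr/bin/env bash
# /usr/local/bin/update-ollama-models.sh - hypothetical location, adjust as needed.
# Assumes docker is at /usr/bin/docker and the container is named "ollama".
set -euo pipefail

/usr/bin/docker exec ollama ollama list \
    | tail -n +2 \
    | awk '{print $1}' \
    | xargs -I {} /usr/bin/docker exec ollama ollama pull {}

The crontab entry then just calls the script and appends its output to a log file (pick a log path your cron user can write to):

@daily /usr/local/bin/update-ollama-models.sh >> /var/log/ollama-update.log 2>&1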

Conclusion

In conclusion, a condensed approach with awk and xargs can help you automate the process of upgrading multiple installed Ollama models from the command line. This saves time and reduces the potential for error compared to pulling each model individually. Additionally, with a crontab entry you can schedule the update and have it run while you’re sleeping.
