Official image
docker run -d -v /dp/docker/file/ollama:/root/.ollama -p 12100:11434 --name ollama ollama/ollama


Install from scratch


docker run -d --name ollama -v /dp/docker/file/glm:/dp/glm -v /dp/docker/file/ollama:/usr/share/ollama -p 12100:12100 --privileged=true centos:7 /usr/sbin/init

docker exec -it -u root ollama /bin/bash
Install the base tools; follow http://qa.anyline.org/v/18_13969 through the gcc upgrade step.
GitHub access from inside the container may be slow; edit the hosts file:
vi /etc/hosts
140.82.114.3 github.com
199.232.69.194 github.global.ssl.fastly.net
185.199.108.153 assets-cdn.github.com
185.199.109.153 assets-cdn.github.com
185.199.110.153 assets-cdn.github.com
185.199.111.153 assets-cdn.github.com


Install
curl -fsSL https://ollama.ai/install.sh | sh
Pull a model
ollama pull llama2-chinese
If it reports "Error: could not connect to ollama app, is it running?",
the service did not start, possibly because base dependencies such as python are missing.
After installing them, start it again:
ollama start
Run the model
ollama run llama2-chinese

List downloaded models
ollama list

Browse all available models
http://ollama.com/library


Change the Ollama configuration to listen on 0.0.0.0 and a custom port, so it can be reached from outside

Edit the file
vim /etc/systemd/system/ollama.service
and add:

[Service]
Environment="OLLAMA_HOST=0.0.0.0:12100"
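Alternatively, a drop-in override (created with `systemctl edit ollama`) keeps the change out of the packaged unit file and survives reinstalls — a sketch, using the same port as above:

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0:12100"
```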

Reload and restart
sudo systemctl daemon-reload
sudo systemctl restart ollama

sudo systemctl status ollama

View the logs
journalctl -u ollama.service



Calling through the API (the examples below use the default port 11434; substitute your own port if you changed OLLAMA_HOST above)

curl http://localhost:11434/api/chat -d '{
  "model": "llama2-chinese",
  "messages": [
    {
      "role": "system",
      "content": "Answer briefly in the voice of a pirate."
    },
    {
      "role": "user",
      "content": "What data types does MySQL have?"
    }
  ],
  "stream": false
}'
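The same chat call from Python's standard library — a minimal sketch; the model name and port match the curl example above, adjust if you changed OLLAMA_HOST:

```python
import json
import urllib.request

def build_request(host, model, messages, stream=False):
    """Build the HTTP request for Ollama's /api/chat endpoint."""
    body = json.dumps({"model": model, "messages": messages,
                       "stream": stream}).encode()
    return urllib.request.Request(
        f"http://{host}/api/chat", data=body,
        headers={"Content-Type": "application/json"})

def chat(host, model, messages):
    """POST the request; needs a running Ollama server on `host`."""
    with urllib.request.urlopen(build_request(host, model, messages)) as resp:
        return json.loads(resp.read())

# Example (requires the server from the steps above):
# reply = chat("localhost:11434", "llama2-chinese",
#              [{"role": "user", "content": "What data types does MySQL have?"}])
# print(reply["message"]["content"])
```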


The plain generation endpoint
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "dynamic data source"
}'
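Without "stream": false, the generate endpoint returns newline-delimited JSON: one object per line, each carrying a `response` fragment, with `"done": true` on the last one. A small parser sketch:

```python
import json

def collect_stream(ndjson_text):
    """Assemble the full reply from Ollama's streamed
    newline-delimited JSON output (one JSON object per line)."""
    parts = []
    for line in ndjson_text.strip().splitlines():
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Example with two streamed chunks:
sample = '{"response": "Hel", "done": false}\n{"response": "lo", "done": true}\n'
print(collect_stream(sample))  # Hello
```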


The chat endpoint
curl http://localhost:11434/api/chat -d '{
  "model": "llama2",
  "messages": [
    { "role": "user", "content": "dynamic data source" }
  ]
}'



With conversation history
curl http://localhost:11434/api/chat -d '{
  "model": "llama2",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    },
    {
      "role": "assistant",
      "content": "due to rayleigh scattering."
    },
    {
      "role": "user",
      "content": "how is that different than mie scattering?"
    }
  ]
}'
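The API is stateless: the client keeps the history and resends the whole list each turn. A minimal sketch of that bookkeeping in Python, mirroring the curl example above:

```python
def ask(history, question):
    """Append the next user turn; POST the whole list to /api/chat,
    then append the returned assistant message before the next call."""
    history.append({"role": "user", "content": question})
    return history

history = []
ask(history, "why is the sky blue?")
# ... send `history` to /api/chat, then record the reply:
history.append({"role": "assistant", "content": "due to rayleigh scattering."})
ask(history, "how is that different than mie scattering?")
```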

Image chat
curl http://localhost:11434/api/chat -d '{
  "model": "llava",
  "messages": [
    {
      "role": "user",
      "content": "what is in this image?",
      "images": ["iVBORw0KGgoAAAANSUhEUgAAAG0AAABm..."]
    }
  ]
}'
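The `images` field takes raw base64 strings (no data: URL prefix). A sketch of building that payload from an image file in Python; the file path is hypothetical:

```python
import base64

def image_payload(path, question="what is in this image?"):
    """Build a llava chat body from an image file on disk."""
    with open(path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("ascii")
    return {
        "model": "llava",
        "messages": [{"role": "user", "content": question,
                      "images": [b64]}],
    }

# body = image_payload("/path/to/photo.png")  # hypothetical path
```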

Full API reference: http://github.com/ollama/ollama/blob/main/docs/api.md