# Ollama Local Model Deployment

- Download and install: [link](https://ollama.com/download)
  - Installation notes
- Deployment and usage
  - Check in a terminal that Ollama installed correctly
  - View the Ollama command help
  - List the locally downloaded models
  - Start interactive chat mode
  - By default, Ollama also serves an HTTP API
- External API access
  - Using the Postman web version
  - Fork the Ollama collection: [ollama-postman](https://www.postman.com/postman-student-programs/ollama-api/collection/suc47x8/ollama-rest-api)
- That's it: local deployment and API access are complete. Enjoy!
## Download and install: [link](https://ollama.com/download)

### Installation notes

Ollama supports Windows, Linux, and macOS; this article installs on Windows. Download `OllamaSetup.exe` from the link above and follow the installer's prompts to finish the setup.
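For reference, the Linux install is a single script from the official site (macOS users get a regular app bundle from the same download page):

```sh
# Official Linux install script from ollama.com; it downloads the
# ollama binary and sets up the background service.
curl -fsSL https://ollama.com/install.sh | sh
```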
## Deployment and usage

### Check in a terminal that Ollama installed correctly
```
ollama -v
ollama version is 0.3.9
```
### View the Ollama command help

```
ollama help
Large language model runner

Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
```
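Two of the listed commands that come in handy later (the model name here is just one from this article's setup):

```sh
ollama ps        # show which models are currently loaded in memory
ollama rm qwen2  # delete a locally downloaded model to free disk space
```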
### List the locally downloaded models

`ollama list` shows the models already pulled to this machine; the full catalog of models available for download lives at [ollama.com/library](https://ollama.com/library).
```
ollama list
NAME            ID              SIZE    MODIFIED
llama3.1:latest f66fc8dc39ea    4.7 GB  4 days ago
qwen2:latest    e0d4e1163c58    4.4 GB  2 months ago
llama3:latest   365c0bd3c000    4.7 GB  2 months ago
```
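If a model you want is not on the list yet, pull it from the registry first:

```sh
# Download a model (and optionally a tag) from the Ollama registry;
# run `ollama list` again afterwards to confirm it arrived.
ollama pull llama3.1
```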
### Start interactive chat mode

`ollama run` opens an interactive session with the model; inside it, slash commands such as `/help` and `/show` inspect the session and the model:
```
ollama run llama3.1
>>> who are you
I'm an artificial intelligence model known as Llama. Llama stands for "Large Language Model Meta AI."

>>> /help
Available Commands:
  /set            Set session variables
  /show           Show model information
  /load <model>   Load a session or model
  /save <model>   Save your current session
  /clear          Clear session context
  /bye            Exit
  /?, /help       Help for a command
  /? shortcuts    Help for keyboard shortcuts

Use """ to begin a multi-line message.

>>> /show
Available Commands:
  /show info         Show details for this model
  /show license      Show model license
  /show modelfile    Show Modelfile for this model
  /show parameters   Show parameters for this model
  /show system       Show system message
  /show template     Show prompt template

>>> /show info
  Model
    arch                llama
    parameters          8.0B
    quantization        Q4_0
    context length      131072
    embedding length    4096

  Parameters
    stop    "<|start_header_id|>"
    stop    "<|end_header_id|>"
    stop    "<|eot_id|>"

  License
    LLAMA 3.1 COMMUNITY LICENSE AGREEMENT
    Llama 3.1 Version Release Date: July 23, 2024

>>> /show template
{{- if or .System .Tools }}<|start_header_id|>system<|end_header_id|>
{{- if .System }}{{ .System }}
{{- end }}
{{- if .Tools }}Cutting Knowledge Date: December 2023

When you receive a tool call response, use the output to format an answer to the orginal user question.

You are a helpful assistant with tool calling capabilities.
{{- end }}
{{- end }}<|eot_id|>
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 }}
{{- if eq .Role "user" }}<|start_header_id|>user<|end_header_id|>
{{- if and $.Tools $last }}Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.

Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}. Do not use variables.

{{ range $.Tools }}
{{- . }}
{{ end }}
{{- end }}{{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>{{ end }}
{{- else if eq .Role "assistant" }}<|start_header_id|>assistant<|end_header_id|>
{{- if .ToolCalls }}{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "parameters": {{ .Function.Arguments }}}{{ end }}
{{- else }}{{ .Content }}{{ if not $last }}<|eot_id|>{{ end }}
{{- end }}
{{- else if eq .Role "tool" }}<|start_header_id|>ipython<|end_header_id|>{{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>{{ end }}
{{- end }}
{{- end }}

>>> what yuo want for me
I don't have personal desires or expectations from you. I exist to assist and provide information to the best of
my abilities.

However, if you'd like to engage in conversation or ask questions on a particular topic, I'm here to help!

Here are some ways we can interact:

1. **Ask me anything**: You can ask me any question on any topic.
2. **Learn something new**: I can provide information on various subjects, from science and history to
   entertainment and culture.
3. **Chat about a hobby**: If you have a particular interest or hobby, feel free to discuss it with me.
4. **Get help with a problem**: If you're stuck with a problem or need advice, I'll do my best to assist you.

What sounds interesting to you?

>>> Send a message (/? for help)
```
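A side note: `ollama run` also accepts the prompt directly as an argument, which is handy for one-shot questions (the prompt text here is just an illustration):

```sh
# One-shot usage: the model prints its answer and the command exits
# instead of opening an interactive session.
ollama run llama3.1 "Explain in one sentence what Q4_0 quantization means."
```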
### By default, Ollama also serves an HTTP API

Whenever Ollama is running, it exposes a REST API at `http://localhost:11434`, so other programs can call the local model without the interactive shell.
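A quick smoke test with curl (any HTTP client works); setting `"stream": false` returns the whole answer as a single JSON object instead of a token stream:

```sh
# Ask the local server for a completion via the /api/generate endpoint.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```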
## External API access

### Using the Postman web version

The Postman web client cannot reach `localhost` directly, so you need to download and run the Postman desktop agent yourself.

### Fork the Ollama collection

Fork the ready-made Ollama REST API collection into your own workspace and send its requests against your local server: [ollama-postman](https://www.postman.com/postman-student-programs/ollama-api/collection/suc47x8/ollama-rest-api)
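If you'd rather skip Postman entirely, the same endpoints can be exercised from the terminal; the chat endpoint, for example, takes a message history instead of a single prompt:

```sh
# Multi-turn style request against the /api/chat endpoint.
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [
    { "role": "user", "content": "who are you" }
  ],
  "stream": false
}'
```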
That's it: local deployment and API access are complete. Enjoy!