Choosing a deployment tool for large language models doesn't have to be hard. Using the DeepSeek-R1 32B model as a running example, this article is a hands-on selection guide for Ollama and llama.cpp. Core topics:
1. The background of Ollama and llama.cpp as LLM deployment tools, and how they differ
2. The technical relationship between Ollama and llama.cpp, and their underlying implementations
3. Performance evaluation and deployment practice for both tools with the DeepSeek-R1 32B model
Create a Modelfile (saved here as deepseek-r1-32b.modelfile) whose FROM line points at the downloaded GGUF:

FROM ./bartowski/DeepSeek-R1-Distill-Qwen-32B-Q5_K_M.gguf

Then register and run the model with Ollama. Note that -f takes the path to the Modelfile, not to the GGUF itself:

ollama create my-deepseek-r1-32b-gguf -f .\deepseek-r1-32b.modelfile
ollama run my-deepseek-r1-32b-gguf:latest
Check how the loaded model is scheduled with ollama ps:

NAME                            ID            SIZE   PROCESSOR        UNTIL
my-deepseek-r1-32b-gguf:latest  ad9f11c41b7a  25 GB  87%/13% CPU/GPU  3 minutes from now
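Beyond the interactive CLI, the model can also be exercised programmatically: Ollama serves a local HTTP API, by default on http://localhost:11434. Below is a minimal Go sketch against the /api/generate endpoint, assuming the default endpoint and the model name created above; the request/response structs cover only the fields this example uses.

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// Request/response shapes for Ollama's /api/generate endpoint
// (non-streaming mode; only the fields used here).
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

type generateResponse struct {
	Response string `json:"response"`
}

func main() {
	body, err := json.Marshal(generateRequest{
		Model:  "my-deepseek-r1-32b-gguf:latest",
		Prompt: "Why is the sky blue?",
		Stream: false, // return one complete response instead of a token stream
	})
	if err != nil {
		panic(err)
	}

	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out generateResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out.Response)
}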
Build llama.cpp following the official build guide (the Git Bash / MinGW64 section):
https://github.com/ggml-org/llama.cpp/blob/master/docs/build.md#git-bash-mingw64

Then load the model with llama-cli:

build/bin/Release/llama-cli -m "/path/to/DeepSeek-R1-Distill-Qwen-32B-Q5_K_M.gguf" -ngl 100 -c 16384 -t 10 -n -2 -cnv

Here -ngl 100 offloads up to 100 layers to the GPU, -c 16384 sets the context window, -t 10 uses 10 CPU threads, -n -2 keeps generating until the context is full, and -cnv enables interactive conversation mode.
With -ngl 100 on a GPU whose VRAM cannot hold the offloaded layers, the Vulkan backend fails to allocate memory and the load aborts:

ggml_vulkan: Device memory allocation of size 1025355776 failed.
ggml_vulkan: vk::Device::allocateMemory: ErrorOutOfDeviceMemory
llama_model_load: error loading model: unable to allocate Vulkan0 buffer
llama_model_load_from_file_impl: failed to load model
common_init_from_params: failed to load model 'D:/llm/Model/bartowski/DeepSeek-R1-Distill-Qwen-32B-Q5_K_M.gguf'
main: error: unable to load model
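Unlike Ollama, llama-cli does not fall back automatically: if the requested -ngl layers don't fit in VRAM, the load simply fails, and you must lower -ngl yourself. A back-of-the-envelope way to pick the value is to estimate how many layers fit in the VRAM budget. The Go sketch below shows that arithmetic; the function name, the flat reserve term, and the example numbers are illustrative assumptions, not llama.cpp API.

package main

import "fmt"

// fittableLayers estimates how many transformer layers fit in a VRAM
// budget, assuming layers are roughly equal in size. It mirrors, in
// spirit, what Ollama's EstimateGPULayers does automatically; the
// reserve term is a crude placeholder for KV cache and compute-graph
// memory, which Ollama models in much more detail.
func fittableLayers(modelBytes, numLayers, vramBytes, reserveBytes uint64) uint64 {
	if vramBytes <= reserveBytes || numLayers == 0 {
		return 0
	}
	perLayer := modelBytes / numLayers
	layers := (vramBytes - reserveBytes) / perLayer
	if layers > numLayers {
		layers = numLayers
	}
	return layers
}

func main() {
	const GiB = 1024 * 1024 * 1024
	// Illustrative scenario: the Q5_K_M 32B GGUF is roughly 23 GiB on
	// disk, the Qwen2.5-32B base has 64 transformer layers, and we
	// assume a 12 GiB card with ~2 GiB reserved for KV cache and graph.
	fmt.Println(fittableLayers(23*GiB, 64, 12*GiB, 2*GiB)) // pass this value to -ngl
}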
Ollama avoids this failure mode because it predicts a CPU/GPU split before loading the model. From Ollama's source:

// Given a model and one or more GPU targets, predict how many layers and bytes we can load, and the total size
// The GPUs provided must all be the same Library
func EstimateGPULayers(gpus []discover.GpuInfo, f *ggml.GGML, projectors []string, opts api.Options) MemoryEstimate {
	// Graph size for a partial offload, applies to all GPUs
	var graphPartialOffload uint64

	// Graph size when all layers are offloaded, applies to all GPUs
	var graphFullOffload uint64

	// Final graph offload once we know full or partial
	var graphOffload uint64
	...
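This estimate is what produced the 87%/13% CPU/GPU split seen in the ollama ps output above: the layers that fit in VRAM are offloaded and the rest stay on the CPU, so the model loads (more slowly) instead of failing outright the way llama-cli did with an over-ambitious -ngl.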