1 file changed: +27 -0 lines changed
     - filename: NousResearch_Hermes-4-14B-Q4_K_M.gguf
       sha256: 7ad9be1e446e3da0c149fdf55284c90be666d3e13c6e2581587853f4f9538073
       uri: huggingface://bartowski/NousResearch_Hermes-4-14B-GGUF/NousResearch_Hermes-4-14B-Q4_K_M.gguf
+- !!merge <<: *qwen3
+  name: "minicpm-v-4_5"
+  license: apache-2.0
+  icon: https://avatars.githubusercontent.com/u/89920203
+  urls:
+    - https://huggingface.co/openbmb/MiniCPM-V-4_5-gguf
+    - https://huggingface.co/openbmb/MiniCPM-V-4_5
+  description: |
+    MiniCPM-V 4.5 is the latest and most capable model in the MiniCPM-V series. The model is built on Qwen3-8B and SigLIP2-400M with a total of 8B parameters.
+  tags:
+    - llm
+    - multimodal
+    - gguf
+    - gpu
+    - qwen3
+    - cpu
+  overrides:
+    mmproj: minicpm-v-4_5-mmproj-f16.gguf
+    parameters:
+      model: minicpm-v-4_5-Q4_K_M.gguf
+  files:
+    - filename: minicpm-v-4_5-Q4_K_M.gguf
+      sha256: c1c3c33100b15b4caf7319acce4e23c0eb0ce1cbd12f70e8d24f05aa67b7512f
+      uri: huggingface://openbmb/MiniCPM-V-4_5-gguf/ggml-model-Q4_K_M.gguf
+    - filename: minicpm-v-4_5-mmproj-f16.gguf
+      uri: huggingface://openbmb/MiniCPM-V-4_5-gguf/mmproj-model-f16.gguf
+      sha256: 251abb778cf7a23b83774ee6ef34cb3652729a95624e088948f2e8a5a0cd03a1
 - &gemma3
   url: "github:mudler/LocalAI/gallery/gemma.yaml@master"
   name: "gemma-3-27b-it"