I am new to Ollama. I am using the llama3 model and would like to fine-tune it with Hugging Face tools to get better responses. I am curious how to do that, and will my data be stored on Hugging Face's servers if I do?