thirdBreakfast@lemmy.world to Selfhosted@lemmy.world • Guide to Self Hosting LLMs Faster/Better than Ollama
If it’s an M1, you definitely can, and it will work great with Ollama.
+1 for Forgejo. I started on Gogs, then gathered that there had been some drama with that and Gitea. Forgejo is FOSS, simple to get going, and comfortable to use if you’re coming from GitHub. It’s actively maintained, and communication with the project is great.
Guide to Self Hosting LLMs with Ollama.
```
ollama run llama3.2
```
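Besides the CLI, Ollama also listens on a local HTTP API (port 11434 by default), which is handy for scripting against a self-hosted model. A minimal sketch with only the standard library — the actual send is commented out because it assumes an Ollama server is already running on this machine:

```python
import json
from urllib import request

def build_generate_request(model, prompt, host="http://localhost:11434"):
    """Build an HTTP POST for Ollama's /api/generate endpoint.

    `stream: False` asks for one complete JSON response instead of
    a stream of partial chunks.
    """
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
    return request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("llama3.2", "Why is the sky blue?")

# To actually run it (requires `ollama serve` and the model pulled):
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

Same idea works from anything that can speak HTTP, which is part of why Ollama is so easy to wire into other self-hosted apps.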