I'm running my LLMs locally now!
I've slept on open LLMs like Gemma, Llama, Qwen, and DeepSeek for way too long.
They're AMAZING!
Using them is simple (with tools like LM Studio or Ollama), and in my experience they're not far behind the "big" hosted models you can access via ChatGPT etc.
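For example, with Ollama installed, downloading and chatting with a model takes just a couple of terminal commands (the model name below is only an example; browse Ollama's model library to see what's available):

```shell
# Download an open model to your machine (model name is an example)
ollama pull llama3.2

# Start an interactive chat session in the terminal
ollama run llama3.2

# Or send a single prompt non-interactively
ollama run llama3.2 "Explain closures in JavaScript in two sentences."
```

These commands require a local Ollama installation; LM Studio offers a similar workflow through a graphical interface instead.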
I'm using them for all kinds of tasks...
I'm truly excited about running LLMs locally - that's why I created a course sharing my experience and knowledge on using and configuring LM Studio and Ollama.
👉 https://acad.link/local-llms
My Website: https://maximilian-schwarzmueller.com/
Socials:
👉 Twitch: https://www.twitch.tv/maxedapps
👉 X: https://x.com/maxedapps
👉 Udemy: https://www.udemy.com/user/maximilian-schwarzmuller/
👉 LinkedIn: https://www.linkedin.com/in/maximilian-schwarzmueller/
Want to become a web developer or expand your web development knowledge?
I have multiple bestselling online courses on React, Angular, NodeJS, Docker & much more!
👉 https://academind.com/courses