#selfhosting
#selfhosting boosted
poddingue
@poddingue@mastodon.online · 2 days ago

AMD dropped ROCm support for Radeon VII (gfx906) at v6.2.4. I wanted to run local LLMs on it anyway.

Found a community-maintained image on r/LocalLLaMA that packages ROCm 7.1 with llama.cpp for gfx906. One docker pull later, I had llama.cpp + Ollama + Qdrant + Open WebUI running on "unsupported" hardware.

Docker Hub is a library of solved problems.

Full story: https://bit.ly/4pTk3zf

#Docker #DockerCaptain #LocalLLM #AMD #ROCm #OpenSource #SelfHosted #MachineLearning
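Once the containers are up, the quickest sanity check is to hit Ollama's REST API directly. A minimal sketch, assuming Ollama's default port 11434 and an illustrative model tag ("llama3") that is not from the post; swap in whatever model you actually pulled onto the card:

import json
import urllib.request

# Ollama's default generate endpoint; adjust host/port if you remapped them.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",   # hypothetical model tag; use whichever model you loaded
    "prompt": "Say hello from a Radeon VII.",
    "stream": False,     # ask for one JSON object instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])

If that prints a reply, the container's ROCm 7.1 userspace and the llama.cpp/Ollama layer on top of it are doing inference on the gfx906 card despite the official support cut-off.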
