You got me fired up again about self-hosted LLMs for coding, @skalyan.
I have a GTX 970 in a machine with 32 GB of RAM. I installed Ubuntu 24.04 and Ollama. I had to fiddle with Ollama a bit to get it listening on 0.0.0.0 instead of 127.0.0.1.
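For anyone following along: since Ollama on Ubuntu runs as a systemd service, the usual way to do this is a drop-in override setting Ollama's documented `OLLAMA_HOST` variable — a sketch, assuming the default service name `ollama.service`:

```ini
# sudo systemctl edit ollama.service
# then paste this override, save, and run:
#   sudo systemctl daemon-reload && sudo systemctl restart ollama
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```

After restarting, `ss -tlnp | grep 11434` should show it bound to 0.0.0.0 instead of 127.0.0.1.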
I then tried opencode on another machine. I had a chat with Copilot about which environment variables to set to point opencode at my Ollama instance.
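In case it helps anyone: my reading of the opencode docs is that local providers are configured in `opencode.json` rather than via environment variables, using Ollama's OpenAI-compatible endpoint. A sketch — the host IP and model name here are placeholders, not something from my actual setup:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://192.168.1.50:11434/v1"
      },
      "models": {
        "qwen2.5-coder:7b": {}
      }
    }
  }
}
```

The `baseURL` needs the `/v1` suffix since that's where Ollama exposes its OpenAI-compatible API, and the model has to already be pulled on the Ollama box.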
I asked it a question about some code of mine, and btop seemed to confirm that I was putting a load on that machine's GPU, but it never got into a back-and-forth that yielded a result.
Did I misunderstand your roadmap? Last night I installed Qwen Code, but when you said opencode I assumed that was the better way to go.
How has my journey lined up with yours?