For LLMs, I've had really good results running Llama 3 in the Open WebUI docker container on an Nvidia Titan X (12 GB VRAM).
For image generation, though, I agree more VRAM is better, but the algorithms still struggle with large image dimensions, so you wind up needing to start small and iteratively upscale. Afaik that works ok on weaker GPUs, it just takes longer. (I've been using the AUTOMATIC1111 mode of the Stable Diffusion Web UI docker project.)
I’m on thumbs so I don’t have the links to the git repos atm, but you basically clone them and run the docker compose files. The readmes are pretty good!
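For a sense of what those compose files look like, here's a rough GPU-enabled sketch from memory, not the actual file from either repo; the image tag, port mapping, and volume name are assumptions, so defer to the readme:

```yaml
# Hypothetical docker-compose.yml sketch for an Open WebUI-style container.
# Image tag, ports, and volume paths are assumed; use the repo's own file.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main   # assumed image name
    ports:
      - "3000:8080"                             # host:container, assumed
    volumes:
      - open-webui-data:/app/backend/data       # persist chats and settings
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia                    # expose the Nvidia GPU
              count: all
              capabilities: [gpu]
volumes:
  open-webui-data:
```

After cloning, `docker compose up -d` in the repo directory starts it in the background; GPU passthrough also needs the NVIDIA Container Toolkit installed on the host.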
As a longtime Plex user, I also hate their lack of focus and tendency to prioritize bad features (like paid streaming and VR). But this one feels more like a way to re-focus on video by removing photo code from the main (video) app's codebase, making it easier to maintain.