For LLaVa, we only support self-hosting via the ghcr.io/xfalcox/llava:latest
container image at the moment.
If you have access to a server with a GPU with at least 24GB of VRAM, you can self-host it; otherwise, I recommend sticking with GPT-4V.
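
For reference, here is a minimal sketch of running that image, assuming a host with Docker and the NVIDIA Container Toolkit already set up. The exposed port is an assumption on my part, so check the image's documentation for the actual port it listens on:

```bash
# Run the LLaVa container detached, with all GPUs passed through.
# NOTE: the 8080 host/container port mapping is an assumption —
# verify the real port in the image docs before relying on it.
docker run -d --name llava --gpus all -p 8080:8080 ghcr.io/xfalcox/llava:latest
```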