GPT-4 and GPT-3.5 cannot be self-hosted.
Some LLMs are open source, such as Falcon, or various LLaMA-based models (which come with licensing questions), and those can be self-hosted, but to date they all underperform GPT-4, or even GPT-3.5.
Your back-of-the-napkin calculation there is wildly off. If you are going to self-host an LLM you probably want an A100 or H100, maybe a few of them… try googling the prices…
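To see why, here's a rough sketch of the weights-only memory math (the 70B parameter count is just an illustrative assumption, e.g. a LLaMA-2-70B-class model; KV cache and activations need memory on top of this):

```python
# Rough VRAM estimate for serving an LLM: weights only.
# KV cache, activations, and framework overhead add more on top.
def vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Convert parameter count to GiB of weight memory."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# A 70B-parameter model at fp16 (2 bytes/param) needs ~130 GiB for
# weights alone, which already exceeds a single 80 GB A100.
print(round(vram_gb(70, 2)))    # fp16
print(round(vram_gb(70, 0.5)))  # 4-bit quantized: ~33 GiB
```

Even aggressively quantized, you're into serious-GPU territory, and multi-GPU for anything near fp16.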