Individual users win only if they can run LLM models on their own hardware cheaper, faster, and more freely (free as in software freedom). Otherwise, those mega-stories are of no use.
I agree; hosting a custom model is currently not viable, since both the up-front hardware and the ongoing serving costs are expensive. Whoever can host open-source models more cheaply will win the public's attention. As a startup or individual, you can invest in fine-tuning and training on high-end hardware to build a specialised model and get the weights and pipeline just right, but hosting it yourself and keeping up with demand while staying available is difficult and not sustainable.
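For context, a minimal sketch of what "running an LLM on your own hardware" can look like in practice, using the Hugging Face transformers library; the model name here is just an example of a small open-weights checkpoint and not something mentioned in the comments above, so swap in whatever fits your hardware.

```python
from transformers import pipeline

# Example small open-weights model; any locally runnable checkpoint works.
generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

# Generate a short continuation entirely on local hardware (CPU or GPU).
print(generator("Open-source models matter because", max_new_tokens=50)[0]["generated_text"])
```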