Closed source AI models are not the future (we should want)
The unequal distribution of power/access
Having all of the best models and neural networks centralized in a handful of places creates a completely unbalanced playing field: large corporations get full access to the data and capabilities of these models, while the general public only gets whatever slice is exposed to them. This isn't just another regurgitation of "knowledge is power"; in the future of AI, being able to know is the power. We're all having fun with how useful AI is right now, but a great utility is usually followed by a great reliance on that utility. Just look at the internet today.

Do you really want to have to rely on these large corporations? If not, start looking into downloading and customizing local language models and the managers that run them. And if you're already a data scientist or AI researcher, maybe look into decentralized models, or be the pioneer of the next big one. I don't know, ideas have power.
Local Language Model Managers
GPT4ALL - A free-to-use, locally running, privacy-aware chatbot
LLM - a CLI utility and Python library for interacting with Large Language Models
Ollama - lets you download, customize, and run large language models locally (see the sketch after this list)
Local.ai - A desktop app for local, private, secured AI experimentation.
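To make the "just run it locally" point concrete, here is a minimal sketch of talking to a model through Ollama's local HTTP API. It assumes Ollama is installed and serving on its default port (11434) and that a model such as llama2 has already been pulled; the model name and prompt are just placeholders.

```python
import requests

# Minimal sketch: query a locally running Ollama server (default port 11434).
# Assumes a model (e.g. "llama2") has already been pulled with `ollama pull llama2`.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(prompt: str, model: str = "llama2") -> str:
    """Send a single prompt to the local model and return its full response."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    # Everything stays on your machine: no API key, no third-party server.
    print(ask_local_model("Explain why local language models matter, in one paragraph."))
```

The point isn't this particular tool; any of the managers above gives you the same property: the model, the prompts, and the outputs never leave your hardware.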
Large Language Models
Idea for the future
Distributed language models built on a torrent-like architecture, with some form of quality control in which a trusted subset of contributors or gateways vets incoming contributions before they are merged. A model like this, used and maintained by a large community, could genuinely compete with the flood of data being funneled into big tech right now, while also sidestepping centrally imposed censorship and restrictions. A rough sketch of what such a gateway check might look like is below.
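This is only a thought experiment, so the sketch below is purely illustrative: the Contribution and Gateway names, the dictionary stand-in for model weights, and the regression threshold are all hypothetical, and it deliberately glosses over the hard parts (federated aggregation, sybil resistance, real model evaluation). The one idea it captures is the quality gate: a contribution is merged only if it doesn't hurt performance on a held-out check.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Illustrative sketch of the "gateway" quality-control idea for a
# community-run, torrent-style model. All names and numbers are hypothetical.

@dataclass
class Contribution:
    contributor_id: str
    weight_delta: Dict[str, float]  # stand-in for a real tensor update

class Gateway:
    def __init__(self,
                 evaluate: Callable[[Dict[str, float]], float],
                 max_regression: float = 0.01):
        # `evaluate` scores a candidate set of weights on a held-out set
        # (higher is better); `max_regression` is how much drop we tolerate.
        self.evaluate = evaluate
        self.max_regression = max_regression
        self.weights: Dict[str, float] = {}
        self.accepted: List[str] = []

    def submit(self, contribution: Contribution) -> bool:
        """Merge a contribution only if it doesn't degrade held-out quality."""
        baseline = self.evaluate(self.weights)
        candidate = dict(self.weights)
        for key, delta in contribution.weight_delta.items():
            candidate[key] = candidate.get(key, 0.0) + delta
        if self.evaluate(candidate) >= baseline - self.max_regression:
            self.weights = candidate
            self.accepted.append(contribution.contributor_id)
            return True
        return False
```

In a real torrent-style network each gateway would also re-seed the accepted snapshot to its peers; the point here is only that the community, not a single company, decides what gets merged into the shared model.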