Recent advances in artificial intelligence have been driven by resources available mainly at large tech firms, where tens of thousands of powerful machines and terabytes of data are as plentiful as free granola bars and nap pods.
A new project aims to show that this need not remain the case by assembling the code, data, and computing power required to replicate one of the most ambitious (and potentially useful) AI algorithms of recent years.
EleutherAI (/iˈluθər eɪ. aɪ/) is a “decentralized grassroots collective of volunteer researchers, engineers, and developers focused on AI alignment, scaling, and open source AI research. Founded in July of 2020, our flagship project is the GPT-Neo family of models designed to replicate those developed by OpenAI as GPT-3. Our Discord server is open and welcomes contributors.”
Eleuther’s creators hope that this new AI will become an open source alternative to OpenAI’s GPT-3 language software, which is “the largest language model ever created and can generate amazing human-like text on demand but won’t bring us closer to true intelligence.”
Eleuther is an open source effort to match GPT-3, a powerful language algorithm published in 2020 by OpenAI that, given a text prompt, can often write remarkably coherent English articles. Eleuther is still a long way from matching GPT-3's full capabilities, but last year the researchers released GPT-Neo, a family of models roughly as powerful as GPT-3's smallest version.
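EleutherAI published the GPT-Neo checkpoints on the Hugging Face model hub, so anyone can try them with the `transformers` library. The sketch below shows the idea; the model name and generation parameters are illustrative, and the thin `complete` helper is my own wrapper, not part of either project. Note that the first run downloads the model weights, which are large.

```python
# Sketch: sampling text from an open GPT-Neo checkpoint via the
# Hugging Face `transformers` library. The helper below is a thin
# wrapper so the glue logic can be exercised without the model.
from typing import Callable, List, Dict


def complete(prompt: str, generator: Callable[[str], List[Dict]]) -> str:
    """Run a text-generation callable and return its first continuation."""
    outputs = generator(prompt)
    return outputs[0]["generated_text"]


if __name__ == "__main__":
    # Requires: pip install transformers torch
    # "EleutherAI/gpt-neo-125M" is the smallest released checkpoint;
    # larger variants (1.3B, 2.7B) follow the same pattern.
    from transformers import pipeline

    gen = pipeline("text-generation", model="EleutherAI/gpt-neo-125M")
    text = complete(
        "Open source AI research matters because",
        lambda p: gen(p, max_length=40, do_sample=True),
    )
    print(text)
```

Keeping the pipeline call behind `__main__` means the wrapper can be tested (or swapped for a different backend) without downloading any weights.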
Open sourcing large AI projects makes the technology more accessible at a time when it is increasingly concentrated inside large tech companies. It could also shape who profits from key AI advances, and it raises the possibility of powerful models being misused or misappropriated.
Eleuther is built from published papers that describe how GPT-3 works, because OpenAI has not released the model's code or trained weights; the project aims to change that by producing an open implementation.
The computing power required is perhaps the most significant challenge for any open source AI project. Training GPT-3 consumed cloud computing resources worth several million dollars, and according to OpenAI, the compute used by cutting-edge AI projects increased by about 300,000 times between 2012 and 2018. This is, however, a signal of where the field is headed, and we're pretty excited about it.
Speaking of GPT-3, tools built on it are already writing marketing copy designed to rank well in Google's search results.
Photo via Getty