Want to build your own chatbot for $100? A glimpse into AI’s small, cheap, DIY future

Andrej Karpathy, a former OpenAI researcher and Tesla’s former director of AI, calls his latest project the “best ChatGPT $100 can buy.” 

Called “nanochat,” the open-source project, released yesterday through his AI education startup Eureka Labs, shows how anyone with a single GPU server and about $100 can build their own mini-ChatGPT that can answer simple questions and write stories and poems.

Karpathy, who called nanochat a “micro model,” wrote on X that models like his should be thought of as “very young children” that “don’t have the raw intelligence of their larger cousins.” Scale up your spending to $1,000, however, and such a model “quickly becomes a lot more coherent and can solve simple math/code problems and take multiple choice tests.” 

The announcement garnered millions of views on X, with Shopify CEO Tobi Lütke calling it a “gift” to developers, researchers, and students. But it’s also an example of a growing trend: smaller, cheaper, and more specialized models with fewer parameters, the “knobs” inside a model that get adjusted during training to help it make sense of language, images, or data. Massive large language models (LLMs) may have trillions of parameters and require access to GPUs in the cloud and enormous computational power, while the latest small models may have just a few billion.
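
To make the “parameters as knobs” idea concrete, here is a minimal back-of-the-envelope sketch in Python, not drawn from the article. The layer sizes below are hypothetical and chosen only to illustrate the gap between a micro model in nanochat’s weight class, a roughly 3-billion-parameter small model, and trillion-parameter frontier systems.

```python
# Rough parameter count for a decoder-only transformer, ignoring biases and norms.
# All sizes below are hypothetical, picked only to show relative scale.

def transformer_params(n_layers: int, d_model: int, vocab_size: int) -> int:
    """Approximate parameter count: token embeddings + attention + MLP blocks."""
    embedding = vocab_size * d_model          # token embedding table
    attention = 4 * d_model * d_model         # Q, K, V, and output projections
    mlp = 2 * d_model * (4 * d_model)         # up- and down-projection with 4x expansion
    per_layer = attention + mlp
    return embedding + n_layers * per_layer

# A "micro" model: roughly a hundred million parameters.
print(f"micro: {transformer_params(n_layers=12, d_model=768, vocab_size=50_000):,}")
# A small open model in the ~3-billion-parameter range.
print(f"small: {transformer_params(n_layers=32, d_model=2_560, vocab_size=100_000):,}")
# Frontier systems are reported to reach hundreds of billions to trillions of parameters.
```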

With fewer parameters, these small models don’t try to match the power of frontier models like GPT-5, Claude, and Gemini. But they are good enough for specific tasks, affordable to train, lightweight enough to use on devices like phones and laptops, and easy for startups, researchers, and hobbyists to build and deploy. 

The small-model approach was echoed by researchers at Samsung AI Lab last week, who released a paper introducing their Tiny Recursive Model. Built on a new neural network architecture, it is remarkably efficient on complex reasoning and puzzle tasks such as Sudoku, outperforming popular LLMs while using a minuscule fraction of the computational resources.

There has been a wave of other organizations releasing small AI models, showing that size isn’t everything when it comes to power. Last week, Israel’s AI21 unveiled Jamba Reasoning 3B, a 3-billion-parameter open-source model that can “remember” and reason over massive amounts of text and run at high speed even on consumer devices. In September, the UAE’s Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) and G42 introduced K2 Think, an open-source reasoning model with only 32 billion parameters that in trials rivaled systems more than 20 times its size. Meanwhile, big tech companies like Google, Microsoft, IBM, and OpenAI have all joined the small-but-mighty club, with models that are a fraction of the size of their bigger counterparts.

Much of this momentum traces back to China’s DeepSeek, whose lean, low-cost models upended industry assumptions at the beginning of this year and kicked off a race to make AI smaller, faster, and smarter. But it’s important to note that these models, while impressive, aren’t designed to match the broad capabilities of frontier systems like GPT-5. Instead, they’re built for narrower, specialized tasks—and often shine in specific use cases. 

For example, this week IBM Research, along with NASA and others, released open-source, “drastically smaller” versions of its Prithvi and TerraMind Earth-observation models that can run on almost any device, from satellites orbiting Earth to the smartphone in your pocket, all while maintaining strong performance. “These models could reshape how we think about doing science in regions far from the lab—whether that’s in the vacuum of space or the savanna,” the company wrote in a blog post.

None of this means the era of massive, trillion-parameter models is coming to an end. As companies like OpenAI, Google, and Anthropic push toward artificial general intelligence, which demands far broader reasoning capabilities, the largest models will continue to push the frontier. But the rise of smaller, cheaper, and more efficient models shows that AI’s future won’t be defined by size alone.

This story was originally featured on Fortune.com
