The recent G20 summit in Johannesburg focused attention on the rising costs of artificial intelligence and the broader challenge of building a trusted and inclusive digital economy.
In response, South African-born AI company Lelapa AI says it is implementing an efficiency-first approach, shaped by the realities of limited computing power and complex multilingual environments.
AI systems are becoming increasingly expensive to build and maintain: large-scale models require massive computing power, significant energy, extensive datasets, and heavy capital investment. These pressures are widening the gap between countries with mature digital infrastructure and those still working to expand affordable access, strengthen data governance, and build resilient innovation ecosystems.
The G20 Digital Agenda highlights interrelated challenges such as securely managing cross-border data flows, ensuring reliable AI development, supporting MSMEs in the digital economy, strengthening national capacity, and closing gaps in usage and coverage that limit digital participation.
Lelapa AI says its approach offers a technically sound path to addressing these concerns by reducing infrastructure demands, lowering costs, and enabling broader language coverage without compromising performance. According to the company, big tech currently absorbs nearly 90% of the cost of providing models, a burden that is neither sustainable nor scalable globally.
“The G20 has put sustainability and equity at the heart of its digital agenda, and efficiency is essential to both,” said Professor Vukosi Marivate, co-founder of Lelapa AI and knowledge partner of the G20 AI Taskforce. “Our contributions at Lelapa AI highlight the need for low-resource approaches for global AI that can serve diverse economic and linguistic realities.”
Lelapa AI provided the following information about its efficiency-first approach.
Lelapa AI builds language technology designed to work in complex, multilingual, and data-scarce environments. By designing with constraints in mind from the beginning, the company develops models that are affordable, high-performing, and scalable across regions. This approach benefits the Global South and offers a blueprint for high-resource markets facing rising inference costs and growing pressure to reduce environmental impact.
Lelapa AI’s work includes:
- Vulavula: a transcription and translation engine built to work accurately in multilingual, code-switched environments. Its efficiency-first design reduces service costs, making high-quality linguistic intelligence more affordable for governments, businesses, and public institutions.
- InkubaLM: Africa’s first multilingual small language model, designed for low-resource and code-switching environments. Through the Buzuzu Mavi Challenge, the model was scaled down by 75% without compromising performance, proving that efficiency and functionality can go hand in hand.
- The Esethu Framework: a pioneering sustainable data-governance model that centers communities in building and maintaining low-resource language datasets. Its license ensures that foreign users of African language data contribute to future dataset development, creating a self-sustaining ecosystem and an economically sound pathway for supporting underserved languages. The framework offers a blueprint that can be adapted and applied globally.
- ViXSD: the first dataset created under the Esethu Framework, an open-source isiXhosa ASR dataset containing 10 hours of high-quality audio across dialects, ages, and regions. Community-driven development ensures reliability, and the Esethu license ensures continued reinvestment in new datasets and improved economic outcomes.
- Research and applied expertise: Lelapa AI combines rigorous academic research with real-world product development, an unusual pairing in an industry where companies often specialize in one or the other. The company is a world leader in efficient model design, sustainable dataset creation, and scalable linguistic AI.
Together, these innovations have proven that scalability and performance can be achieved without the need to over-build systems or over-compute.
Pelonomi Moiloa, CEO of Lelapa AI, said: “The future of AI lies in efficient design. Scarcity drives sharper thinking, and we find that scalable and sustainable AI doesn’t require endless compute or heavy infrastructure. It starts with smarter foundations that serve real people and real contexts.”
Lelapa AI’s work lies at the intersection of real-world challenges faced by institutions that need to do more with limited resources. Language remains one of the biggest barriers to access in government offices, clinics, classrooms, and community centers.
By designing AI systems that thrive in low-data, low-compute environments, Lelapa AI supports a public sector that needs reliable tools without significant infrastructure costs. This approach gives public institutions the language-enabled tools they need to enhance digital public services, improve multilingual health communication, and reach learners across diverse language contexts.
Across the economy, resource-efficient AI creates room for participation that is often out of reach with higher-cost systems. MSMEs can engage with customers across languages, frontline workers can accurately document interactions, and public service teams can communicate clearly with communities across dozens of languages.
Africa’s contribution to the world
Lelapa AI positions Africa’s resource-conscious innovation as a model for the world. Lelapa’s work centers affordability, efficiency, and language diversity, providing stakeholders with a practical path to building an affordable, inclusive, and sustainable digital ecosystem. This perspective suggests a future where AI enhances society without overburdening the systems that serve it.