With artificial intelligence already reshaping education, jobs, and public services, Bangladesh is no longer preparing for the future — it’s living in it. At a policy discussion held on 24 May 2025 at the University of Dhaka, a diverse group of academics, technologists, educators, and youth leaders gathered to ask the urgent question: Is Bangladesh ready for the age of AI?
Organized by the Institute of Informatics and Development (IID), the event focused not just on the promise of AI, but on its risks — and what Bangladesh must do to seize the opportunities while minimizing harm. The dialogue was opened by Sunjida Rahman, Joint Director of IID, and moderated by futurist Shakil Ahmed. The panel included voices from across academia and civil society, including representatives from BRAC University, NSTU, Teach for Bangladesh, Cholpori, and several youth- and education-focused organizations.
Throughout the discussion, participants voiced both hope and caution. On the optimistic side, AI was seen as a potential game-changer — one that could reduce workloads, improve access to public services, and personalize education in ways previously unimaginable. From agriculture to healthcare to education, speakers emphasized the technology’s power to address long-standing inefficiencies and gaps.
But that optimism was tempered by growing concern. One of the most urgent themes was digital exclusion. Many warned that unless deliberate steps are taken, AI could widen existing inequalities — especially for rural populations, women, and people with disabilities. Participants stressed that inclusive AI must go beyond Bengali language integration to also recognize regional accents and dialects, while offering accessible interfaces for people with low digital literacy or those who rely on assistive technologies.
Another pressing concern was the threat of job displacement, especially for labor-intensive and low-income workers. Several speakers feared that companies may prioritize automation over human well-being, leading to greater precarity for those already at the margins. As one group concluded in breakout discussions: “AI can be both healer and killer — nurturing talent or replacing it, depending on how it’s used.”
The conversation then turned toward what must change. Many participants called for a national AI policy, alongside new Data Protection and Information Acts, and more investment in capacity-building for youth, women, and frontline workers. Women's freedom to use technology emerged as a key issue, with participants noting that even when women have access to smartphones, their usage is often monitored or restricted.
Reliable, affordable internet access also emerged as a foundational demand. Without it, many communities will remain shut out of the digital future altogether. Participants urged the government to expand nationwide coverage and to require service providers to offer inclusive pricing.
One central message echoed across the room: Bangladesh cannot afford to remain a passive consumer of global technologies. It must become an active creator — developing AI tools that reflect its languages, values, and social realities. That means involving not just engineers and officials, but also teachers, artists, community leaders, and everyday citizens in shaping a human-centric AI ecosystem.
The event concluded with a strong call to action: Build a rights-based, inclusive AI future that upholds human dignity and social justice. As AI continues to evolve, Bangladesh’s path will depend on what it chooses to prioritize — profit or people, automation or equity, control or collaboration.
“AI isn’t just a tool. It’s a mirror — it reflects the society that builds it,” one speaker noted.
“Let’s make sure what it reflects is dignity, diversity, and care.”