A speech by Ibtihal Aboussad delivered at the Muslim Council of Britain
Brothers and sisters, respected Imams and community leaders,
Thank you once again for gathering here today to engage on this vital and timely topic. Alhamdulillah, after such a fruitful day, it is clear that we live amid unprecedented and rapid technological change. Innovations once confined to distant laboratories or the realms of science fiction—like artificial intelligence—are now deeply interwoven into our daily lives. AI shapes education, law, medicine, activism, and even how we interact with each other and practice our Deen.
As Muslims, entrusted with truth and guidance, we cannot afford to ignore these developments. There is no need to fear these tools, but we must understand them: how they are used, whom they serve, and what it means to live faithfully in an age where algorithms dictate what we see, what we value, and sometimes even what counts as truth.
History reminds us of similar crossroads. When social media emerged explosively in the early 2010s, many Muslims and Muslim institutions were unprepared. Platforms like Facebook, Instagram, and YouTube began shaping how Muslim youth think about themselves, their identities, and Islam itself. Western companies controlled this narrative, and today, we still find ourselves playing catch-up. Algorithms crafted by others often define what “true Islam” is, while our scholars face censorship, activists are restricted, and young people struggle with confusion or hypervisibility—often without realizing how digital structures shape our real-world experiences.
When advanced surveillance tools arrived, Muslims were frequently their first targets. For example, the UK’s Prevent strategy used predictive tech to label young Muslims as pre-criminals based solely on beliefs and internet activity. Similarly, in Canada, mosques were surveilled, and facial recognition was disproportionately tested in immigrant neighborhoods.
Now AI intensifies these dangers. Some models assign risk based purely on names or locations—a person might be deemed a threat to British society simply because they bear the name of our Prophet (Sallallahu ‘Alayhi wa Sallam) and live near Al-Haram Al-Sharif. In many places, we witness a harrowing new reality: AI developed by tech giants like Microsoft, Google, and Amazon helps generate bombing target lists sent via cellphones. This is no longer abstract ethics—it is genocide accelerated by AI.
Technology itself is not inherently evil. The problem is that we were not there first—to understand it and guide its deployment. This may sound grim, but it is not a call to despair: it is a call to action.
What we see should shake us deeply—not only because of the immense suffering of our Ummah, but because it reveals the terrifying future of warfare, surveillance, and control in the AI era. If we stand by passively while others shape AI, the oppressive systems tested on one people today will be exported globally tomorrow. We are witnessing AI-powered oppression in beta, with real people as its unwilling test subjects.
These AI systems don’t only target; they justify violence. When a model flags a building as containing “threatening infrastructure,” that alone becomes enough justification to bomb it. Machines become shields for war crimes. If we remain ignorant of AI, we risk accepting such justifications blindly in the future.
This is why we cannot afford to be sideline observers. Every day we delay, every day we choose ignorance over engagement, we cede control over these systems—without Muslim voices, without our values, without resistance.
The oppressed are now imprisoned not just physically, but algorithmically. They are crushed not only by bombs, but by biased models that see existence itself as a threat.
What, then, is our response?
Are we building the protections they need? Are we equipping our youth and communities to understand and challenge these models? Are we creating just alternatives, or passively hoping others will handle the technical details while we focus on “more spiritual” matters?
Islam knows no divide between the spiritual and the technological. When Allah (Subhanahu wa Ta’ala) entrusted us with stewardship of this earth, no domain was exempted. The command to enjoin good and forbid evil applies universally—not just in masjids and madrasas.
The Prophet (Sallallahu ‘Alayhi wa Sallam) said: “Whoever sees a wrong action, let him change it with his hand. If not, then with his tongue. If not, then with his heart—and that is the weakest of faith.” Today, these wrongs are automated. To change them with our hands, we must understand who builds them. To change them with our tongues, we must learn the language of their creators.
We cannot afford to repeat history. This technological revolution outpaces all before it. It touches every sphere—law, education, healthcare, theology, security.
If we sleep through this moment again, we will wake to react to damage without preparation—when instead we could be shaping AI to serve our communities.
The good news? It is not too late.
We can prepare, learn, build, and influence decision-making at all levels.
What gives me hope is not only our brilliant Muslim minds and resources, but also a deep moral clarity grounded in submission to Allah (Subhanahu wa Ta’ala) and in centuries of history.
When the Mongols destroyed Baghdad, or Crusaders occupied Al-Quds, or colonial powers carved up the Muslim world, Muslims did not despair or retreat. They adapted, rebuilt, and thrived anew.
Today, faced with AI that accelerates injustice, we face a similar choice: despair or build.
We inherit a tradition that has engaged every tool of its time in the service of truth and justice, from scholars preserving knowledge and architects shaping cities to merchants upholding ethical trade.
When early Muslims met Greek philosophy and Persian governance, they engaged and improved—not rejected.
Today, we face AI. Our response must be the same: engagement, mastery, and adaptation to uphold our values.
Let us rise to this moment and be among those shaping the tools and ethical conversations of tomorrow—not only to protect Muslims but to fulfill our trust as witnesses to truth in every age.
This time, we will not be late to the table. We will lead, inshallah.
The children experiencing today’s AI-powered oppression will grow in a world where AI is even more pervasive. What world will we leave them? One where AI serves injustice? Or one shaped by us to serve justice?
The answer is in our hands—whether we choose engagement or avoidance, learning or waiting, leadership or passive following.
May Allah (Subhanahu wa Ta’ala) guide us to recognize the stakes of our era and to leave legacies that uplift the Deen and our Ummah.
May He protect us from misguidance, empower us with knowledge, and accept our efforts as sincere service.
Ibtihal Aboussad (also transliterated as Ibtihal Aboussaad or Abu Saad) is a Moroccan software engineer, Harvard graduate, and former Microsoft AI engineer who became internationally recognized for her activism on ethical issues in technology. She was fired from Microsoft after publicly protesting and calling out the company’s involvement in providing AI and cloud computing technologies to the Israeli military, which she denounced as enabling oppression and war crimes.
She is a leading organizer of the “No Azure for Apartheid” campaign, advocating against the use of AI for military and surveillance purposes that violate human rights. Her courageous stance has garnered widespread attention in technology ethics and activist circles.
Aboussad holds degrees in Computer Science and Psychology from Harvard University, and her interests focus on the intersection of AI, ethics, and youth empowerment. She was born and raised in Morocco and has previously participated in international STEM exchange programs.
Her movement encourages tech professionals and the wider public to examine the ethical responsibilities of technology companies critically and to advocate for the use of technology in ways that respect human rights and justice.
Ibtihal delivered this speech, “AI and the Future of Muslim Communities,” at an event organized by the Muslim Council of Britain.