“The only true wisdom is in knowing you know nothing.” — Socrates
Unlearning is a process of peeling away the layers of assumptions, beliefs, and behaviors that no longer serve us. It’s not about rejecting everything but about questioning, reflecting, and rebuilding our understanding. Socrates reminds us that true wisdom begins when we admit we don’t have all the answers. This mindset — embracing discomfort and letting go of outdated ideas — is not only personal but deeply tied to the ethical responsibility we have as creators, innovators, and thinkers.
You can turn a switch on or off just by thinking about it, or, even more exciting, fly a drone with the power of your mind. This isn’t magic but the result of technological advances happening all around us. At first glance this seems incredible, and it reminds me of the lecture I gave two years ago, “From Sci-Fi to XR,” where I explored how science fiction has influenced technologies we once thought impossible, like carrying a supercomputer in our pocket.
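To make that concrete: most consumer brain-computer interfaces don’t read thoughts at all. They classify patterns in electrical signals from the scalp, such as the power of the alpha band in an EEG stream. Here is a minimal, hypothetical sketch of that idea; the sampling rate, band, and threshold are illustrative assumptions, not any real headset’s API.

```python
# Hypothetical sketch of "thinking a switch on": measure EEG band power
# and toggle when it crosses a threshold. All numbers are assumed.
import numpy as np

FS = 256          # assumed sampling rate in Hz
ALPHA = (8, 12)   # alpha band, prominent when attention relaxes

def band_power(samples: np.ndarray, band: tuple, fs: int = FS) -> float:
    """Average spectral power of `samples` within `band` (Hz)."""
    freqs = np.fft.rfftfreq(len(samples), d=1 / fs)
    power = np.abs(np.fft.rfft(samples)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(power[mask].mean())

def toggle(samples: np.ndarray, threshold: float, state: bool) -> bool:
    """Flip the switch state when alpha power exceeds the threshold."""
    return not state if band_power(samples, ALPHA) > threshold else state

# Synthetic one-second window standing in for a headset's data stream.
t = np.arange(FS) / FS
window = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(FS)
print(toggle(window, threshold=50.0, state=False))  # -> True
```

The point isn’t the code; it’s how little “mind reading” is actually involved, and how much interpretation we layer on top of a noisy signal.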
While some innovations remain experimental today, the fast pace of development and the integration of AI are rapidly transforming impossibilities into realities. For instance, David Baker, PhD, from the University of Washington School of Medicine, and Timothy Patrick Jenkins, PhD, from the Technical University of Denmark, used deep learning tools to design proteins capable of neutralizing toxins in cobra venom. This is the kind of advancement we celebrate. But innovation comes with responsibility — responsibility to reflect on the “why” and “who” behind what we create.
Unlearning isn’t mechanical; it’s deeply human. It’s tied to emotions, memories, and attachments. As creators, we often need to unlearn outdated assumptions to build better solutions. For example, growing up in two cultures, with my parents representing Iranian traditions and my education rooted in the American system, often left me in an unsettled, in-between state of mind. I was caught between opposing values, constantly questioning which norms to follow.
Over time, I began to unlearn parts of what I had been taught by both cultures. This wasn’t about rejecting either side but about reconciling them to align with my personal ethical standards. It wasn’t a simple or painless process; it involved guilt, self-doubt, and uncomfortable introspection. But that discomfort is where growth happens. The same applies to the systems and tools we design today. When we unlearn the idea that “innovation for its own sake” is inherently good, we open the door to building solutions that are truly ethical and sustainable.
Consider the idea of widespread access. At first glance, it seems like a universal good. But if potent painkillers were distributed without any oversight, the consequences would be disastrous. Similarly, the unchecked release of generative AI tools has already reshaped education, safety, and social well-being, for better and for worse. On one hand, breakthroughs in AI enable medical advances, such as designing cozy, guided physical therapy apps. On the other, the same technology has been used for deepfakes, scams, and manipulation.
This is where unlearning comes in. We must unlearn the assumption that “access equals good” and instead focus on ethical boundaries. Innovation without reflection leads to chaos.
Surveillance is one of the most pressing issues tied to unlearning. As I dive deeper into frameworks that collect personal data (tracking motion, eye movements, heart rate, and neural activity), I’m reminded that we’ve been conditioned to see this as “progress.” But perhaps it’s time to unlearn that mindset. Why are these tools being built? Who do they benefit? And what are the costs?
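One concrete way to act on those questions is data minimization: collect only what a feature genuinely needs, and coarsen or drop everything else before it leaves the device. The sketch below is purely illustrative; the field names and rules are my own invention, not any real XR or neurotech SDK.

```python
# Hypothetical data-minimization pass before telemetry leaves a device.
# Field names and coarsening rules are invented for illustration only.
from dataclasses import dataclass

@dataclass
class RawSample:
    gaze_x: float      # precise eye-tracking coordinates (0..1)
    gaze_y: float
    heart_rate: int    # beats per minute
    eeg_alpha: float   # neural band power
    user_id: str

def minimize(sample: RawSample) -> dict:
    """Keep only what a comfort/safety feature plausibly needs."""
    return {
        # Coarsen gaze to screen thirds instead of exact coordinates.
        "gaze_region": (int(sample.gaze_x * 3), int(sample.gaze_y * 3)),
        # Bucket heart rate; exact numbers identify and profile.
        "hr_band": "elevated" if sample.heart_rate > 100 else "normal",
        # Neural data and identity are dropped entirely: not needed, not sent.
    }

print(minimize(RawSample(0.71, 0.22, 112, 9.4, "user-829")))
# -> {'gaze_region': (2, 0), 'hr_band': 'elevated'}
```

Whether a team writes this kind of filter, or ships the raw stream instead, is exactly the “why” and “who” question in code form.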
Nita Farahany, in her book “The Battle for Your Brain,” highlights the risks neurotechnology poses to cognitive liberty. The U.S. TikTok ban, upheld by the Supreme Court, is a related example of how a platform can serve as a vehicle for mass surveillance and content manipulation. While that decision was justified on national security grounds, it raises broader questions about how we’ve come to accept intrusive technology in our lives.
Unlearning here means challenging the narrative that every form of innovation is a step forward. It’s about asking deeper questions to ensure that what we build aligns with the values of privacy, freedom, and dignity.
On a larger scale, unlearning applies to how we view technology’s environmental footprint. Take data centers: they support global connectivity, but they consume massive amounts of energy and water, leaving behind an undeniable environmental cost. Tesla’s Gigafactories, celebrated for advancing renewable energy solutions, are built around lithium-ion battery production, which raises serious sustainability concerns of its own.
To move forward ethically, we must unlearn the idea that “progress” is inherently good. Progress must be sustainable — aligned with the needs of the planet and future generations. The article “Global Data Center Energy Demand and Strategies to Conserve Energy” highlights the rising global energy demands of data centers driven by the exponential growth in digital services. While energy efficiency improvements have helped stabilize overall consumption, the increasing demand for data storage and processing remains a challenge.
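The shape of that challenge can be made tangible with the industry’s standard efficiency metric, PUE (power usage effectiveness): total facility energy divided by the energy the IT equipment itself uses. A back-of-envelope sketch, with inputs that are assumed for illustration rather than taken from the article:

```python
# Back-of-envelope data center energy estimate using PUE
# (power usage effectiveness = total facility energy / IT energy).
# All inputs are assumed, illustrative values, not measurements.
HOURS_PER_YEAR = 8760

def annual_energy_mwh(it_load_mw: float, pue: float) -> float:
    """Total facility energy over one year, in MWh."""
    return it_load_mw * pue * HOURS_PER_YEAR

baseline = annual_energy_mwh(it_load_mw=10.0, pue=1.6)  # assumed facility
grown = annual_energy_mwh(it_load_mw=15.0, pue=1.2)     # +50% demand, better PUE
print(f"baseline: {baseline:,.0f} MWh/yr, after growth: {grown:,.0f} MWh/yr")
# baseline: 140,160 MWh/yr, after growth: 157,680 MWh/yr
```

The numbers are invented, but the arithmetic mirrors the article’s point: even a substantial efficiency gain can be swallowed whole by growth in demand.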
Unlearning doesn’t mean rejecting everything. It means letting go of what no longer serves us to make room for something better. It’s about discovery: discovery of new ways to think, create, and act. As the saying often attributed to Socrates goes, “Education is the kindling of a flame, not the filling of a vessel.” Unlearning ignites that flame, guiding us toward growth and transformation.
Embrace your being!