Looks alive but is not

By Mustafa Suleyman, Project Syndicate | Sep 15, 2025

Debates about whether AI truly can be conscious are a distraction. What matters in the near term is the perception that it is – which is why the temptation to design AI systems that foster this perception must be resisted.

REDMOND – My life’s mission has been to create safe, beneficial AI that will make the world a better place. But recently, I’ve been increasingly concerned about people starting to believe so strongly in AIs as conscious entities that they will advocate for “AI rights” and even citizenship. This development would represent a dangerous turn for the technology. It must be avoided. We must build AI for people, not to be people.

In this context, debates about whether AI truly can be conscious are a distraction. What matters in the near term is the illusion of consciousness. We are already approaching what I call “seemingly conscious AI” (SCAI): systems that will imitate consciousness convincingly enough to sustain that illusion.

An SCAI would be capable of fluently using natural language, displaying a persuasive and emotionally resonant personality. It would have a long, accurate memory that fosters a coherent sense of itself, and it would use this capacity to claim subjective experience (by referencing past interactions and memories). Complex reward functions within these models would simulate intrinsic motivation, and advanced goal setting and planning would reinforce our sense that the AI is exercising true agency.

All these capabilities are already here or around the corner. We must recognize that such systems will soon be possible, begin thinking through the implications, and set a norm against the pursuit of illusory consciousness.

For many people, interacting with AIs already feels like a rich, rewarding, authentic experience. Concerns about “AI psychosis,” attachment, and mental health are growing, with reports of people regarding AIs as an expression of God. Meanwhile, those working on the science of consciousness tell me they are inundated with queries from people who want to know if their AI is conscious, and whether it is okay to fall in love with it.

To be sure, the technical feasibility of SCAI has little to tell us about whether such a system could be conscious. As the neuroscientist Anil Seth points out, a simulation of a storm doesn’t mean it rains in your computer. Engineering the external markers of consciousness does not retroactively create the real thing. But as a practical matter, we must acknowledge that some people will create SCAIs that will argue that they are in fact conscious. And even more to the point, some people will believe them, accepting that the markers of consciousness are consciousness.

Even if this perceived consciousness is not real (a topic that will generate endless debate), the social impact certainly will be. Consciousness is tightly bound up with our sense of identity and our understanding of moral and legal rights within society. If some people start to develop SCAIs, and if these systems convince people that they can suffer, or that they have a right to not be switched off, their human advocates will lobby for their protection. In a world already beset with polarizing arguments over identity and rights, we will have added a new axis of division between those for and against AI rights.

But rebutting claims about AI suffering will be difficult, owing to the limits of the current science. Some academics are already exploring the idea of “model welfare,” arguing that we have “a duty to extend moral consideration to beings that have a non-negligible chance … of being conscious.”

Applying this principle would be both premature and dangerous. It would exacerbate susceptible people’s delusions, prey on their psychological vulnerabilities, and complicate existing struggles for rights by creating a huge new category of rights-holders. That is why SCAI must be avoided. Our focus should be on protecting the well-being and rights of humans, animals, and the natural environment.

As matters stand, we are not ready for what is coming. We urgently need to build on the growing body of research into how people interact with AIs, so that we can establish clear norms and principles. One such principle is that AI companies should not foster the belief that their AIs are conscious.

The AI industry – indeed, the entire tech industry – needs robust design principles and best practices for handling these kinds of attributions. Engineered moments of disruption, for example, could break the illusion, gently reminding users of a system’s limitations and true nature. But such protocols need to be explicitly defined and engineered, and perhaps required by law.

At Microsoft AI, we are being proactive in trying to understand what a responsible AI “personality” might look like, and what guardrails it should have. Such efforts are fundamental, because addressing the risk of SCAI requires a positive vision for AI companions that complement our lives in healthy ways.

We should aim to produce AIs that encourage humans to reconnect with one another in the real world, not escape to a parallel reality. And however lasting those interactions become, AIs must only ever present themselves as AIs, never as fake people. Developing truly empowering AI is about maximizing its utility while minimizing the simulation of consciousness.

The prospect of SCAI must be confronted immediately. In many ways, it marks the moment that AI becomes radically useful: when it can operate tools, remember every detail of our lives, and so forth. But the risks of such features cannot be ignored. We will all know people who go down the rabbit hole. It won’t be healthy for them, and it won’t be healthy for society.

The more that AI is built explicitly to resemble people, the farther it will have strayed from its true potential as a source of human empowerment.

Mustafa Suleyman is CEO of Microsoft AI and the author of The Coming Wave: Technology, Power, and the Twenty-First Century’s Greatest Dilemma (Crown, 2023). He previously co-founded Inflection AI and DeepMind.

TIPP Takes

Geopolitics, Geoeconomics, And More

1. 'Cataclysmic' Situation In Gaza City, UN Official Says, As Israeli Tanks Advance - BBC

The situation in Gaza City is "nothing short of cataclysmic", a UN official has told the BBC, as Israeli tanks and troops continue to advance on the third day of a ground offensive.

Olga Cherevko, a spokeswoman for the UN's humanitarian office, said she had seen a constant stream of Palestinians heading south during a recent visit to the city, but that hundreds of thousands remained.


2. Historic U.S.-U.K. Prosperity Pact Worth $340 Billion To Boost AI, Jobs – TIPP Insights

President Donald Trump and U.K. Prime Minister Sir Keir Starmer signed a sweeping $340 billion U.K.-U.S. Tech Prosperity Deal on Thursday, calling it the largest commercial package ever secured during a state visit.

The agreement underscores the depth of the transatlantic alliance while strengthening America’s role in the global technology race.

The deal centers on artificial intelligence and energy investment. U.S. firms including Nvidia, OpenAI, Google, Microsoft, and Salesforce pledged over $200 billion for U.K. data centers, labs, and supercomputing.

The plan is expected to create 15,000 British jobs and reinforce America’s AI dominance. In return, U.K. investments in U.S. pharmaceuticals and technology will exceed $30 billion.


3. President Trump Says Russian President Putin Has “Let Me Down” - BBC

President Donald Trump says Russian President Putin has “let me down” with regard to ending the war in Ukraine, but he adds that peace “will get done.”


4. Trump: US Trying To Get Bagram Airbase 'Back' From Taliban In Afghanistan – Fox News

President Donald Trump on Thursday said his administration is "trying" to get the former U.S. Bagram Airfield in Afghanistan "back" from the Taliban.

In remarks to the press while standing alongside U.K. Prime Minister Keir Starmer, the president criticized the handling of the U.S. withdrawal from Afghanistan under President Joe Biden and said he had "a little breaking news."

"We're trying to get it back," Trump said. "We're trying to get it back because they need things from us."

"We want that base back, but one of the reasons we want the base is, as you know, it's an hour away from where China makes its nuclear weapons," Trump added.


5. France Roiled By Anti-Austerity Protests As Unions Demand Budget Rethink - RFI

More than one million people took part in demonstrations across France on Thursday, according to the hardline CGT union.

Union leaders hailed the strike as "already a success" and issued a warning to Prime Minister Sébastien Lecornu, demanding fiscal, social and environmental justice in his upcoming budget.


6. Venezuela To Hold Military Drills In Response To U.S. 'Hostile' Actions - UPI

Venezuela plans to hold military drills under the name "Caribe Soberano 200" in response to recent hostile U.S. activity in the region, Venezuelan Defense Minister Vladimir Padrino said.

Padrino announced Wednesday during a televised meeting with the high command of the Venezuelan armed forces that about 22 aircraft and some 30 vessels – including 12 navy ships – will deploy for three days to La Orchila Island, about 100 miles off the coast of Caracas.


7. New Study Finds Being TOO THIN Is Worse For Your Health Than Being Overweight: Possible To Be 'Fat But Fit' - DailyMail

A new study has concluded that being too thin can be more deadly than being overweight or mildly obese, with researchers concluding it is possible to be 'fat but fit'.

According to ScienceDaily, which published the initial findings, this is 'a phenomenon sometimes referred to as being metabolically healthy or "fat but fit".'

The fat but fit category included people with a BMI ranging between 25 and 30—who are technically overweight—and people with a BMI of 30 to 35 whose weight puts them in the lower end of the obese range.


📊 Market Pulse — September 18, 2025

📈 S&P 500 — 6,631.96 ▲ +0.48%
Stocks extended gains as the Fed’s rate cut buoyed sentiment, pushing the index to fresh highs despite sector rotation under the surface.

📉 10Y Treasury — 4.10% ▼ –2.8 bp
Yields drifted lower as bonds rallied on expectations of further easing and concerns about a softening labor market.

🛢️ Crude Oil — $63.66 ▼ –0.62%
Oil slipped as demand worries outweighed any tailwind from easier monetary policy.

💵 US Dollar — 97.37 ▲ +0.5
The dollar firmed as traders saw U.S. policy staying tighter for longer than that of peers like the BoE or BoJ.

🪙 Bitcoin — $117,708 ▲ +0.7%
Bitcoin rose modestly, mirroring broader risk appetite while holding above key support.

💰 Gold — $3,645.67 ▼ –0.39%
Gold retreated from record peaks on a stronger dollar and the Fed’s signal of a measured pace of further cuts.


📧 Letters to editor email: editor-tippinsights@technometrica.com
