
Analysis: How dangerous trends in the AI era risk taking us backward

Updated 2026.04.05 22:19 GMT+8
Gong Zhe


The current AI boom is often framed as a leap toward ultimate productivity, but if we look past the viral demos, the underlying architecture of the internet may be shifting in a troubling direction: We seem to be moving away from a world of connection and toward a series of cognitive "black holes."

While AI promises to make life easier, it might actually be eroding our digital sovereignty and our very capacity to think for ourselves.

Walled gardens are getting higher

In the early days, the value of the internet lay in its openness – its ability to link any two points of information.

The AI era is reversing this. Major tech corporations are using their massive ecosystems to build "fortresses" that are harder to escape than the walled gardens of the mobile era. These companies don't just own your data; they have the power to integrate services and block competitors so effectively that the internet becomes a collection of isolated islands.


This is also why products from AI-only companies like OpenAI can feel less "magical" than those from Big Tech corporations like Google, which already have a stack of must-have services ready to plug into their AI.

For average users, the convenience of the web is being replaced by proprietary silos. Instead of progress, this feels like a retreat into a fragmented, less accessible digital world.

The 'dumb terminal' in your pocket

A strange paradox is emerging in our hardware. We are buying more powerful phones and laptops than ever, yet we are using them for less and less.

Because the most advanced AI requires massive cloud-based power, we are becoming increasingly dependent on corporate services. For the sake of convenience, we have surrendered our local processing power. Our expensive devices are being relegated to "thin clients" – glorified, high-definition screens that serve as windows into a remote, centralized brain.

This dependency is deeper than anything we have seen with search engines; it is a total outsourcing of local computing power to services we do not own.

The atrophy of human intent

Perhaps the most troubling observation is the void of intent that emerges when the tools become too powerful.

We see this in the recent trend of users flocking to install tools like OpenClaw. Driven by hype, many grant administrator permissions to install this "lobster" on their systems, only to realize they have no idea what to actually ask it to do.

Do you know what you want your AI agents to do before installing them? /VCG

As AI agents become more capable of acting on our behalf, our own muscle memory for original thought and complex desire may be atrophying. We have built a machine that can do anything for a user who, increasingly, wants nothing but more consumption.

Looking for an exit

There is no single fix for this shift, but we can begin by exploring alternatives to total dependence.

This might mean prioritizing smaller, transparent models that can run locally on your own hardware – without needing a giant's permission. It also requires a broader understanding of "open source," one that covers not just code but transparency in training data, backed by a legal framework that prevents a few companies from monopolizing the building blocks of thought.

The goal should be to treat AI as a tool that enhances our abilities rather than a service that replaces our brains. We need to ensure that in our rush toward a smarter future, we do not accidentally leave our independence behind.
