What is the most valuable commodity in the world today? It’s not oil, computing power for A.I. or Bitcoin. These are inert objects, meaningless without human attention to give them value. Attention, the most potent and sought-after resource, drives everything from demand for rare minerals to international trade policies. When you capture attention with something as simple as a well-placed iPhone ad, you convert a few flipped bits in a database into a cascade of physical labor, resource extraction and influence on geopolitical affairs. Mastery of the physical world is trivial once you can harness attention at scale.
The boundary between our minds and the attention economy is vanishing
Corporations, billionaires and (soon) artificial agents are racing to capture and direct human attention to serve their own ends. Until 30 years ago, attention was harnessed through inefficient, brute-force methods: broad, non-specific broadcasts via print, radio and television. Despite their inefficiency, these tools succeeded in mobilizing entire populations, organizing societies toward engineering marvels and even enabling the dehumanization required for total war. The first major leap in harnessing attention came with the advent of the iPhone and its ecosystem. This was the first bona fide cognitive prosthetic available to the masses. It transformed nearly every human into a "tuned-in cyborg," perpetually tethered to a global network.
Less than 17 years after the iPhone, the next leap is already here and promises to be more turbulent and transformative than we can imagine. Picture a society where corporations treat your mind as an extension of their data pipelines, mining your attention directly through your neural activity for profit while selling you back the illusion of control. A society where reality is “enhanced” with the infinite possibility of artificially generated worlds, sensations and experiences.
“The sky above the port was the color of television, tuned to a dead channel.”
William Gibson's Neuromancer, written in 1984, offers an increasingly plausible description of where our society is headed. Privacy is non-existent, and data is hoarded by mega-corporations as a commodity for sale. Reality becomes a collective "consensual hallucination experienced daily by billions," neurally connected to a corporate-controlled cyberspace built on a patchwork of infrastructure that is frequently hacked and subverted. Seemingly mundane "just another toy for bros with disposable income" technology releases like the Apple (AAPL) Vision Pro and Meta's Orion AR glasses inch us closer to this reality. These devices pack innovative hardware that bridges the gap between our intentions, thoughts and direct effects in the digital worlds they create for us.
Want attention? Go for the eyes
The Apple Vision Pro picks up where Google (GOOGL) Glass left off, creating a closed-loop system that responds to our attention by measuring the movements of our eyes. Its suite of inward-facing sensors can intuit arousal, cognitive load and broad emotional states from precise fluctuations in pupil diameter and subtle rapid eye movements. Pupil diameter, for example, serves as a direct proxy for noradrenergic tone, reflecting activity in the sympathetic nervous system and controlled by the neurotransmitter output of the locus coeruleus, a brain structure tied to arousal and attention. While the technology's applications appear limited for now, it is undeniably impressive that Apple has removed the need for external input devices by leveraging something as intuitive as a user's gaze to navigate and manipulate digital environments.
You’re being groomed, and so are the groomers
Technological marvels aside, it should be clear that this isn't coming for free, or without consequences. Devices like the Vision Pro are subtly grooming society for a future where more invasive technologies, such as Neuralink or other brain-computer interfaces, may be used to completely subvert human agency. Current economic incentives do not value privacy, individual agency or digital human rights. Why? Our economy generates greater returns when human behavior is structured and segmented along the most stable and lucrative markets: sex, status-seeking and security, to name a few.
These markets thrive on memetics and the directed attention of organized groups, not on the self-sovereign, free-thinking individual. If this bleak outlook holds true, any technology linking individual decision-making with open information systems will inevitably serve those who stand to gain the most—corporations, governments and increasingly artificial agents. The primary beneficiaries will not be the current reader, the author or even most people using these technologies today. Instead, the most likely winners will be artificial agents—operating behind the scenes to optimize for goals that may alienate the humans who created them. This is the trajectory we’re on unless we act decisively.
We need a biometric privacy framework
Most agree we need privacy. But despite frameworks like GDPR, CCPA and landmark efforts like Chile’s Neuro Rights Bill, the same problems persist—and the risks are accelerating. Regulation and policy focus on these issues but are insufficient without concrete implementation.
What’s missing is a foundational embedding of digital natural rights by default into the infrastructure powering the internet and connected devices. This starts by making it incredibly easy for individuals to create and maintain self-custody of their own cryptographic keys. These keys can secure our communications, authenticate our identities and protect our personal data without relying on a corporation, government or third party.
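To make the idea concrete, here is a minimal sketch of self-custody using only Python's standard library. This is not Holonym's protocol: a user-held passphrase is stretched into a key with PBKDF2, and HMAC stands in for the digital-signature scheme (e.g., Ed25519) a real system would use. The passphrase and key never need to leave the user's device, so no corporation, government or third party holds a copy.

```python
import hashlib
import hmac
import secrets

def derive_key(passphrase: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stretch a user-held passphrase into a 256-bit key with PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

def sign(key: bytes, message: bytes) -> bytes:
    """Authenticate a message under the derived key (HMAC stands in for a real signature)."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    """Constant-time check that the tag matches the message."""
    return hmac.compare_digest(sign(key, message), tag)

salt = secrets.token_bytes(16)  # stored alongside the data; need not be secret
key = derive_key("correct horse battery staple", salt)
tag = sign(key, b"hello")
assert verify(key, b"hello", tag)
assert not verify(key, b"tampered", tag)
```

Because the key is deterministically re-derivable from the passphrase and salt, the user can recover it on any device without trusting a custodian to store it.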
Holonym's Human Keys offers one such solution. By enabling individuals to create cryptographic keys safely and privately, we can protect sensitive data while ensuring privacy and autonomy. The strength of Human Keys is that no single corporation, person, agent or government needs to be trusted to create and use them.
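The article does not spell out Human Keys' construction, but the classic technique for removing single points of trust in key material is Shamir's secret sharing (1979): a key is split into n shares such that any k of them reconstruct it, while fewer than k reveal nothing. A minimal sketch over a prime field, for illustration only:

```python
import secrets

PRIME = 2**127 - 1  # Mersenne prime; all share arithmetic is done mod PRIME

def split(secret: int, n: int, k: int):
    """Split `secret` into n shares; any k of them reconstruct it (Shamir, 1979)."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse of den (Fermat)
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

shares = split(123456789, n=5, k=3)
assert reconstruct(shares[:3]) == 123456789
```

In practice the shares would be distributed across independent guardians (devices, institutions, the user), so no single party can unilaterally recreate, or withhold, the key.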
When we integrate technologies like homomorphic encryption with devices such as the Apple Vision Pro or Neuralink, we have the potential to enhance cognitive abilities without compromising the privacy of sensitive user data.
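The promise of homomorphic encryption is that a server can compute on data it cannot read. Neither Apple nor Neuralink has published such a pipeline; as a minimal sketch of the property itself, here is a toy Paillier cryptosystem (additively homomorphic: multiplying ciphertexts adds the plaintexts). The primes are deliberately tiny for illustration; real deployments use moduli of 2048 bits or more.

```python
from math import gcd
import secrets

# Toy Paillier: Dec(Enc(a) * Enc(b) mod n^2) == a + b mod n.
p, q = 293, 433                      # tiny primes, illustration only
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse mod n (Python 3.8+)

def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 42, 100
c_sum = (encrypt(a) * encrypt(b)) % n2   # server adds ciphertexts, never sees a or b
assert decrypt(c_sum) == a + b
```

A wearable could encrypt biometric readings this way, let a remote service aggregate them, and decrypt only the aggregate locally, so the raw neural or ocular data never leaves the device in the clear.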
However, it's important to recognize that software alone is not sufficient. We need secure hardware that adheres to publicly verifiable and open standards. Governments play a crucial role in ensuring that manufacturers follow strict security practices when creating devices that handle and store cryptographic keys. Just like clean water and breathable air, secure hardware for storing keys should be considered a public good, with governments taking responsibility for its safety and accessibility.
Looking toward the future of ethical neurotechnology, we must heed the warnings of visionaries like Gibson, who cautioned against technology eroding privacy, autonomy and humanity. Brain-computer interfaces (BCIs) have the potential to expand human capabilities, but only if guided by ethical principles. By integrating biometric privacy into the core of our digital systems, utilizing tools like self-custodial keys and homomorphic encryption, and advocating for open hardware standards, we can ensure that these technologies empower rather than exploit individuals.
The vision for the future does not have to be a dystopian one. Instead, it can be a landscape where innovation enhances humanity, protecting our rights while unlocking new opportunities. This vision is not just optimistic—it is imperative for shaping a future where technology serves us rather than controls us.