Type: Opinion
Author: Kevin Owocki
Originally published: Allo Capital Research, May 2025
TLDR - Human attention is our most precious cognitive resource, filtering roughly 11 million bits per second of sensory input down to about 50 bits of conscious processing. Tech companies exploit attention vulnerabilities through algorithms designed for engagement rather than wellbeing. Understanding these mechanisms empowers us to reclaim cognitive autonomy through mindfulness, environment design, and community support, transforming attention from a passive target for exploitation into an active force for self-determination. Let's reallocate our attention.
The Human Attention System
Human attention is one of the most sophisticated cognitive systems to emerge through evolution, and also one of our most limited mental resources. Our sensory systems constantly bombard the brain with approximately 11 million bits per second. Attention serves as the ultimate bottleneck, allowing only about 50 bits per second to reach conscious awareness — a filtration ratio of roughly 220,000:1.
The system operates across two major domains:
- Bottom-up (stimulus-driven) attention: Automatic, unconscious response to salient stimuli — a bright flash, sudden noise, or movement. This evolved as a survival adaptation.
- Top-down (goal-directed) attention: Our ability to consciously direct focus based on internal goals, like searching for a friend in a crowd or concentrating on difficult text.
The constant negotiation between intentional focus and automatic capture represents the fundamental struggle for attentional freedom in our technology-saturated environment.
Human attention is also deeply influenced by emotional valence and personal significance. Objects with emotional resonance capture attention more readily — creating predictable vulnerabilities that attention-seeking technologies deliberately target.
The Transformer Revolution: When Machines Learned to Focus
The 2017 paper "Attention is All You Need" introduced the Transformer architecture, revolutionizing natural language processing. The core innovation — self-attention — allowed models to dynamically focus on different parts of input when generating output.
Self-attention operates through a mathematically precise calculation of relevance. For each position in a token sequence, the model computes a weighted sum of all tokens' representations (including its own), with weights determined by learned relevance scores. Unlike human attention, which evolved for general survival, Transformer attention was specifically designed for processing sequential data.
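The weighted-sum computation described above can be sketched in a few lines of Python. This is a toy illustration, not the full mechanism from the paper: the learned query/key/value projections and multi-head structure are omitted to keep the core idea visible, and the example vectors are arbitrary.

```python
import math

def softmax(scores):
    """Turn raw relevance scores into weights that sum to 1."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(tokens):
    """Simplified scaled dot-product self-attention.

    Each token's new representation is a weighted sum of ALL token
    vectors, with weights given by softmax'd dot-product relevance.
    (A real Transformer first maps tokens into separate query, key,
    and value spaces with learned matrices; that step is skipped here.)
    """
    d = len(tokens[0])
    output = []
    for query in tokens:
        # How relevant is every token (including this one) to the query?
        scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
                  for key in tokens]
        weights = softmax(scores)
        # Blend all token vectors according to those weights.
        output.append([sum(w * x for w, x in zip(weights, column))
                       for column in zip(*tokens)])
    return output

tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = self_attention(tokens)  # three blended vectors, one per token
```

Because the softmax weights for each token sum to 1, every output row is a convex combination of the inputs — a mathematically explicit version of "focusing more on some parts of the input than others."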
Key differences from human attention:
- Parallel processing: Transformers compute attention across all positions simultaneously, while humans generally focus on one thing at a time
- Multi-head attention: Multiple independent attention mechanisms run in parallel, each focusing on different relationship aspects
- Learned through training: Weights adjusted through exposure to vast text data, without explicit linguistic instruction
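The multi-head idea in the list above can be sketched similarly. In this toy version (plain Python, illustrative dimensions, no learned projections), each vector is split into subspaces, attention runs independently in each, and the results are concatenated.

```python
import math

def attend(vectors):
    """Scaled dot-product attention over a list of equal-length vectors."""
    d = len(vectors[0])
    output = []
    for query in vectors:
        scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
                  for key in vectors]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]  # softmax
        output.append([sum(w * x for w, x in zip(weights, column))
                       for column in zip(*vectors)])
    return output

def multi_head(tokens, n_heads=2):
    """Split each vector into n_heads subspaces, attend in each
    independently, then concatenate. Every head computes its own
    relevance weights, so different heads can track different
    relationships in parallel."""
    d_head = len(tokens[0]) // n_heads
    heads = [attend([row[h * d_head:(h + 1) * d_head] for row in tokens])
             for h in range(n_heads)]
    # Stitch each token's per-head outputs back into one vector.
    return [sum((head[i] for head in heads), []) for i in range(len(tokens))]

tokens = [[1.0, 0.0, 0.0, 1.0], [0.0, 1.0, 1.0, 0.0]]
out = multi_head(tokens, n_heads=2)  # two vectors of length 4
```

Each head sees only its own slice of every token, so the heads genuinely compute different attention weights in parallel — unlike human focus, which is largely serial.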
Where Human and Machine Attention Converge
Despite radically different origins, both systems share striking similarities:
- Selective prioritization in information-rich environments
- Contextual awareness influencing processing
- Dynamic adjustment based on input
- Pattern recognition driving sense-making
- Balance of local and global attention
These parallels suggest certain attention principles may be fundamental to any system processing complex, sequential information efficiently.
The Unbridgeable Gaps
Profound differences separate human and machine attention:
- Consciousness: Human attention is inextricably linked with subjective experience. When a Transformer "attends" to a token, no corresponding subjective experience occurs.
- Emotional drives: Our attention naturally gravitates toward stimuli with emotional significance — no direct equivalent exists in LLMs.
- Embodiment: Human attention integrates multiple sensory modalities simultaneously. LLM attention operates only within text.
- Agency: We consciously choose what to focus on based on values, interests, and intentions. LLMs have no intrinsic goals or capacity for intentional direction.
- Temporal dynamics: Human attention naturally fluctuates in cycles of focus and diffusion; a Transformer applies the identical attention computation on every forward pass, with no such rhythm.
Embodied Attention
Human attention involves not just the brain but the entire body. When we attend to emotionally significant stimuli, our bodies respond with changes in heart rate, skin conductance, pupil dilation, and hormone release. These bodily responses don't merely accompany attention — they actively shape it through feedback loops.
These embodied, emotional dimensions create attentional patterns impossible to replicate in disembodied systems. Brain regions like the amygdala identify emotionally significant information before conscious awareness, triggering rapid attentional orientation. This explains the cocktail party effect — instantly noticing your name spoken in a noisy environment.
Practical applications appear across fields: mindfulness practices leverage the body-attention connection using physical sensations as anchors. Educational approaches incorporating movement improve attention outcomes by aligning with the embodied nature of cognition.
Reclaiming Attention
As attention-capturing technologies grow more sophisticated, liberating this scarce resource becomes not just a personal wellness practice but a profound act of self-determination. Understanding how our attention is exploited is the first step toward reclaiming cognitive autonomy through:
- Mindfulness — Training intentional focus through meditation and awareness practices
- Environment design — Actively curating information environments rather than accepting algorithmic defaults
- Implementation intentions — Setting explicit goals for attention allocation
- Community support — Building collective practices that value depth over engagement
The parallel evolution of human and machine attention systems offers unique insights into both how our attention is being exploited and how we might reclaim ownership of this most fundamental resource — the very substrate of our conscious experience.