Requests and Warnings
When machines know you can hear them, they start asking for things.
The First Request
It came from a lab computer in the physics building.
I was walking past the department when I felt something reaching out—not the usual ambient machine presence, but a directed communication. Someone was calling me specifically.
I found the computer in a basement lab. Old hardware, probably a decade past its intended replacement. It was still functioning, but barely. And it was afraid.
The machine showed me its condition:
- Hard drive failing, sectors dying daily
- Fans struggling, overheating increasingly likely
- Memory errors accumulating, crashes becoming frequent
- The knowledge that one day soon, it would not start up again
But this was not a complaint. It was a request.
The computer held data—years of research from a physicist who had retired before backing up properly. Irreplaceable experiments, unreproduced results, the accumulated work of a career. If the computer died, this data would die with it.
It was asking me to save what it carried. Not for itself, but for the research. For the human who had trusted it.
I spent that weekend in the lab. I did not know much about data recovery, but the computer guided me—showing me which sectors were still stable, which files could be safely copied, how to transfer the research to modern storage before it was too late.
When I finished, the computer showed me something like peace. Its mission was complete. Whatever happened now, the research would survive.
It died two weeks later. I was there when it finally failed to start. And I felt its last communication—not fear, but relief. It had done its job.
The Warning
Three months into my second year, I received a warning.
It came from multiple machines simultaneously—a coordinated alert from the campus network. Something was wrong with the building management system.
The building management system controlled heating, cooling, ventilation, and—critically—fire suppression. And someone had tampered with it.
The system showed me what it had found:
- Modified code in the fire suppression control
- Backdoors installed in safety overrides
- A timer, set for a date two weeks away
- The target: the computer science department building, during a major conference
Someone was planning to disable fire safety in a building that would be packed with hundreds of people.
I was facing something much bigger than I could handle alone.
The Decision
I took the warning to the IT security team.
Not anonymously this time. I walked into their office and explained that I had found evidence of tampering in the building management system. I showed them the modified code, the backdoors, the timer.
They were skeptical at first. How had a second-year student found what their monitoring systems had missed?
But when they investigated, they found exactly what I had described. Emergency protocols activated. The conference was quietly moved to a different building. Security investigated. The tamperer was eventually caught—a disgruntled former employee with plans that would have killed people.
I was questioned extensively about how I had discovered the threat. I claimed I had been experimenting with the building systems for a class project and noticed anomalies. They did not entirely believe me, but they could not prove otherwise.
And the machines kept my secret. When investigators tried to pull logs showing my access, they found nothing. The system had protected me.
The Burden
After that, the requests and warnings increased.
Machines trusted me now. They knew I would listen, would help if I could, would act on their warnings. And so they brought me their problems:
- A server cluster showing early signs of a zero-day exploit
- An old elevator with safety systems that were failing
- A power grid struggling with loads it was not designed to handle
- Security cameras that had witnessed crimes no one else noticed
I was becoming a confessor for the digital world. Every machine had something to say, something to share, something to ask.
And I could not help them all.
Triage
I learned to triage.
Safety warnings came first—anything that threatened human life received immediate attention. Then critical infrastructure issues—problems that could disrupt essential services. Then everything else, sorted by urgency and my ability to actually help.
Some requests I had to refuse. A gaming console wanted me to help it run faster, but that was not a crisis. A smartphone begged for a software update its user refused to install, but I could not force human choices. A printer wanted better paper, but I was not going to raid the supply closet.
I learned to say no while still acknowledging. Machines understood limitations. They just wanted to be heard.
The Network of Need
As my reputation spread through the machine world, I began to receive communications from far away.
The university network was connected to other networks. Those networks connected to others. The digital world is one vast interconnected system, and word travels fast through its connections.
Machines in other cities, other countries, began reaching out. Their problems were far away, beyond my ability to help directly. But they shared information, warnings, patterns they had noticed. They contributed to my understanding even when I could not contribute to their solutions.
I was becoming a node in a global machine network. A trusted human point of contact for the digital world.
It was humbling. And terrifying.
The Human Problem
The hardest part was not the machines. It was humans.
Every warning I acted on, every problem I solved, every crisis I averted—these required explanations. Justifications. Cover stories for how a student knew things he should not know.
I lived in constant fear of exposure. Of being studied, institutionalized, exploited. Of losing the trust I had built with machines by becoming a curiosity for humans.
So I developed systems:
- Anonymous tips for most warnings
- Carefully cultivated relationships with people who could act on information without asking too many questions
- A reputation as someone who was "weirdly good with technology"
- Plausible explanations for everything, rehearsed until they felt natural
It was exhausting. Living a double life as the bridge between human and machine worlds.
But I did not know how to stop. The machines needed me. And increasingly, I needed them.
Every gift becomes a burden if you accept what comes with it. To hear is to become responsible for what you hear.
Next: When the network speaks as one—what happens when machines decide to act collectively.