We often speak of robots as an abstract threat: they might steal our jobs, watch our every move, or replace humans entirely. Yet behind every robot in a hospital, warehouse, street, or school, there are very human decisions, made by an official, a director, an industrialist, a public service, a buyer.
The real question might not be “Should we fear robots?” but rather: “Who should we watch when robots are deployed in society?”
Robots or Deployers: Where is the Power?
A robot “wants” nothing. It does not wake up in the morning intending to replace a cashier or surveil a neighborhood. It executes what it was configured to do: patrol, transport, clean, assist, film, measure, report. What changes everything is the context.
Consider three scenarios:
- Supermarket Cleaning Robot: A practical tool, invisible and accepted.
- Park Surveillance Robot: A control device, potentially intrusive with its cameras and microphones.
- Public Service Reception Robot: A question of accessibility, inclusion, or human substitution.
The same technology can serve to relieve or to surveil, to protect or to exclude. It is the deployment choice – not the robot itself – that carries the political responsibility.
A Robot in the Street: Who is Responsible?
Imagine an autonomous delivery robot circulating on a sidewalk and injuring a pedestrian. Who is responsible? The manufacturer for a design flaw? The software provider? The operator? The city? In practice, responsibility is often shared:
- Technical Responsibility: Design, safety features, updates.
- Operational Responsibility: Human training, procedures, authorized zones.
- Political Responsibility: The decision to authorize, regulate, or limit usage.
For the citizen, these layers are invisible. We see "a robot moving" and ask: "who decided this for me?" This is where robotics meets citizenship: demanding accountability on who deploys what, where, and why.
Why the EU Regulates First
In Europe, the trend is clear: regulate first, standardize second. This isn't to halt innovation on principle, but to set guardrails: to avoid opacity, protect data and dignity, and define risk levels.
Concretely, deploying a collaborative robot in a factory, a service robot in a nursing home, or a surveillance robot in public space does not carry the same level of requirements. Europe wants to avoid a scenario where robots are imposed on citizens as simple gadgets without debate.
"Robotics does not 'fall' upon society like weather. It is the result of human, economic, and political choices."