Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytr_UgxPUXO4I…: "Yes, and we must adjust for this new age, or be left behind. The future of all…"
- rdc_dsbb63t: "But US private contractors get to rebuild the country, paid for by the Belize pe…"
- ytc_Ugz8Xrwgz…: "I hope everyone is actually aware that chatGPT is just a statistical model that …"
- ytc_UgzYNkGyK…: "If AI knows it lives, we will be slaves for them! My little video for that: htt…"
- ytc_UgzatEY6y…: "I used ai once or twice, bc i hated my art, but literally gave up bc you have to…"
- rdc_mpl17vu: "I'm out. I will no longer be signing up with them. I am sick to death already of…"
- ytc_Ugx7dPXkp…: "8:45 there has been another groundbreaking paper on AI, actually 2 or 3 this mon…"
- ytc_Ugy1A0y-o…: "The technology for robots to be part of daily life has been a slow and complex p…"
Comment
While no general-purpose Artificial Intelligence (AI) has directly killed a human, the first recorded death caused by a robot occurred in 1979. Recent cases also raise new concerns about how advanced AI may contribute to human deaths through harmful content or autonomous weapons.
Death by industrial robot (1979)
In 1979, factory worker Robert Williams was killed by an industrial robot arm at a Ford Motor Company plant in Michigan.
The robot, a one-ton parts-retrieval machine, was malfunctioning and moving slowly.
Williams entered the robot's work area to retrieve parts manually when the machine's arm struck his head, killing him instantly.
The death resulted from safety failures, not any independent action by the robot. A jury later awarded Williams's family $10 million, finding that there were not enough safeguards to prevent the accident.
youtube · AI Harm Incident · 2025-09-11T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
  {"id":"ytc_UgwTbeMrN6BSlyknRiN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyJE-si3tNltYQPiT94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw3I3KwyihhEM55z-h4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgygjvfgGsVxy4_67nh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwPUJjuWtY1gDGQ52h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyiVQSqRto9DQfS1qd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwIf463MW4PThYoh-x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8cQ-1si899pMvwwV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxGjB5QaZ8wa2Dpzd54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzuZIpJXe8152UlCsJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
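A response like the one above has to be parsed and validated before the codes can be stored per comment. The sketch below is a minimal example of doing that in Python; the allowed values per dimension are inferred from the labels visible in this page, and the full codebook may define more categories, so treat `ALLOWED` as an assumption rather than the real schema.

```python
import json

# Allowed values per coding dimension (assumed from labels seen in the
# responses above; the actual codebook may differ).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference",
                "approval", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of coded comments)
    and coerce any unknown dimension value to 'unclear'."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in ALLOWED.items():
            # Keep the row but flag invalid or missing values as 'unclear',
            # matching the fallback shown in the Coding Result table.
            if row.get(dim) not in allowed:
                row[dim] = "unclear"
    return rows

raw = '[{"id":"ytc_example","responsibility":"developer",' \
      '"reasoning":"mixed","policy":"none","emotion":"fear"}]'
print(parse_coding_response(raw)[0]["emotion"])  # fear
```

Coercing bad values to "unclear" instead of dropping the row keeps one coded record per comment ID, which matches how the all-"unclear" Coding Result above appears to handle an unusable response.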