Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Ai doesn't use roads robots don't buy cars why would they fix roads for a popula…" (`ytr_UgyaW5yap…`)
- "GOOD GOD IF ANYONE IS HAVING A BRAIN SPASM LIKE ME, I'm certain they are saying…" (`ytc_Ugzp3OiGj…`)
- "Random mutation and natural selection designed humanity's aversion to pain and d…" (`ytc_UgjERjmDB…`)
- "So entry level positions are going to disappear ... if you don't have employees …" (`ytc_UgwcUGbMP…`)
- "Why would you be in there, if the truck is self-driving in the first place? And …" (`ytr_UgxwhEwak…`)
- "Thats why Jesus is the answer not just a belief in god. Without Jesus, your foun…" (`ytr_Ugx8FpjXu…`)
- "short answer to the title question: Yes, eventually, because it will think and …" (`ytc_UgyluuMGc…`)
- "Conscious AI can never exist. Only biological structures can represent human lev…" (`ytc_UgzPbDSm4…`)
Comment

> The fact that modern AI values self-preservation even when explicitly programmed not to is enough for me to think we might need to start seriously considering redefining our definition of "life". Objects and tools don't have self-preservation instincts, but even single-celled organisms, the simplest of all life forms, do have them, and AI is showing that it also has them.

Source: youtube · AI Harm Incident · 2025-09-11T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxc_zWDAjxuCXKgTCJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwbqvzHXbij4H6glMt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzHTUroYR0mC1jBAb94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzCOotT2AbqT_YAp-V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxfQyRsE9_iUs5EMRd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyyhTDgYktePRmqoB54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyD1STYuqQZfQqFcy54AaABAg","responsibility":"company","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwqlIXQJ3k3xxYS89h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxJSgKQpwRIXfY0Udx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9AHevMhBNG4Vypxx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
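A response like the one above can be validated before it enters the coding database. The sketch below is a minimal, hypothetical example: the allowed values per dimension are inferred only from the records shown here (the real codebook may define more categories), and `parse_codings` is an illustrative name, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above; treat this as a placeholder for the project's real codebook.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record must be an object with an "id" field...
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # ...and every dimension must carry a recognized value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# One valid record and one with an unrecognized responsibility value.
raw = """[
  {"id":"ytc_UgzCOotT2AbqT_YAp-V4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_example_bad","responsibility":"martian",
   "reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

codings = parse_codings(raw)
print(len(codings))  # the record with the unknown value is dropped
```

Rejecting malformed records up front keeps a single bad LLM output from silently skewing the coded dataset; rejected IDs could instead be queued for recoding.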