Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Emailing my senators soon, I don't want to die becuase trump wanted to post a ai…
ytc_UgxzulXR3…
AI is for dumb, slow, lazy ass people who want to be controlled. centralized thi…
ytc_UgzW92J6O…
I like to say that I am using AI to do x but it introduced some bug y and it too…
rdc_o8a9o6z
The extinction probability will never be as low as you'd like because an intelli…
ytc_Ugz4pH8q5…
You must know how to code to effectively use AI to assist you in building comple…
ytc_UgwyWihaN…
This is why we need HUMAN drivers. They have the ability to realize that all lan…
ytc_UgyB4S4pg…
Yeah. When I heard this story at first, I was like "okay, putting bromide into y…
ytr_UgyhqIeN5…
Hahaha i can escape that robot bro i am going to parkour to the building…
ytc_UgyPuaqxl…
Comment
Maybe an extinction level event?
Put AI on the shelf for ,mmm, 500 years should be enough.
To put another way, it's like teaching a 5 year old to drive a semi, then handing him a license and a carton of smokes on his way to his first gig.
| Source | Video | Published |
|---|---|---|
| youtube | AI Moral Status | 2025-07-26T17:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyLDnUZ6EASeji3-8x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzq0kRqPb0kSH5HNNJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyLmNsPgfE_92xP_xh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwZ7XsJJJMZcVwxkCJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzVZLYnTJZauV-ycxV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxeRMY7tU4miCo-VlJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBmeOPXFsHg05QK9J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwU1NpUwqYges2nZN94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDetTJVW_Qlsal8-V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzXKi4E6XTJsxH03Vp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
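A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed category values are exactly those seen in this sample output (the full codebook may define more); rows with a missing `id` or an out-of-vocabulary value are dropped rather than coerced.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse one raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row without a comment ID cannot be joined back
        if all(row.get(dim) in allowed for dim, allowed in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical single-row response for illustration:
sample = ('[{"id":"ytc_x","responsibility":"developer",'
          '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(parse_coding_response(sample))
```

Dropping rather than repairing bad rows keeps the coded dataset auditable: every stored code traces back to a literal value the model emitted.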