## Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
- "Thanks for your comment, @BOSS-ky1lzname! Your perspective on artificial intelli…" (ytr_Ugx8QISrO…)
- "So they stole Google's self-driving technology files, but forgot to also steal t…" (rdc_f6x8xwy)
- "@MisterMixxyThanks? Imagine being so terminally online that you consider basic …" (ytr_UgyjSg0eW…)
- "AI is comparable to the summation of every single revolution to date. It will be…" (ytc_UgzOEfNLE…)
- "A machine with unequaled intelligence…. Yeah it's definitely gonna act like a hu…" (ytc_UgzF0Fr87…)
- "I would LOVE to see a \"closed-reference\" generative ai, where a former artist tr…" (ytc_Ugx3J9sky…)
- "If you've seen/read Cloud Atlas, the Sonmi chapters, then that's a pretty good i…" (rdc_ljb6yu9)
- "I'm gonna just say that you can do analysis of a user yourself for like 2 minute…" (rdc_n0hm6vr)
### Comment

> In a test of robot performance maked by AI, the robot decided to kill the monitor who cancelled the mission because the robot concluded that finishing the mission were more important. But it was programed to never kill her monitor. Do you she the dangerousness of making robot killer?

Source: youtube · AI Harm Incident · 2024-06-29T10:0…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
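Each coded comment carries the same four dimensions shown in the table above. A minimal sketch of that record as a validated schema, assuming the code sets contain only the values visible on this page (the actual codebook may define more):

```python
from dataclasses import dataclass

# Assumed code sets, inferred from the values visible on this page.
RESPONSIBILITY = {"developer", "company", "ai_itself", "distributed", "unclear"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"regulate", "ban", "liability", "unclear"}
EMOTION = {"fear", "outrage", "indifference", "mixed"}


@dataclass(frozen=True)
class Coding:
    """One coded comment: an ID plus the four coded dimensions."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self):
        # Reject any value outside the assumed code sets.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unknown code: {value!r}")
```

Constructing `Coding("ytc_…", "developer", "consequentialist", "regulate", "fear")` reproduces the table above; an out-of-set value raises immediately rather than slipping into the dataset.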
### Raw LLM Response
[{"id":"ytc_UgzQYMLn3iJ7ojh9khJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwCs6WwGwF6LpGQw_J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzWSdv9HD9ULjw9Kmh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyBPYjcl8iGMoMthdp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugww2TpozfAPi4Y-DHx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzfHg5QCxKMtgj33Rt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJOGL2sD4AJQNwwtd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxzBdIzqZv5YGY7PNN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyNDBeyZl6WeO56-kN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxgzQ5bZR--Tq58QN94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}]
```
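Since the raw response is a JSON array of records keyed by `id`, looking a comment up by ID reduces to parsing the array and building an index. A minimal sketch (the function and variable names are illustrative, not taken from the tool):

```python
import json


def index_codings(raw_response: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coded records)
    and index the records by their comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}


# One record copied from the raw response above.
raw = ('[{"id":"ytc_UgzfHg5QCxKMtgj33Rt4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')

by_id = index_codings(raw)
print(by_id["ytc_UgzfHg5QCxKMtgj33Rt4AaABAg"]["emotion"])  # fear
```

Indexing once up front keeps repeated ID lookups O(1), which matters when cross-referencing every coded comment against a batch response.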