Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews; look up the full text by comment ID):

- Funny thing is in 5 years ai will be 100% useful on ur apple laptop. No need for… (ytc_UgyUtosPs…)
- Self aware AI shouldn't be developed to begin with in my opinion, since the poin… (ytc_UgiC1pPPo…)
- Recently i had a conversation with my mother about ai art, and i expressed my di… (ytc_Ugw3Uq1B6…)
- Probably but do u want to continue to living in a system where in order to survi… (ytr_UgxE3mNRm…)
- "It's crazy that you mixed up bromide with chloride" ChatGPT: You’re right — tha… (ytc_Ugy__iGMW…)
- Very interesting! 💥 My question is, can we meaningfully distinguish between repl… (ytc_Ugz4WwFmR…)
- Self driving cars will be very helpful, and will do wonders for peoples drivi… (ytc_Ugz9eHjSF…)
- If you work in manufacturing, you’ll be replaced by a robot. If you work in an o… (ytc_Ugyy5Gqr3…)
Comment
The tech is already hear, little more work but essentially here. Elon Musk stated he fears more AI than nuclear weapons - with good reason. Seen Boston dynamics? Bipedal AI operated robots? Seen the f16 drop off a cluster of miniature robots that are AI operated as a team and will conduct command operated instructions to counter enemy targets? Science fiction is becoming less so and it’s actually here
Source: youtube · Category: AI Harm Incident · Posted: 2022-06-11T18:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyyhFjCIOocUiTP4_p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyP5yg-Rs6VPWF8WHF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5w1oXPl4D_Ad0sdB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyL67H2JZF3yLvNyc94AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyDPSsBhH4jiMneJLR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw9TFdlVJJD_iJd9mN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwrvCVdEKQBBkQH3PJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz7yqEy8FQ11t1HNpJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6w9apDTS3sopB45d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyCK274GKKIE8eGhUB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
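The raw response is a JSON array with one coding record per comment ID. A minimal sketch of parsing such a batch and validating each record against the coding vocabulary, assuming the value sets visible in the samples above (the real codebook may allow more values; `SCHEMA` and `parse_coded_batch` are hypothetical names, not part of the tool):

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample output shown above; the actual codebook may be larger.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "government", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response and index records by comment ID,
    rejecting any record with an out-of-vocabulary value."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each comment's coded dimensions become a single dictionary lookup.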