Raw LLM Responses
Inspect the exact model output for any coded comment.
You can look up a specific response by comment ID, or inspect one of the random samples below.
- `ytc_UgzL9sTpW…` — "This AI stuff reminds me of an old (1950s) sci-fi story The Darfsteller where a …"
- `ytr_UgzZ_Ubp-…` — "@Game_Boi11 Thank you for your comment! While I appreciate your concern, rest as…"
- `rdc_e13nvb5` — "> The police report suggests the car's driver was streaming an episode of tal…"
- `ytc_UgyNXBbOX…` — "I work hard on my dandy's world art AI took it. at the end is me…"
- `ytr_UgyZN-gwe…` — "@thefrench8847AI Art is NOT Art it will NEVER be better than real art made by re…"
- `ytc_UgwF5TCNY…` — "Giving someone $6B with nothing to show isn’t foolish? These people are going to…"
- `ytc_UgyHoMtvQ…` — ""OpenAI was established to promote AI safety" - liar. It was established to mak…"
- `ytc_UgxOwypvu…` — "It will with time. AI destined to rule us all at some point. I am already accept…"
Comment
it's like a homing bullet that can think... i.. i do not know what to say.. it is wrong in too many ways. "Let the weapons make the decisions"? :s I know it's a big kliché but what's the difference between this little flying death machine and the bigger flying ones in the latest Terminator movies.. i'm not saying skynet is breaking loose but an A.I flying on it's own and have the ability to make decisions to kill us or not..
youtube · AI Harm Incident · 2018-07-16T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx52UIh4q-wnMsrGXJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw9rwdtMQKPqDKq1DB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxkvMNNZxYsywghT-l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJjoqKOFaoo1t8hFd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgIdfqerUueC4zg4p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyq9S3WVP0ygVTntUt4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxtMPjnstLik54ctOZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwsAz9-1vws6xduiwx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzOl6lw2AjNmgdwVX14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzI_hHUzfkKE4QQb8h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"}
]