Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I can't draw that well traditionally, I'm mainly a Digital artist. Digital art i…" (ytc_UgyvzH3Tu…)
- "I am an NSFW Artist . And No My clients don't like NSFW AI BS .…" (ytc_Ugx0-qNMM…)
- "You better adapt to the times, if you don't you are going to be left behind, AI …" (ytc_Ugx9ifwJA…)
- "Every time someone outside of the AI space makes a video about it, I worry it's …" (ytc_Ugy0UVrIM…)
- "1. You are one pathetic defeatist. 2. AI is not a sapient entity or a force of n…" (ytr_UgzREamaL…)
- "This is just nonsense. If your AI suddenly decides to take over the world, you j…" (ytc_UgyOq2S9H…)
- "You are this close to get the ban list for using the a.i. and not telling the tr…" (ytc_UgxiQXcva…)
- "come on, AIs are getting better and better at driving...I'd rather be in traffic…" (ytc_UgwZp6vqo…)
Comment
@DrVinnieBoombatzDO
"...would anyone put it in control of to warrant that fear..." - Yes.
The multi-billion $ company I worked for (I'm retired now) is a global leader in logistics, thousands of little AI controlled robots running around doing what they do "advanced Automation".
We let them "put them in control", yet a few years ago one of the little fellers overheated and eventually burst in to flames in the heart of the machine, three days to put the fire out, two years to rebuild.
My question revolves around not a misinterpretation of AI being intelligent, more the speed or reaction times to prevent the "advanced Automation" causing damage or worse. I don't expect an answer, so don't worry yourself - there isn't an answer apart from "human error" possibly more sensors on the sensors.
Source: youtube · 2026-02-12T03:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugz6hAxwZPcrIG2EOYJ4AaABAg.AT61IrQAc9UAT6vCdABkOn","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz6hAxwZPcrIG2EOYJ4AaABAg.AT61IrQAc9UAT7-paMZtXR","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz6hAxwZPcrIG2EOYJ4AaABAg.AT61IrQAc9UAT74ZjugdXy","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxvIl8M_Sp5SsFk0z94AaABAg.AT6-ieWj4BDAT6JbFT-yoN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxvIl8M_Sp5SsFk0z94AaABAg.AT6-ieWj4BDAT6RCngpvbQ","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxvIl8M_Sp5SsFk0z94AaABAg.AT6-ieWj4BDAT6_X58tPm1","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxvIl8M_Sp5SsFk0z94AaABAg.AT6-ieWj4BDAT6bOCpbatH","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgxvIl8M_Sp5SsFk0z94AaABAg.AT6-ieWj4BDAT6d_DLgElj","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzGGTmzUNHxYUgTdw54AaABAg.AT6-NSr4R9OATekkYXPCJM","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy52cHWF6LLbTD1TyN4AaABAg.AT5zrfl-eiKAT7u8r0SZB7","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
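The raw response is a JSON array of per-comment codes, one object per comment with the four coding dimensions from the table above. A minimal sketch of how such a batch might be parsed and tallied, assuming the field names shown in the sample; the two records and their `id` values below are hypothetical stand-ins, not real comment IDs:

```python
import json
from collections import Counter

# Hypothetical raw LLM response in the same shape as the sample above.
raw = '''[
  {"id": "ytr_example1", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

records = json.loads(raw)

# Tally one dimension across the batch, e.g. the coded emotion.
emotions = Counter(r["emotion"] for r in records)
print(emotions.most_common())
```

Tallying each dimension this way gives a quick sanity check on a coding batch before the per-comment results are stored or inspected individually.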