Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "For those who compuse how meta ai say wrong answer here the answer first he requ…" (ytc_Ugw20uHBi…)
- "Bro why would they just use AI to do their own work go to youtube or Pinterest a…" (ytc_UgyzAIYRI…)
- "Very well said. Artists have intention with their choices in how to present thei…" (ytc_Ugze0kyWx…)
- "This guy is a fraud. He's pretending to be some kind of whistleblower but he's j…" (ytc_UgzPBSBWN…)
- "Bernie, plz, who cares about working class when AI can wipe out humanity! Low, u…" (ytc_UgxHJdQnN…)
- "He’s spot on. I run my business with ZERO employees and use Anthropic for nearly…" (ytc_UgyF1x7GI…)
- "Well obviously they deserve rights, if they are conscious like us, then it would…" (ytc_UgyNPJS6E…)
- "AI is like the switch from a manual drill to an electric one. You still have to …" (ytc_UgydLO5Nu…)
Comment
of couse a soldier needs to think not all soldiers are brainless drones and the government or the military may not care about civilians but a human being will care about the civilian casualties soldier or not, what i am trying to say is how will a robot see the difference between a child, woman or combatant they can`t, a human being on the other hand can, we might not always make the right choices but we cant let a AI make them for us.
Source: youtube · Posted: 2012-11-23T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
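The dimension values above come from a fixed set of categories. As a minimal sketch, a coded record can be checked against the value sets that appear on this page — note the allowed sets below are inferred from the visible samples, not the tool's actual codebook:

```python
# Allowed value sets per dimension, inferred from the examples on this page
# (illustrative only, not the authoritative codebook).
CODEBOOK = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"approval", "mixed", "outrage", "fear", "indifference"},
}

def invalid_fields(record: dict) -> list:
    """Return the names of dimensions whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above:
coded = {"responsibility": "ai_itself", "reasoning": "deontological",
         "policy": "regulate", "emotion": "fear"}
print(invalid_fields(coded))  # -> []
```

A non-empty return value flags which dimensions need re-coding or manual review.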
Raw LLM Response
```json
[
{"id":"ytc_UgxWyB_WdWgncQqmJtx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx042Ne8UlXAF9_01l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGZKq-ZmNlstUi3-14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzKI2oHu3nLGoZ6-sd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzQ0osU3HXzJkeJlJZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxjmvwc0z2yufpgD2V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw79eYAz52yiIYoNXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQLY94VS6RzuJvovJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzonnFIo53uKhEDHch4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxd8XKoFWYSfqF68vV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
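Looking up the exact model output for a coded comment reduces to parsing the raw batch response and indexing it by comment ID. A minimal sketch, assuming only the JSON array-of-records shape shown above (the sample IDs and values here are hypothetical):

```python
import json

# Hypothetical sample mirroring the raw-response format above:
# a JSON array of records, one per coded comment.
raw_response = """
[
  {"id": "ytc_abc", "responsibility": "government", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_def", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and build an ID -> record lookup table."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

lookup = index_by_id(raw_response)
print(lookup["ytc_abc"]["policy"])  # -> ban
```

Building the index once per batch makes per-comment lookup constant-time, which is what the "Look up by comment ID" view needs.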