Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect
- And this is the kids of stuff that ai hacking or AI social engineering will be, … (`ytc_Ugw2NPUEw…`)
- They often try to say that "AI art is hard", then someone made a video where the… (`ytr_UgyF3GQFA…`)
- The graphic looks like that creepy judge from Florida, Canon or whatever, the on… (`ytc_UgzI4MeMu…`)
- As someone who recently started messing around with LLM APIs in my projects and … (`rdc_mleethp`)
- One thing an artist is doing is uploading two images that connect to form a full… (`ytc_UgxIUPuLy…`)
- I *hate* this type of AI technology. As soon as these things are given access to… (`ytc_UgzK2zSZW…`)
- AI could be harmful ::: IF covert fed bad information in a way that the processi… (`ytc_UgypxIU27…`)
- Has anyone considered the fact that the owners of AI's are corrupt themselves, a… (`ytc_Ugz2ciAaQ…`)
Comment
Automated robot soldiers would be the perfect gun for a government. A government would no longer need to train humans to be killing machines. A government can implement these for possible civilian security meaning no need to have human police. These machines will happen, because humanity is raised to be insane.
Source: youtube
Posted: 2012-11-23T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxWyB_WdWgncQqmJtx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx042Ne8UlXAF9_01l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGZKq-ZmNlstUi3-14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzKI2oHu3nLGoZ6-sd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzQ0osU3HXzJkeJlJZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxjmvwc0z2yufpgD2V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw79eYAz52yiIYoNXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQLY94VS6RzuJvovJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzonnFIo53uKhEDHch4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxd8XKoFWYSfqF68vV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
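Before a batch like the one above is stored and indexed by comment ID, it is worth validating each record against the codebook. The sketch below is a minimal, hypothetical helper: the allowed values per dimension are inferred from the samples shown on this page, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the actual codebook may include more categories than these.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"approval", "mixed", "outrage", "fear", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)  # raises if the model emitted malformed JSON
    coded = {}
    for rec in records:
        # Collect any dimension whose value falls outside the codebook.
        bad = [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]
        if bad:
            raise ValueError(f"{rec.get('id')}: invalid value(s) for {bad}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record batch, in the same shape as the raw response above.
raw = ('[{"id":"ytc_X","responsibility":"government",'
       '"reasoning":"deontological","policy":"ban","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded["ytc_X"]["policy"])  # → ban
```

Failing loudly on out-of-vocabulary values is deliberate: silently coercing them would corrupt the coded dataset, whereas a raised error flags the batch for a re-prompt or manual review.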