Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:

- “The caption shouldn’t be he scares everyone — it should be the reason why ai s s…” (ytc_UgwQc_Ngt…)
- “So this is a private school.... I hope. I mean they're using AI to teach the st…” (ytc_UgwNPPqbT…)
- “BRO KEEP GRINDING BROO, Appreciate your content bro, been a cold caller to new y…” (ytc_UgxLZE5st…)
- “When I went to art college we were made to copy other works and styles, and I am…” (ytc_UgyKf80D1…)
- “Imagine flaming somebody for learning a foreign language when you could just goo…” (ytc_UgxsI7VmU…)
- “Well yes... if you don't want to train AI don't post art. That's kinda how it wo…” (ytr_UgzXz2Tqa…)
- “@Chaki21 I think you're right, overall, but I also think that it will take many…” (ytr_UgwjgT2xg…)
- “Great point of view. Too many creators go for the lowest hanging fruit of just m…” (ytc_UgwkwEprE…)
Comment
You people are all losing the plot of this.
It's not like the drone will be operating with no authorization, if it decides to kill someone in the area that was deployed in, this is not how it works. When human deploys the drone into the area, the human *automatically* gives it authorization to kill all the enemies in the area because the drone operates independently. Same situation if you deploy a squad of people and you don't keep communication for security reasons, so they're operating in the dark, and they have orders to hold some area and they spot enemies, and they engage them and kill them. Does it mean that they operate with no authorization as well? No, you deployed them, you gave them authorization
Platform: youtube
Timestamp: 2024-08-13T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgzxGKMkgqoV0pBS8p54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzia2K7U580HJ4Lycx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwRhPADt1MShC9qPAh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxUZqMDwEFLj6P8wAt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwwgNL5xYsiwehxT8V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzwWwlICTpxuMS3GrJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwU3oSiRNJiPafd4g14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwyYFZInXTOR1ztDlB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyzw5bwZk_3BE2LcSt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy2u_taAI4q-n-QQIN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}]
```
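
The raw response is a JSON array of per-comment codings keyed by comment ID, so recovering the coding shown in the table above is a matter of parsing the array and indexing by ID. A minimal sketch (variable names are hypothetical; only two entries from the array above are reproduced for brevity):

```python
import json

# Raw LLM response: a JSON array of per-comment codings,
# in the same shape as the full response shown above.
raw_response = """[
  {"id": "ytc_UgwU3oSiRNJiPafd4g14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy2u_taAI4q-n-QQIN4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "resignation"}
]"""

# Index the codings by comment ID for direct lookup.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

# Look up the comment coded in the table above.
coding = codings["ytc_Ugy2u_taAI4q-n-QQIN4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → user resignation
```

Indexing by ID rather than scanning the list also makes it easy to detect duplicates or missing codings when reconciling the raw response against the stored coding result.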