Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgxmtXeWk…`: "Auto pilot is not the same as the current supervised full self driving software.…"
- `ytc_Ugw2D7nZo…`: "What about the concept of humans using AI to kill other humans? And it doesn't h…"
- `ytc_Ugx9eWwFX…`: "Hey bro. Want to team up? Getting an ai integration biz off the ground rn…"
- `ytc_UgxZ5SO_N…`: "If the AI does make it so that we have all our needs met, that would be good!…"
- `ytc_UgxbSmfan…`: "Ai? / r u thair / i qiestion the validity of content being reprised to thee... Q?…"
- `ytc_Ugx2kNNG7…`: "I think a lot of the time, the code generated by ai involves the ai making a lot…"
- `ytr_Ugx0DQEhX…`: "@Mr_Nerelevantny well yes because one wasn't generated or trained on stolen tal…"
- `ytc_Ugw-w1o0n…`: "If AI will replace everybody and place every human in unemployment because AI wi…"
Comment

> I LOVE this! I keep telling my company using AI is unethical but it keeps falling on deaf ears. I want every artist to do this! Make this stuff unusable! I hope people can to this with stolen text content too, that’s how my company uses AI most and I want gone sooo bad.

youtube · Viral AI Reaction · 2024-10-20T20:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz653Ny6AqdsvS9fBh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxiBIYyaMjJYgHyt5p4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz73T4xibXEWIXfo_t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyscK7IoVIGY_DXI614AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"amusement"},
{"id":"ytc_Ugye6rI2Ec8bkIK1xRZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx10zuAutSzhqs-OoJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx1VIzPNI6i5ZCNegF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzsv-pLrBQdwlLI6bV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyUMs-_KK3RQlwpQIV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy0WaW4f9EYOgUHgzt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
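A consumer of these raw responses has to parse and sanity-check them before storing the coded dimensions. Here is a minimal sketch of such a validator; the allowed value sets are inferred only from the sample response above and likely undercount the full codebook, and the `ytc_`/`ytr_` id-prefix check is an assumption based on the ids shown:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed", "company"},
    "reasoning": {"unclear", "virtue", "deontological", "mixed", "consequentialist"},
    "policy": {"unclear", "none", "regulate", "ban"},
    "emotion": {"indifference", "resignation", "outrage", "amusement"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        # Comment ids in the samples start with ytc_ (comment) or ytr_ (reply).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgyUMs-_KK3RQlwpQIV4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
coded = validate_response(raw)
print(coded[0]["policy"])  # → ban
```

Rejecting malformed records up front (rather than silently writing them to the results table) makes a bad model response visible at coding time instead of at analysis time.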