Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgyydyrWx…`: "Then why did you create that shit if it's so dangerous. I more and more think th…"
- `ytc_Ugxydi1KJ…`: "Not even an ai problem, its a lazy idiot problem, these people are supposed to b…"
- `ytc_Ugy6j0M1X…`: "20!yrs from now I don’t think technology will exist / Inshallah this Ai technolog…"
- `ytc_Ugwp9XhcZ…`: "The big concern I still have about driverless vehicles on our roads is that they…"
- `ytc_UgyR3vUpD…`: "I am not a gifted artist, ive spent my whole life pretty much drawing, and most …"
- `ytc_Ugya7PyGL…`: "He said they won't unplug him because they want him to put on a good show!!! But…"
- `rdc_dv0yt0z`: "Holy geez 68. / In canada I think 37.5hr is standard... and even that I find long…"
- `ytc_Ugx_tX490…`: "I believe the biggest regulation would be putting a cap on the capacity of data …"
Comment
In a time before we had a new "Dead Internet Theory" video popping up every week I would use a robot emoji to call people on Facebook bots when they were totally unable to get that something is a joke.
Relating to that, you seem to also not see that those skits you showed clips of at the beginning all seemed very sarcastic. I'm pretty sure almost all of those people are making more of a joke about the concept of robot racism and mocking the extreme AI haters than being truly genuine.
Maybe because I'm between the luddites and the cogsuckers (AI-lovers) I have the perspective to see what's going on.
youtube
2025-10-15T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz4QudItgCPxoMCH8l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugzh4tfFhWFAtwQr8A54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzgq6KqJfStDzXXasN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxs2Afx2B9ZJMFLH1t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxbUmIwhHYemln_fb94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwHVs9vUsWwaPWtskJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwubtoYc80dIoNTFyh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzOi86OEZHgZQ7XFRJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx2L7Qi7aOfYNaUkI54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgweBCjW8lO9FX9Mi694AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
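The "look up by comment ID" flow above (raw JSON array from the model → one comment's coding) can be sketched in Python. This is a minimal illustration, not the dashboard's actual code: the function name and the abbreviated two-entry payload are hypothetical, but the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the raw response shown.

```python
import json

# Abbreviated batch of codings in the same shape as the raw response above.
raw_response = '''
[
  {"id": "ytc_Ugz4QudItgCPxoMCH8l4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugzh4tfFhWFAtwQr8A54AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
'''

# The four coding dimensions, in the order the result table displays them.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str):
    """Parse a batch of codings and return the entry for one comment ID, or None."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            # Keep only the coding dimensions; fall back to "unclear" if one is missing.
            return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}
    return None

print(lookup_coding(raw_response, "ytc_Ugz4QudItgCPxoMCH8l4AaABAg"))
# {'responsibility': 'company', 'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'indifference'}
```

Defaulting a missing dimension to `"unclear"` mirrors how the coding-result table treats unresolved fields.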