Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyAOz28t…: "We have a number of asteroids traveling through our galaxy right now and it will…"
- ytc_UgwF_AkV9…: "And after all the jobs are taken by AI, are the robots and computers going to bu…"
- ytc_UgzZf8PiP…: "I drive a truck. Good luck AI handling that. Thank you Elon for proving for almo…"
- ytc_UgzpqnEr8…: "Some great tips there for seeing fake AI. Handy for when I'm not sure if I'm wat…"
- ytc_Ugy6OjNJf…: "This is literally what Marx spoke about. Its built into capitalism as an inheren…"
- ytc_UgxK1Z0MW…: "He's talking all this and he's gonna launch his own version of AI. He's a little…"
- rdc_grs2atw: "I would, and do, literally expect people to cause death in order to profit from …"
- ytc_UgxQ4DmOO…: "I feel like there's a bit of a double standard being applied to Tesla here. When…"
Comment
0:09 lol, is this a psyop?! In defense/war business "AI" is nothing but a buzz word, they use to call it algorithms, it has nothing at all to do with LLMs or image generation. Sometimes they hide behind it like "AI analysis told us to strike that target" so they don't have to take responsibility but it really was just negligence.
Or what other AI are we worried about? Some machine learning algorithm that will follow drones lead by one polite or something? How is that bad?!
The whole AI will take over the weapons idea is pure fiction.
youtube · Viral AI Reaction · 2025-11-23T22:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzQz1IPE8tbMROs0ZZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyEOhDc8JGxf4bVKDp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyiSdB1hRKBShRGvt14AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwUD9YCy6SXIWlajR54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzyByMbLscB6dtIDK94AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx0ZlobxqFdctlYQOt4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz2a_Co3pq5136Twzx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxtfUZMaiW2-Lz8OMN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwhyd0FXUd38Z2BbNN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxIkE3h4f7yGd4sfqZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
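A raw response like the one above is a JSON array of per-comment codings, which makes lookup by comment ID straightforward. Below is a minimal sketch of parsing and validating such a batch; the `parse_batch` function and the `SCHEMA` table are illustrative, and the allowed values per dimension are inferred only from the responses shown here (the actual codebook may define more categories).

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# above. This is an assumption, not the tool's official codebook.
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "unclear"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response (a JSON array of codings) and return
    an index keyed by comment ID, for lookup by ID."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        cid = rec["id"]
        # Reject values outside the expected categories.
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        index[cid] = rec
    return index

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_abc","responsibility":"government",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
codings = parse_batch(raw)
print(codings["ytc_abc"]["policy"])  # liability
```

Indexing by ID up front means repeated lookups are O(1), which matters when cross-referencing codings against thousands of sampled comments.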