Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "This is exactly how Apple Intelligence wraps your requests in prompts towards th…" (ytc_UgxIqvNkg…)
- "some of these people about to lose their way of life voted for this to happen, a…" (ytc_UgxJNVP7U…)
- "My problem with AI is the built-in bias - the old GIGO thing (garbage in garbage…" (ytc_Ugzgea4gX…)
- "I do live visuals for concerts, and AI generated video is super prevalent. Imagi…" (ytc_UgxXTGMFa…)
- "I think we will all cook when AI energy needs heat this closed system to the boi…" (ytc_Ugybyi6TT…)
- "But don't you ever take any of your 3D screenshots and mess with them. Your styl…" (ytr_Ugxnw7uEF…)
- "AI is used as a convenient excuse to fire people, these jobs are actually being …" (ytc_Ugy2CmzZP…)
- "the elephant in the room is that by far the most monetizable aspect of ai is cor…" (ytc_Ugxc47eHM…)
Comment
asking the wrong questions, the main question, what happens to humans when AI and robots take over and work for us, but we make no money to afford these things. The new butler robot costs $20,000, the telsa car costs $100,000, only 10% of the entire US can afford that, and still live comfortably. What happens to the rest of us and the future if we can't afford it. More important, when the mobile ID kicks in, and you MUST have it to buy food, what will happen to most of us. It's like someone sat down, watched all the futuristic movies ever made, and decided, I want to make those things come true, without pondering the consequences. Be careful, we either become Galaxy Express 999 or Cloud Atlas.
Sometimes the best choice, is to choose not to play at all. How about a nice game of chess.....
youtube · Cross-Cultural · 2025-11-14T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzCgiLrGshOrXiuFOl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0T-N6DIC_CetKEah4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy5oYZ0RhCL5n2DAnJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyd7-G37GD5gFwJUZV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFqxqkCGcix7WP5gF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxQtPUw0-9v5ua176l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHUmeOwYPdRJV9Fu54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKdqM8kuPPhV5WXTp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzxFONxRfWv5ibSs0N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKk7_zR0Yx_2JFHWx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
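The lookup-by-comment-ID view above amounts to parsing a batch response like this one and indexing it by the `id` field. The sketch below is illustrative only, assuming the raw response is a JSON array of objects with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys shown; `index_codings` is a hypothetical helper, not the dashboard's actual code.

```python
import json

# Two rows copied from the raw response above, for demonstration.
RAW_RESPONSE = """
[
  {"id":"ytc_UgzCgiLrGshOrXiuFOl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0T-N6DIC_CetKEah4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)

# Looking up the comment shown in the Coding Result table recovers
# the same dimension values (responsibility=distributed, emotion=fear).
coding = codings["ytc_Ugz0T-N6DIC_CetKEah4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed fear
```

In practice a real response may be malformed or incomplete, so production code would wrap `json.loads` in error handling and validate that every expected ID is present before recording the codings.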