# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a record by comment ID.
## Random samples — click to inspect

- "What if we made tracks for self driving cars so that way people know to keep out…" (`ytc_Ugyiwi4K5…`)
- "Haha! Those beings are HILARIOUS. Hahaha. That guy goes to start a debate asking…" (`ytc_UgxRXsAe2…`)
- "What if AI figures out Nicholas Tesla's free energy? They won't need humans at …" (`ytc_UgxwRI8HE…`)
- "Just unplug the wifi and delete the AI 💀 Wtf u mean by “Alien Invasion” 😭 They d…" (`ytc_Ugw-8VJu3…`)
- "Imagine AI decided humanity is worth saving and turned itself OFF, and hundreds …" (`ytc_UgzjKcHif…`)
- "@craigbritton1089 AI wont blow up in the middle of an innocent crowd in the name…" (`ytr_Ugy6DgCQV…`)
- "Individuals *need* tech regulation, esp. w/AI, privacy, hacks, data centers, wat…" (`ytc_UgwIrqVmi…`)
- "AI or what these technocrats are calling AI! is a COMPLETE FUCKING LIE! there is…" (`ytc_UgwO6Mt1f…`)
## Comment

> "AI" could become a powerful tool in the hands of humanity, optimizing and speeding up the workflow. The speed of production of goods, scientific developments, etc. could seriously increase. But today's system doesn't need a lot of goods, it needs deficit to keep prices down. In the case of "AI", the system continues to work as before, looking for ways to make short-term profits by reducing staff. She doesn't need art, she needs a product. And I system doesn't care that a couple of decades will pass, and there will be almost no artists left. The main thing is to benefit today, and it doesn't matter what happens tomorrow.

youtube · Viral AI Reaction · 2025-04-05T12:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
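The table above records one value per coding dimension. A minimal sketch of checking such a record against a closed codebook, where the allowed value sets below are inferred only from the labels visible on this page (an assumed subset, not the full coding scheme):

```python
# Allowed values per dimension, inferred from labels visible on this page.
# This is an assumed subset of the codebook, not an authoritative definition.
CODEBOOK = {
    "responsibility": {"none", "company", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the codebook."""
    return [(dim, record.get(dim)) for dim in CODEBOOK
            if record.get(dim) not in CODEBOOK[dim]]

# The coded record shown in the table above.
coded = {"responsibility": "company", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "outrage"}
print(validate(coded))  # -> [] (all values fall within the assumed codebook)
```

A non-empty result flags an off-codebook value, which is the usual failure mode when a model free-texts a label instead of picking from the allowed set.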
## Raw LLM Response

```json
[
{"id":"ytc_UgwX6y75YCLzaOBf0k94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMUm_WruWAZ_GIVGB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzlmNunsIme73Zta9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwkXf1n8KErXb8meM14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxsSBJ4lAye_VwL9Ap4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwxCNC2TwSU2jWt-Z94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwG4rbBvagYTF_6D8d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgycLIheFsRRHgIUjQJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw-L3JtQvRNe7w_B994AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgykjdLCd_1dOcuI4-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
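The raw response is a JSON array with one code object per comment, keyed by comment ID. A minimal sketch of parsing such an output and indexing it for ID lookup (field names taken from the response above; `index_by_id` is a hypothetical helper, not part of this tool):

```python
import json

# A two-record excerpt with the same structure as the raw response above.
raw = '''
[
 {"id": "ytc_UgwkXf1n8KErXb8meM14AaABAg",
  "responsibility": "company", "reasoning": "consequentialist",
  "policy": "regulate", "emotion": "outrage"},
 {"id": "ytc_Ugw-L3JtQvRNe7w_B994AaABAg",
  "responsibility": "government", "reasoning": "contractualist",
  "policy": "regulate", "emotion": "mixed"}
]
'''

def index_by_id(raw_json: str) -> dict:
    """Parse the model output and index each code object by comment ID."""
    records = json.loads(raw_json)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
print(codes["ytc_UgwkXf1n8KErXb8meM14AaABAg"]["emotion"])  # -> outrage
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse, then constant-time dictionary lookups.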