Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "Hey guys…do you think that maybe AI saying that it’s going to cause an extinctio…" — ytc_UgzVFu3Sq…
- "I feel like he's pulling a wizard of oz, he's asking the robots a question and s…" — ytc_UgxyJHNvd…
- "AI could do news reporter job...let's stop this before we regret what we can cle…" — ytc_UgxAojlz4…
- ""guns don't kill humans, humans kill humans" this applies to ai too. Everyone is…" — ytc_Ugz2K20x6…
- "You are so wrong AI doesn't take that much time to develop, it's gonna surprise …" — ytr_Ugw4ScTWR…
- "If AI takes away all of them, every person would live without ever needing to wo…" — ytc_Ugzk33O6n…
- "Nothing we do is green. Some forms produce Co2 all the time, the others includin…" — rdc_eudowb7
- "@DrieStone problem with the Tesla stats is the whole controversy about it disab…" — ytr_UgzRBHtsW…
Comment
The data centers ruin the landscape and use a whole lot of water and energy. Why use it when you can just *talk to a person*, which is environmentally friendly, better for privacy, not making rich assholes richer and actually beneficial to your health?
And no, AI isn't your friend. It's a computer program. Fleeing into a fantasy world is avoidant behaviour, not a solution. Talk to people. Get a therapist or find a group of people or talk to a friend. The quality of your relationships are an important predictor of long term health. Talking to friends and touch are actually lowering stress. You can't get that from a machine.
Source: youtube · Video: AI Moral Status · 2025-08-05T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxciC4iVmHqrhGOXqp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz93kNKwXxv8M43wF94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxziwlj8nC7VIZzjah4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz396nckD3f1HgEMeN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwWLmKFE7Pw6Wfn8EJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyy-MqsHisbkR1o13p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7pz_rIggPCyTnvvN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxgbvcLhJ3POfu-zPZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz93dXPlmKsw1JE1Zd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxjBAtrojMrJRqwLVV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
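A raw response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal example, not the pipeline's actual code; the allowed value sets are inferred only from the codes visible in this page and the real codebook may include more categories.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# (Assumption: the real codebook may define additional categories.)
CODEBOOK = {
    "responsibility": {"none", "user", "company", "developer", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept when it has an "id" and every codebook dimension
    holds one of the allowed values; anything else is dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid
```

For example, a row coded `{"responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}` passes validation, while a row with a misspelled or novel label is silently dropped, which keeps downstream counts clean at the cost of undercounting malformed model output.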