Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Really? AI built by racist and sexist people are now racist and sexist too? Shoc…
ytc_UgwIW655h…
@AnkitKumar-gn1ou AI at least wont be cheap
It requires vast amount of energy t…
ytr_UgwpGwvdt…
@avgjoe5969 Tesla’s collision‑avoidance system by itself does not steer the car.…
ytr_Ugx_sNwG2…
Humans can't even get their own sh$t straight and want to give AI wisdom... Hil…
ytc_Ugy6Y_jtT…
I seriously agree with all these points
I mean it has its uses, but it shouldn’t…
ytc_UgxM0CmBq…
So if i used my self driving features and it kills someone, and i go to court, w…
ytc_Ugzh1BTWw…
Sophia learning compassion. It cannot be learned. You cannot make a psychopath c…
ytc_UgyHOTAOg…
@Draconic404 instead of using ai for those things, why not just say no? If its …
ytr_UgxZQKbbc…
Comment
Big Tech trying to figure out what consumers like:
"How do you do, fellow _computer-users_"
Anyway, as someone who is fairly into tech, I would like to say: Keep poisoning. I don't give a shit how cool the idea of being able to create 'art' is when it directly harms people. The amount of people I've heard disparage artists _in real life_ while seemingly completely forgetting who it was that got them through covid... Video games, books, tv, youtubes... Do I sometimes wish my favourite book would have a sequel? Yes, all the fucking time. Would I enjoy a sequel made by an AI? Probably not. The AI would just lose the plot after the first 10 pages, lol.
youtube
Viral AI Reaction
2025-04-06T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzCMpQ3YQB97W0NJ3x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy9P2UNfUQ7ejWH8qB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzgqZCnnB4OzSfz9cR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXsCG9Fc7ulHjIEft4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4AwyC6_mi6q6-_th4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySV8-jdFSOr-7yl014AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx1LtJ0uN_kJiYAxgR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzfEMqRyZzmn-0KIT14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6MTkRuKo8J0V26wl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwKeKk4hxKw4gJHmlN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
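The lookup-by-comment-ID step shown above can be sketched in a few lines: parse the raw model response as JSON, validate each entry against the coding schema, and index by `id`. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown; the validation rule and the function name `index_codings` are illustrative assumptions, not the pipeline's actual code.

```python
import json

# Raw model output, truncated here to two entries from the response above.
raw_response = '''
[
  {"id": "ytc_UgwKeKk4hxKw4gJHmlN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzfEMqRyZzmn-0KIT14AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
'''

# Schema inferred from the response shown; adjust if the real pipeline differs.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model output and index each coding by comment ID,
    skipping any entry that does not match the expected schema."""
    codings = {}
    for entry in json.loads(raw):
        if isinstance(entry, dict) and set(entry) == EXPECTED_KEYS:
            codings[entry["id"]] = entry
    return codings

codings = index_codings(raw_response)
coding = codings["ytc_UgwKeKk4hxKw4gJHmlN4AaABAg"]
print(coding["policy"])  # regulate
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each dashboard row is a single dictionary lookup rather than a scan of the full response.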