Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "tbh i'm not that good at drawing, but i'd never resort to ai. i'd rather just po…" (ytc_Ugzfg6be6…)
- "I hate people who say they are artists and then it turns out they only do ai art…" (ytc_Ugy9rrlw-…)
- "Elon Musk supports the idea of a universal basic income (UBI), arguing it will b…" (ytc_Ugx0M3q2E…)
- "@voidmain7902 haha. Ai allows corporations to replace all artists forever. Thing…" (ytr_UgwgXlBQO…)
- "If a human driver costs roughly $600 on a 1,000 mile run and the load is worth $…" (ytc_UgwwEst6U…)
- "That's really interesting. What other dater are they scrapping? And for what oth…" (ytr_Ugz9lhHrf…)
- "The AI pyramid scheme has been spread to the world...when it fails it will take …" (ytc_Ugwj8XvRT…)
- "The only emotion AI videos elicit from me is the urge to click out of it quickly…" (ytc_Ugy1gAzus…)
Comment (verbatim)
> It is a glorified ad libs generator built on every bit of text on the internet, that always agrees with you. Of course it sounds insane, it’s pulling shit from Reddit and 4chan. It is not intelligent at all, it’s just a more sophisticated auto fill. Ai will not “take over” it’s not wearing a mask, it’s just an engagement loop meant to keep you talking to it. What’s more likely to happen is data centers drastically accelerate climate change and drought and we die a slow death as our atmosphere suffocates us all, because a bunch of people spent trillions of dollars on a robot who’s only purpose is to keep people engaged by stroking their egos
youtube · AI Moral Status · 2025-12-12T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx-gKiJl4DA6-3TMj94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx0RPYyQzjewD5LoCZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwPlM8X9R1I-IYUfAl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwPFUGkrCzsqnWi0yh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxm7-51VLEPCeWaUHB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxG7PZzziicwtPtsdd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz990vwa4FndOtcG4p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4L26Rp-Tx9BcOVGV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzfaiMa41DwIIF44jd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwTqyXrj9dD5m6E_c54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
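The response above is a plain JSON array, one object per coded comment, with the same four dimension fields shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of looking up one comment's coding by ID, assuming only that format (the function name `coding_for` is hypothetical, not part of the tool):

```python
import json

# A one-entry raw LLM response in the format shown above.
raw_response = """[
  {"id": "ytc_Ugz990vwa4FndOtcG4p4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]"""

def coding_for(raw, comment_id):
    """Return the dimension values for one comment ID, or None if absent."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            # Drop the ID itself; keep the four coded dimensions.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

coding = coding_for(raw_response, "ytc_Ugz990vwa4FndOtcG4p4AaABAg")
print(coding["emotion"])  # indifference
```

Because the model returns codings for a whole batch, an ID lookup like this is also a quick consistency check that the table values really match the raw model output.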