Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples (quotes and IDs truncated):

- "I'm sympathetic to people who are opposed to AI in some sense but I think there'…" (ytc_UgwEvm8w2…)
- "By giving a robot the ability to feel pain, you are essentially MAKING IT BE IN …" (ytc_UgjXCxJaU…)
- "You're technically not even stealing the prompt since you're drawing something t…" (ytc_UgxX_xVxS…)
- "I think it's called The No Cloning Theorem. AI art is a bad copy of real art, bu…" (ytr_Ugxpb6Qqa…)
- "Noooooo, the music had me ❤ Bonus points if you know where it is from ❤…" (ytc_UgxNQGqTs…)
- "Care to propose any actual jobs that humans could do in the age of AI that can't…" (rdc_kif61m4)
- "He's jealous users don't love anymore giving their data away to the US in exchan…" (rdc_m9gf5ov)
- "Add chat gpt to a Tesla robot,throw on a suit. The. You can have a army of Tesla…" (ytc_UgzpgzT3P…)
Comment
Legit question, how could AI eliminate all humans if it needs a power source and there are no humans to dig coal, fix a broken power line or fix an oil well? When he says "build new hardware" who is "building the new hardware"? Who or what fixes a processing center roof if a tornado tears off the roof? Nature ALWAYS bats last. Im very curious about the reality of this AI doom scape.
Source: youtube · AI Governance · 2025-06-21T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzN0bs51G41rymD7YN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwW5gQ13qPJ063wDPd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8uoOudT8_PC0gmZV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzw-KSz-5uyRuh0FBV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy89X0LCi34xRqU9v94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyM-GnfqnywXA7B_Tp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzE8KCg9pg7SfV0Qhd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzbsMzPFQs3QSQTIPR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyPBan2fxFW_WhDbMt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwaUQUSWgmUG_OAdZd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```