Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Robots and AI are meant to support human creativity. Think of them as humanity's…" (ytc_UgwE0-5Q6…)
- "Most people in the comment didnt understand it at all. What u said was perfectly…" (ytc_UgyNSy8l1…)
- "It's clear that the hyper intelligent ai deserve human respect. If humans should…" (ytc_UgzfjHkJs…)
- "AI models only respond to prompts, its not like an AI is initiating conversation…" (ytc_Ugxpe4f53…)
- "Maybe AI can be the employee, the employer and the customer. And leave the rest …" (ytc_UgxSt522j…)
- "Oh my god you can NOT be making these jokes! AI is the future! Even if it’s obvi…" (ytc_UgyQTpuzm…)
- "How do you clean the AI to have intuition mimicking capabilities when you're a n…" (ytc_UgzlyO361…)
- "Horses haven't died out just because cars exist. But they aren't the primary met…" (ytc_Ugyj0Nx6p…)
Comment
Incredible video. I've wanted people to talk about AI like this for such a long time now. So many people are so focused on what it is now and not what it will be. They'll fix the hands. They'll fix the symmetry. Then what? If it produces things that are as technically strong as a human does that make it okay now? If they remove all the environmental issues, is it ethical then? If they gain the ability to actually observe, think and learn, does that make it human? If it learns everything of the human experience mixed with thinking, is it okay?
This problem is WAY beyond just "ew hand has 9 fingies". It's about death of the author, and what is art.
youtube · Viral AI Reaction · 2025-11-20T22:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyAKBNKw0MeaIfRZgt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwZZ1yG8w1NczirftN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwBG0O6-2fQ7MjI4eJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyq5hRhG5vcBWnmoEJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzTXuAejtVb8yvTrZl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxaa8T9eQvzwjZt8nl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyqSxX7KAWqQ5mYsdR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzLKLj_hwz76CwPwc54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzvCHkmsE-Ho57gSJB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxplpxDO0dskrSFFfd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]