Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I talk to chatgpt like its a friend i trust & have tons of respect for because I… (ytc_Ugy_U7rsN…)
- People are such dumb 💩’s if they cannot see where this AI and robot game leads….… (ytc_UgwuJSbsY…)
- There is no such thing as "artificial intelligence". AI is simply the software… (ytc_Ugy6n3NAc…)
- I mean cant you create this whole video with AI so how are we supposed to believ… (ytc_UgytJ_Qhy…)
- DON'T USE AI BECAUSE IT'S HURTING THE POLAR BEARS BECAUSE AI USES UP A LOT OF WA… (ytc_UgwRSQVPu…)
- well one qustion if AI is going to take all of jobs every one is jobless so now… (ytc_UgyQo6pQy…)
- When the sun rises in the West and sets in the East. When the seas go dry and mo… (rdc_ibe79e3)
- Whoever’s making these AI videos is doing it to not just black women, but white … (ytc_Ugy2YSS1W…)
Comment
> there's something deeply disturbing about how he was handwaving away the idea that with older tech a lot of people died before we made them safe as if that's okay for them to keep products on the market right now that are known to be extremely dangerous to users. How many times to people need to end themselves before they say "maybe this product needs to be taken down and reevaluated"
>
> additionally his comparisons are a bit lacking. no one was pushing for everyone to fly in a plane while we were trying to figure out how to build them safely. Not everyone was driving cars before we figured out safety laws for them, and neither of these products were as threatening to society as the implications of the AI they want to build are
youtube · AI Jobs · 2025-11-18T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzb8vZna5IX4l3mh_B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwMjZYwgxCTfylsFyV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwCEhREAisUFcNDlOB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgywpDtnoS-uJB6DEhR4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwgH27YD1ztiKzB5kJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyOE2m8sr0tavA3lel4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgymRNy4VhMBvlwxxlB4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwDC4YhhNe8EjYxHdF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwJXGJ2bNeuDyyqIyl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugww54UTLVgwJlRFRX94AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
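The raw response is a JSON array with one object per comment, keyed by `id`. A minimal sketch (an assumption about the pipeline, not its actual implementation) of how such a response could be parsed and indexed for the per-comment lookup shown above, using Python's standard `json` module:

```python
import json

# Example raw model output: a JSON array of per-comment codings.
# This single-element sample reuses one full ID from the batch above.
raw_response = """
[
  {"id": "ytc_UgwgH27YD1ztiKzB5kJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(raw_response)
row = codings["ytc_UgwgH27YD1ztiKzB5kJ4AaABAg"]
print(row["responsibility"], row["emotion"])  # company outrage
```

In practice a parser like this would also need to handle malformed model output (e.g. wrap `json.loads` in a `try`/`except` and re-prompt or log on failure), since nothing guarantees the model always returns valid JSON.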