Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples — click to inspect
- "This is completely irrelevant. It's not like companies are replacing all the wor…" (ytc_Ugyte8-PQ…)
- "The hair is how I can tell— AI isn’t alwaus the best as making hair flow natural…" (ytc_UgyC2SxPN…)
- "I think that if you review how the human mind reacts when in a coma , you will u…" (ytc_Ugxxl0RbG…)
- "Everyone wanna knock AI but when you ask WebMD something your automatically dyin…" (ytc_UgyhH8I5r…)
- "on highways I trust AI driving more than I trust a human driving. it is funny ho…" (ytc_UgxDfGbcZ…)
- "Ai being biased??? Wtf its a damn computer a program. THE TRUTH!!! Stopping is …" (ytc_UgwO-Tmfr…)
- "LLMs are amazing: they allow us to engage in conversations that would normally r…" (ytc_Ugx3oiiMU…)
- "All of you people love ai but as soon as it says something factual you don't lik…" (ytc_UgyFczekw…)
Comment
This is the dumbest thing to be afraid of. There are so many real problems with AI, from your boss replacing you with an AI that can't actually do your job to the environmental impact of all these data centers. This type of critique is really AI hype in disguise... "This tech is just SO powerful it could end humanity as we know it...." is just another way of telling investors to put more money in it.
youtube · AI Governance · 2025-10-15T12:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwtB5eVQy7lPIc734x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwyB3uNCIehNCPHFOt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxMq5-XX6H6tldZWIN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziqkC0YbHhU5hOuQJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxDi-5UnYK-EgNK6xp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKn-gld_zenq4huw94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz8kFyjDinu74qFJDd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_MASt11VvovOmhJB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzIM0GnXuwVkN65Ln14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyD0BTu0ysPX4hosyp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
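A batch response like the one above has to be parsed and checked before its codes can populate a per-comment result table. The following is a minimal sketch of that step, assuming only what is visible here: a JSON array of records, each carrying an `id` plus the four coding dimensions from the table (`responsibility`, `reasoning`, `policy`, `emotion`). The function name `parse_raw_response` and the error-handling policy are illustrative, not part of the pipeline shown.

```python
import json

# The four coding dimensions shown in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Raises ValueError if the JSON is malformed, is not an array,
    or any record is missing its id or a coding dimension.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"response is not valid JSON: {exc}") from exc
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of records")

    coded = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record without an id: {rec!r}")
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{comment_id}: missing dimensions {missing}")
        # Keep only the known dimensions; extra keys are dropped.
        coded[comment_id] = {d: rec[d] for d in DIMENSIONS}
    return coded

# Example with a hypothetical comment id:
raw = ('[{"id":"ytc_abc","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(parse_raw_response(raw)["ytc_abc"]["emotion"])  # → indifference
```

Failing loudly on a missing dimension, rather than silently defaulting, makes it easy to spot comments the model skipped or mis-formatted before they are written back to the store.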