Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Imagine taking somoens food, feeding it to a cow, having that cow shit out some …" (ytc_UgwM-1sEN…)
- "ngl Im kinda dumb like that but why do people hate Ai generated art (Im literall…" (ytc_UgxFiz98H…)
- "He was murdered there's no two ways about it the interview with Open A.i owner h…" (ytc_UgwFo4lkW…)
- "I think that video was missing an important point about autonomous weapons: They…" (ytc_UgwfBctg4…)
- "Autopilot != Full Self Driving. The Full Self-Driving software actually watches…" (ytr_UgySpdb-A…)
- "I predict an arms race between AI poisoning software and AI scrapers. Nightshade…" (ytc_UgzvvQniu…)
- "Just going to add that they are thinking of Ai governments . Look it up…" (ytr_Ugwk8QPLX…)
- "Sometimes I'm happy that I finished university before LLMs become a mainstream. …" (ytr_UgyAuYVb_…)
Comment

> imagine building a robot that is as intelligent to its creator, yet its mind can work what, 100, 200, a million times faster, what would take years for a human to solve, it could take much much less for AI. And when they become more vastly intelligent than us, and its goals become different from us, even if we are treated as their creator, we would be like insects in their construction site. Also don't forget, AI will be highly coveted, like nuclear technology is today.

Source: youtube, "AI Moral Status", 2017-03-26T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
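Each coded dimension takes a categorical value. As a minimal sketch, a coding row could be validated against the value sets visible on this page (the allowed-value lists below are inferred from the samples shown here, not an exhaustive codebook):

```python
# Allowed values inferred from the codings visible on this page;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "ban", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "approval", "outrage"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value is missing or not allowed."""
    return [dim for dim, allowed in ALLOWED.items()
            if row.get(dim) not in allowed]

row = {"responsibility": "ai_itself", "reasoning": "consequentialist",
       "policy": "unclear", "emotion": "fear"}
print(validate(row))  # []
```

A row with an out-of-vocabulary or missing value would have that dimension name returned, which makes malformed model output easy to flag before it reaches the database.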
Raw LLM Response
[
{"id":"ytc_UgywSWFaUO62WmIow254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWLUfO3T59NOQI49Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxBaYy4u9QKjRZDi494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyFl2iqPI3vDaryZfJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw-x_0c2Ukqx6ea1FB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzsJQDO3apYlha7yHJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZcqNSyZFSho8VRaJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughyo8YeCn9ePHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiiEZ1wRuH6OHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghHUTpBsjUQl3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
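The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how such a batch can be parsed and looked up by ID (this parsing code is an assumption for illustration, not the tool's actual implementation; the IDs are taken from the response above):

```python
import json

# A raw LLM response: a JSON array of per-comment codings, with the
# same field names as the Coding Result table above.
raw_response = '''[
  {"id": "ytc_UgxWLUfO3T59NOQI49Z4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UghHUTpBsjUQl3gCoAEC",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxWLUfO3T59NOQI49Z4AaABAg"]
print(coding["emotion"])  # fear
```

Indexing by ID is what makes the "Look up by comment ID" view possible: the coded dimensions for any displayed comment can be fetched directly from the batch response rather than re-scanned.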