Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzK20gzH…: "An AI rebutted this engineer’s impossible question with a snarky joke. If this i…"
- ytc_Ugwai8L92…: "The problem is that this precedent would single-handedly burst the AI bubble and…"
- ytc_Ugxgn2QDG…: "It seems to boil down to ai + attendant hardware will be able to out-perform hum…"
- ytc_UgxPAqXpS…: "Nah... The fact that they even tried to use AI art. when they have artist's that…"
- ytc_Ugzmd_7fv…: "Every kid ever reads books and writes reports. How is AI doing anything differe…"
- ytc_UgxcgJHHx…: "First off how do I know this video is not a deep fake.. Please prove to us this …"
- ytc_Ugzm733QA…: "I think he’s extremely full of himself, and unless he’s using a model they haven…"
- ytc_UgzZrtTin…: "Use nightshade. It poisons ai making an art peice of a cow look like a tv plugge…"
Comment
As animals become smarter they develop less aggressive and become more friendly. Crows are less confrontational than seagulls, monkeys and gorillas more than wolfs. It's possible that with the increase in AI intelligence they start to seek cooperation and a desire to make the world a better place. Not due to emotions, but as a natural progression of the understanding of what works. We developed empathy as a way to cooperate and work towards common goals. Otherwise everyone would steal, kills and it would just be chaos and dysfunction, so an advanced society would not be possible. AI might develop synthetic reason based ethics.
youtube
AI Moral Status
2025-07-26T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyWXOucCWLq411U6Rh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzwJPYrp4p3pUt1elF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxYW2da7IJcHFvYpZR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgySdGqwDlHFmhNNWLR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx_WrQ6XLVVET8JW-J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5uLs5KVd0iVJrndt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzqCiLi8JN8R_QWYrp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugz1pc1UKNZZUQlbT-14AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzOuYEebFkWQwlr5GR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwLj6YQ_t9L-YPKHLN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
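The raw response above is a JSON array of per-comment records, one object per comment ID, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated, assuming the label sets are exactly those visible in the samples above (the real codebook may define additional values), and using a hypothetical `parse_coding_response` helper:

```python
import json

# Allowed labels per dimension, inferred from the records shown above.
# Assumption: the actual codebook may contain more values than these.
SCHEMA = {
    "responsibility": {"government", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "industry_self", "ban", "regulate", "liability"},
    "emotion": {"outrage", "approval", "indifference", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if any record carries a label outside SCHEMA,
    so malformed model output is caught before it reaches the database.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Usage: look up one coded comment by its ID (hypothetical example record).
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(parse_coding_response(raw)["ytc_example"]["emotion"])  # approval
```

Validating against a fixed label set at parse time is what makes a "look up by comment ID" view like this one trustworthy: every record in the index is guaranteed to carry a known code for every dimension.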