Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "Its just sick how these companies think they can get away by stealing art and us…" (ytc_UgwiqeNGy…)
- "@Air Ick The analogy only doesn’t work if they’re writing down what is said verb…" (ytr_UgzOBgp3V…)
- "> *AI cant create art that has yet to exist.* Eh? I mean, if you have tons, and…" (ytr_UgwwS7vsH…)
- "1:03:00 When we're angry, it's like put prism infront of robot to make decisions…" (ytc_UgwKuIYp4…)
- "Blue collar jobs will eventually be taken by artificial intelligence because the…" (ytc_Ugz9TItLQ…)
- "That’s the problem. It eliminates the very thing humanity has been known for, th…" (ytr_UgzrJM7CF…)
- "This is exactly why I busted my ass learning programming for 4 years, pivoting t…" (ytc_UgwwxDURO…)
- "Agreed about self driving cars not being the complete solution for cities, but o…" (ytc_UgxWZC46R…)
Comment
> I simple dont care because not all therapists, doctors or any experts are honest too
> Many of them lied like how chatGPT doing it
> By selling peoples emotions and information along with the dreams that often people mislead them to believe they could be something extraordinary
youtube
AI Moral Status
2025-08-27T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzhr5YiABoG43VH3st4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwb7yOgmKRYbyqtYZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxrYrmW-_5_bzZ3GhN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzv827oB3mGGRYCsPZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy4m2gpG_JAu7EawZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzR32nZ9h536SKCenJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3irVGwQOBvRW7qYl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJ28DKC06ddlWByMN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2ODPQ0c5D19iFcOB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyXK7Uxa_8P5XU5_gN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
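The raw response is a JSON array of per-comment codings, one object per comment ID, with four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of the look-up-by-comment-ID step, assuming the schema shown above (the helper name `index_by_comment_id` is hypothetical, and only two rows of the response are reproduced here):

```python
import json

# A shortened copy of the raw LLM response shown above:
# a JSON array of coding objects keyed by comment ID.
raw_response = """
[
  {"id": "ytc_Ugzhr5YiABoG43VH3st4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwb7yOgmKRYbyqtYZh4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and index each coding object by its comment ID."""
    codings = json.loads(response_text)
    return {row["id"]: row for row in codings}

codings = index_by_comment_id(raw_response)
print(codings["ytc_Ugzhr5YiABoG43VH3st4AaABAg"]["emotion"])  # indifference
```

With such an index, rendering the "Coding Result" table for any inspected comment is a single dictionary lookup by its `ytc_…`/`ytr_…` ID.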