Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_UgwvF57Fi…: "The Terminator 2 always frightened me....even as a kid, I knew too much AI will …"
- ytc_UgyPcEfGP…: "i not enjoy this. bad juju openai will have safety parameters in place for imme…"
- ytc_UgzyX4v3V…: "I’m a software engineer and work a lot with ai tools for the past 7 months. I ca…"
- ytc_Ugx0EEjUQ…: "Graphic Design isn't dying, it's evolving. The difference from when i started 3…"
- ytc_UgxZ9ngCu…: "A nuclear war, triggered by AI would be a suicide move; the EMPs would shut it d…"
- ytc_UgzBOXooJ…: "Ai is great at applying pre-known algorythms to the problem. Ai code tends to be…"
- ytc_Ugz4pMdFV…: "It is perhaps no surprise that AI should want to stay alive and would be willing…"
- ytc_UgwmTRylv…: "Artificial intelligence can NOT reason. That is false. It can sift through immen…"
Comment
It boils down to the same thing we always ask, do we make it illegal for machines to replace human jobs. So far that answer is no. One small difference is. If no human comes up with new content to train this AI, it stagnates and can’t function. The same can’t be said with typical automation like factory robots.
youtube
AI Responsibility
2023-01-05T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyZ_hOEXI7g52j9IDB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugx3hqdpRJLZb7nXdsp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRprpYvVfshZjfhDt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyN-2-YZIG3ID0M8gF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwbhvuciQC_S-3IzY14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugz2yDZIgTnUtp3z-yN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxlK3XJy7s3BBKB90p4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwP0mDEs5ENOb8brc54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzqeEEqPHpBlyrA8vh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxwX6qclVx6KRnzaG14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
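A response like the one above can be turned back into per-comment coding results by parsing the JSON batch and indexing rows by comment ID. The sketch below is a minimal, hypothetical version of that step: the `CODEBOOK` sets are assumed from the category values visible on this page (the real codebook may define more categories), and `parse_llm_batch` is an illustrative helper name, not part of any actual pipeline.

```python
import json

# Allowed codes per dimension. ASSUMPTION: inferred from the values
# seen in this page's coding results; the real codebook may differ.
CODEBOOK = {
    "responsibility": {"none", "distributed", "ai_itself", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "indifference", "mixed", "unclear"},
}


def parse_llm_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response and index rows by comment ID,
    rejecting any row with an unknown dimension or an out-of-codebook value."""
    rows = json.loads(raw)
    coded: dict[str, dict[str, str]] = {}
    for row in rows:
        cid = row.pop("id")  # remaining keys are the coding dimensions
        for dim, value in row.items():
            if dim not in CODEBOOK:
                raise ValueError(f"{cid}: unknown dimension {dim!r}")
            if value not in CODEBOOK[dim]:
                raise ValueError(f"{cid}: {dim}={value!r} not in codebook")
        coded[cid] = row
    return coded


# Look up one coded comment by ID (single-row example for brevity):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"resignation"}]')
coded = parse_llm_batch(raw)
print(coded["ytc_example"]["emotion"])  # resignation
```

Validating against the codebook at parse time catches the common failure mode where the model invents a label outside the allowed categories, so bad rows fail loudly instead of silently skewing the tallies.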