Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_Ugykz32Fa…`: "I thought you were talking about ai art and stuff like that but they shouldnt be…"
- `ytc_UgzDAhMAu…`: "Wait so true I'm gonna go borrow some ai slop someone generated. Remember gang, …"
- `ytc_UgwTEWkYa…`: ".... my name is spohia um well its just a robot Who want destoy humans…"
- `ytc_Ugyud3xop…`: "so digital art is like 50/50 ai and human art because you only need a human brai…"
- `ytc_UgyNRt541…`: "Dawkins is right, but the rest was pretty terrible. We should not be making robo…"
- `ytc_UgyXefvjQ…`: "The first time one of those self-driving trucks kills a family the family of the…"
- `ytc_UgxQi3_hZ…`: "Like any new technology we should allow AI to grow freely and if it goes off tra…"
- `ytc_UgzWk32DX…`: "AI is another tool in the carpenters toolbox, it doesn't replace the carpenter. …"
Comment

> AI increases the ROI of a person using it, people should be hiring MORE people not less. This is the time to cash in not cash out. AI hallucinates a lot so you don't want it unsupervised and just replace everyone. This is the era of big dreams and projects and riches. Everyone is missing this opportunity so I guess more for me. Hire people to use AI and maximize the exponential rise to riches. Dream BIG.

youtube · AI Moral Status · 2025-11-07T16:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzl9qCl5FDr1AMZJSZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxWcdATniuCKK78SYJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxb3M94IM7_i-UZ1A94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyEiE-N1Aot_-H0m8Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzz52KkW_UL3-JKpg14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx0NzVGKiI1g5KkdqJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz7svI4rquMj-K8SFp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyqBWaCr74ipTqUaBF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyI5aYrXREyG8qUKv14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxP2SS2P-pOaE0hRbl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
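A raw response like the one above can be parsed into per-comment codes and sanity-checked before use. A minimal sketch in Python: the field names come from the output above, but the allowed value sets are only inferred from the values observed here, not taken from the project's actual codebook.

```python
import json

# Raw model output: a JSON array of per-comment codes (two records excerpted
# from the response shown above).
raw = '''[
  {"id": "ytc_Ugzl9qCl5FDr1AMZJSZ4AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyI5aYrXREyG8qUKv14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

# Allowed values inferred from the observed responses -- an assumption,
# not the full codebook.
ALLOWED = {
    "responsibility": {"none", "unclear", "ai_itself", "company", "developer"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def validate(records):
    """Index codes by comment ID, rejecting any value outside ALLOWED."""
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

codes = validate(json.loads(raw))
print(codes["ytc_UgyI5aYrXREyG8qUKv14AaABAg"]["emotion"])  # approval
```

Validating against a fixed value set like this catches the hallucinated labels the comment above warns about (e.g. a model inventing a new `emotion` value) before they contaminate downstream counts.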