Raw LLM Responses
Inspect the exact model output for any coded comment; entries can also be looked up by comment ID.
Random samples:
- "A big problem is that people who are not creative will gain false confidence tha…" (ytc_Ugyupt4Qy…)
- "It's a matter of time before we have the first AI fight over a parking spot. LOL…" (ytc_Ugy7qaEWo…)
- "Counter point: AI is making people dumber. Debugging requires training your brai…" (ytc_UgwNsc2j8…)
- "Only reason the future is scary is that 300 years ago when they used to make woo…" (ytc_Ugxc9tm3D…)
- "I agree with the Savagegeese channel. The AI car problem is not going to be solv…" (ytc_Ugyq8RQeP…)
- "Been going on for about a month. Noticed several shorts Quality Enhanced. AI for…" (ytc_Ugx_TsxGU…)
- "This video is meant to be serious, but at parts (not the whole thing) it's intel…" (ytc_UgzExyXr_…)
- "Companies of all types are obsessed with replacing whatever workers they can whe…" (rdc_m6xmn4a)
Comment
"As money is not a natural law but a human invention, all the social implications which come with AI rising, could be solved in an instant, the question is, do we as the human race really want to solve it? Are we psychologically advanced enough to overcome our primitive past or not. We as the human race have the potential to do so but will we? I think that is the only thing why we should be concerned about AI evolving."
Source: youtube · Video: AI Moral Status · Posted: 2026-03-01T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzHRerdztEfbxVwzp94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy8WC6Ga2hTc1XWSLJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyzhmwMjZKNEAlyJn54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzyi6rGX5WjEVJwgix4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzItwd5phga2CZZ1qZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyQhKze8Ue9ng7UgX14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyZIHf8JPWBSGS2bP54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKXdE5ekjTgPyLhXN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw0rfX4KN4ZvYpzmdp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyZdciUVdoBb5cHTEB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
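The lookup flow above (comment ID → coded dimensions) amounts to parsing a raw response, which is a JSON array of per-comment records, into a mapping keyed by ID, discarding records that fall outside the coding scheme. A minimal sketch in Python; the allowed value sets below are only those observed in the sample output above, and the full scheme may include further values (assumption):

```python
import json

# Dimension value sets as observed in the raw responses shown above.
# The real coding scheme may define additional values (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into a
    mapping from comment ID to its coded dimensions, dropping any record
    with missing fields or out-of-scheme values."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        dims = {k: v for k, v in record.items() if k != "id"}
        if cid and dims.keys() == ALLOWED.keys() and all(
            v in ALLOWED[k] for k, v in dims.items()
        ):
            coded[cid] = dims
    return coded

# Usage with a hypothetical record in the same shape as the sample output:
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"contractualist","policy":"unclear","emotion":"mixed"}]')
print(parse_raw_response(raw)["ytc_example"]["responsibility"])  # distributed
```

Dropping malformed records rather than raising keeps one bad line in a model response from discarding the whole batch; the "Coded at" timestamp and rendering into the dimension table would sit downstream of this step.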