Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Art isnt just about the final product, its also about the work it takes to get t…
ytr_UgxQ_tTZG…
AI is being used for a genocide NOW! What are you talking about in the future? A…
ytc_UgxW5Qt8H…
Yet another stupid pathetic rubbish unwanted hated crap AI video.the uploader ma…
ytc_UgwJgixoS…
@someone-ji2zb Proof of any actual AI "Artists" which are actually interesting o…
ytr_UgxHDfB0z…
Tried several AI tools, but I chose Pneumatic Workflow for its human-in-the-loop…
ytc_Ugz7cB6Ry…
Ai isnt just a tool. You can't call the user of ai a creator just because the ha…
ytc_UgyDesh_m…
@JustAGuy_AtYT most likely nobody, these ai companies tend to be chaotic with wh…
ytr_UgysD9p3x…
Even if they did a little effort, the AI itself still looks soulless for me. 💀
…
ytc_Ugys0KTnD…
Comment
Human civilization consists of individual humans. Some of their decisions are wise and ethical, but great majority of human decisions are dictated by evolutionary randomness and intellectual short-sightedness. Still, our civilization is organized into institutions that are not as stupid as a random sample of individuals. Humans are currently crucial at teaching and testing AI ethics. Once AI is better at ethics than AI-less part of humanity as a whole, I do not see a reason why we should remain much more valuable than gorillas.
youtube
AI Governance
2025-12-10T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxOJV2tBuwHtVW04ft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
{"id":"ytc_UgwBO7nHKh19jrnHqTZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxUWAa0kQDBr4n_r_94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx-j8oIbBt7y6qbyol4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwHM7-46f0mdspqQ994AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxlocDupn-kUK_ZRhd4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw2Sks2eioeIZvULnB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzGDer4ewj9rBn5hj54AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx2GcL9O8E5IytzG9h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyX6rXQCcRBysgEW-Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
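The raw response above is a plain JSON array of per-comment records, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch can be parsed and a single comment's coding retrieved — `lookup_coding` is a hypothetical helper, not part of the tool, and the inlined record is one entry copied from the response above:

```python
import json

# One record copied from the raw LLM response above; a real batch
# would contain the full array.
RAW_RESPONSE = """[
  {"id": "ytc_Ugx-j8oIbBt7y6qbyol4AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "liability", "emotion": "indifference"}
]"""

# Dimensions every record is expected to carry, per the coding-result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse a batch response and return the record for one comment ID."""
    records = json.loads(raw)
    by_id = {r["id"]: r for r in records}
    record = by_id[comment_id]  # KeyError if the model skipped this comment
    # Fail loudly if the model dropped a dimension instead of coding it.
    missing = [d for d in DIMENSIONS if d not in record]
    if missing:
        raise ValueError(f"record {comment_id} missing {missing}")
    return record

coding = lookup_coding(RAW_RESPONSE, "ytc_Ugx-j8oIbBt7y6qbyol4AaABAg")
print(coding["policy"])  # liability
```

Checking for missing dimensions at lookup time is a deliberate choice here: LLM batch outputs occasionally omit fields, and surfacing that as an error keeps a silently incomplete coding from reaching the result table.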