Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "I actually work for an art gallery. We recently added an Anti-AI clause to our a…" (`ytc_Ugxtbdy9i…`)
- "People gotta realize that AI is like a baby (or mirror): they act based on what …" (`ytc_UgwPWBa2E…`)
- "Personal general purpose AI assistant a la Chat GPT. It will pre-screen emails,…" (`rdc_j1xinqu`)
- "I think Reddit is perfect example of how to use AI - consensus/community learnin…" (`ytc_UgwSLzA3u…`)
- "Okay, here's a random comment on a random YouTube video that will appear crazy b…" (`ytc_UgzIIA2jT…`)
- "The main fear in regards to AI is the human reliance to AI. When a new technolog…" (`ytc_UgwbmmNZf…`)
- "😮 Disney made a deal with openai😊 AI is not a bubble it's a Manhattan project😊 c…" (`ytc_Ugw0_n2H4…`)
- "Tim, you are not even close to an expert on AI, please stop talking in terms of …" (`ytc_UgxStI5aB…`)
Comment

> I disagree. Once most people lose their jobs, the concept of "money" will rapidly change its meaning. So competition to generate more wealth won’t matter in the same way it does today, because if AI can produce abundance without human input, then clinging to outdated systems like wage-based economies becomes pointless. Instead, we’ll be forced to reimagine value, purpose, and distribution. It’s not the end of humanity.

Platform: youtube · Video: AI Moral Status · Posted: 2025-07-01T05:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyg10bTuFC7osW6pwZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgymieBgfWROVpC0DGt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwzsYsTVUzz0D9sjTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzNQxJJWTYJTy8pEdB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxt5Bt6sHT68gAES0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxDTP0DDmkIv7K9MHp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxfDnPM8xmGvcg1aaR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxONx_4BOXD2bM9wfN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxioKzRzykGqjRa0Ox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXs-5V1905ubVJ_Qt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
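A minimal sketch of how a batch response like the one above could be parsed and sanity-checked before the codes are stored. The allowed label sets here are inferred only from the values visible in this sample; the project's actual codebook may define additional categories.

```python
import json

# Two records copied from the raw response above, as a stand-in for a full batch.
raw = '''[
{"id":"ytc_UgzNQxJJWTYJTy8pEdB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxfDnPM8xmGvcg1aaR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]'''

# Label sets inferred from the sample output on this page (an assumption,
# not the project's definitive codebook).
ALLOWED = {
    "responsibility": {"user", "government", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate(batch):
    """Keep only records whose four coded dimensions all use known labels."""
    return [
        rec for rec in batch
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]

records = validate(json.loads(raw))
print(len(records))  # prints 2: both sample records use known labels
```

Records that fail validation (a hallucinated label, a missing field) can then be flagged for re-coding rather than silently written to the database.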