Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@karnubawaxNo it doesn't and no it isn't. There are physical limitations to com…
ytr_UgxB574rB…
Haha, that's a fun way to put it! Sophia does have that humanoid charm, doesn't …
ytr_Ugz6p4IZ9…
I feel like there’s nothing wrong using ai art for fun but using it for a busine…
ytc_UgxB_Ymtt…
Fascinating interview. And kinda scary, because I can see his point of view on A…
ytc_Ugxqg1MC5…
I fully support her ambitions. I say we should improve AI so that they might get…
ytc_Ugwnr35LY…
I thought it was the ulpu dude but that's isn't a AI artist it's a AI music arti…
ytc_UgyMoF3_7…
The problem isn't AI, it's greed. Instead of allowing people to use AI to be mor…
ytc_UgxtT7inY…
According to a channeling conversation between two entities, many humans imply t…
ytc_Ugwvv_0Lv…
Comment
If AI takes all the jobs and humans are all unemployed, there won't be any money for people to buy anything or pay for anything. Do companies that are replacing workers with AI think about who will buy whatever they are selling if no one has a job? I mean, if companies have no customers, they will eventually close, and so there will be no jobs for AI anymore either!
youtube
AI Governance
2026-01-31T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
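A coded row like the one above can be checked against the codebook before it is stored. The following is a minimal sketch; the controlled vocabularies here are inferred only from the values visible on this page, and the real codebook may allow additional categories:

```python
# Allowed values per dimension, inferred from the sample output on this page
# (assumption: the actual codebook may define more categories).
VOCAB = {
    "responsibility": {"company", "developer", "ai_itself", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "industry_self", "ban", "regulate"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

def validate(row: dict) -> list[str]:
    """Return the dimension names whose value is missing or outside the vocabulary."""
    return [dim for dim, allowed in VOCAB.items() if row.get(dim) not in allowed]

# The coded row from the table above passes validation.
row = {"responsibility": "company", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "fear"}
print(validate(row))  # []
```

A non-empty return value flags exactly which dimensions need manual review, rather than rejecting the whole row outright.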
Raw LLM Response
```json
[
  {"id":"ytc_UgzR2ZOdWfJQ5rlyf8l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw9PCispgH71hkMJ1x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgylzeksQIF6ejtwn3J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwKwFILmBYPFaS9TIp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyGKoZrCCq77Wtq7bB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyX2ocjv3xL9HpGc3B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwQ5-xJ7Pkb29sacld4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzHyl_rm_6Hjc3oXa14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyyzNaGW7x3LIE8KrF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_c6u1DWZtgQXkdW54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```