Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
On the point around why aren't more people panicking about this - I think there …
ytc_Ugz_R4N5o…
DJI is a chinese company and China is also investing heavily in AI. In a drone w…
ytc_Ugyzw5bwZ…
Technically, ChatGPT worked correctly. Had they asked for a description, not a c…
ytc_UgxEKc3p9…
Brother, on a personal level I also think that a person does everything for their own or d…
ytc_UgxV1unIK…
Whole world: AI could help with autonomous driving, home automation
Asian parent…
ytc_Ugy_CZSvr…
if it's a joke then why are teachers getting fired? Or coworkers getting fired? …
ytc_Ugz88634A…
What do you think about AI in Radiology?
Are students avoiding this branch becau…
ytc_UgzOghM1M…
I heard Britain has already declared its intention to leave despite everyone …
rdc_et7ty3g
Comment
I'd just like to say one thing that somehow gets omitted here: humans don't only perform jobs, humans are also on the receiving end of many jobs and services: being taught, enjoying entertainment, using products, and so on. So we also have to take this into the equation. What happens when humans stop being on the receiving end? Will humans want to be entertained by AI, will we want to be taught by AI, do we want self-driving cars, do we want to read books written by AI, have paintings made by AI...
My guess is NO.
So it's not about being smarter or more capable. It has never been about that. People were never even led by the best people, for that matter. We sure won't be led by AI.
youtube
AI Governance
2026-02-16T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwrd52GADACoj2gnY94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJOV04jBIhuXSVHn14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxjtc1L3NyOKnW3x4h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyZOU4JeFPG-tBXZNV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz9HKagvL3apT_gUM54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzTHojhjHMHpNVyddt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxBpEf1-SvLMK9u-rR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy9bMX7AnTJjX7VHzh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyxPAgqhbWX7p-DIRx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgykoKeYWFPIb4inkAF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
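The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a response could be parsed and validated, assuming the four dimensions shown; the allowed value sets are inferred from the sample output and are an assumption, not the tool's actual schema:

```python
import json

# Assumed allowed values per dimension, inferred from the sample response.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "approval", "outrage", "resignation", "mixed", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject records with unknown values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
records = parse_codes(raw)
print(len(records))  # 1
```

Validating against an explicit value set catches the most common failure mode of LLM coders, namely values outside the codebook, before the records reach analysis.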