Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
So wait it can’t vacuum? I’ll put my card away, thank you 😂 almost had me this t…
ytc_UgwVB7IYs…
Bro ai you used The ai from now can do anything perfectly if you say it right…
ytc_UgxveOeAb…
A crash will be imminent no matter what. Either AI crash. Or our planet, civiliz…
ytc_UgxEvE1wk…
Thank you so much for making this video!! Even if I did want to see “perfect” ar…
ytc_UgywcDOPT…
Selwyn Raithe warned years ago that AI could be humanity’s “final invention.” Wa…
ytc_UgxM2_OFh…
Too rosy tinted glasses? Not all students may be amenable to improved learning m…
ytc_UgzHP61b7…
Good luck and all that, but money can't stop money from being a planet destroyin…
rdc_esrxey6
These cases stack up every day lately. I know a recent case the ai company is wi…
ytc_UgyHMosZY…
Comment
There’s a fundamental problem with the idea we'll be sacked for AI - AI doesn't consume. Humans do.
Without consumption we don't have an economy. Without an economy even openAI won't be able to sell anything and will be worthless. It's as simple as that.
I don't doubt there will be displacement.
I just don't see it happening this catastrophically rapid way.
The tech isn't good enough.
The incentives contrast.
It would only happen if the tech leapt forward in the next year and we were stupid.
youtube
Cross-Cultural
2025-10-17T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy61aQl5ybpasdqCjh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwDCzKq4dIUwFMLIt94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4GcV4lndSaMBRsuV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzIp7G9x-pJTaYYbw54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzBavc6HLhX4zRPSTN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3d3QNDM5pnMjEM3l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyr7DkL5y6jUX3GXV94AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyYwJFwZbVR9nFvM_p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzhMg-8llznVkNnifd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwJM4ycYhoimaCNTQl4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"}
]
```
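A raw response like the one above is a JSON array with one record per coded comment, keyed by `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The sketch below shows one way such a response could be parsed and screened for malformed records; `parse_codes` and `REQUIRED_KEYS` are hypothetical names for illustration, not part of the tool, and the two-record sample is abbreviated from the full array.

```python
import json

# Abbreviated raw model output: a JSON array of per-comment codes.
raw = '''[
  {"id": "ytc_Ugy61aQl5ybpasdqCjh4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwDCzKq4dIUwFMLIt94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# The five keys every record is expected to carry (assumed from the sample).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw_text):
    """Parse a raw LLM response, keeping only well-formed records."""
    records = json.loads(raw_text)
    valid = []
    for rec in records:
        # Drop anything that is not a dict with all five coding fields.
        if isinstance(rec, dict) and REQUIRED_KEYS <= rec.keys():
            valid.append(rec)
    return valid

codes = parse_codes(raw)
# Index by comment ID for the "look up by comment ID" workflow.
by_id = {c["id"]: c for c in codes}
print(len(codes))  # 2
```

Indexing by `id` mirrors the dashboard's lookup view: a single comment's coded dimensions can then be fetched in one dictionary access.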