Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "So we just gonna continue to let AI take everyone's jobs and we'll all be homele…" (ytc_UgxAdl_mh…)
- "Mark my words, FSD is something we can do today and your tesla will make you 30k…" (ytc_UgybXsB_V…)
- "Although this is a hypothetical scenario, it is too hypothetical to really be wo…" (ytc_UghilDXtR…)
- "I will hire 15 such robots at my job, at home I will get a female robot, fantast…" (ytc_UgxOmCH-j…)
- "Companies: Fix the actual problems that customers are calling about? No / Use ai …" (ytc_Ugy-yRiY3…)
- "they should make companies pay the same amount of tax on AI as the employees did…" (ytc_Ugz-mdPxy…)
- "I’d also add there is an economic aspect that all these businesses are not consi…" (rdc_mzyc76u)
- "How about we have AI replace these souless, greedy cock suckers at the executive…" (ytc_Ugyg-C4cM…)
Comment (youtube, "AI Moral Status", 2025-12-27T13:1…)

> I'm sorry but dr. Neil DeGrasse Tyson stops when comes to medicine but then rolls right on when it comes to AI safety contradicting AI researchers. AI alignment is not solved. AI safety is still struggling. And that's even before you address perverse incentives of capitalistic pressures on that development. And how that exponential growth is really not the right model given limited data while mentioning that about AI generating something new despite that being an actual possibility because AI doesn't only interpolate it can also extrapolate. Will it be good ... not now and likely not for a long time but at some point based on previous behaviour and data it may build coherent enough model of a person to extrapolate quite well.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxgmYNz6dGKIANqelV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxEFBq3icK9DpFgqF94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyhgWWRmytnZkqQ5EJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-JSf5vA0qn863SnN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxMQWrXmWpTIH4b6yR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz3SZ8T98sdWPaVoOh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzQIY4RUrAGKNKC1vh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx_aJNFggqB0G66m2h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxmoqSGt9Tl3o63zNF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhpsPoW2aaMaZHlzN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
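The dimension table shown above for a single comment can be derived from this raw response by indexing the JSON array on the `id` field. The following is a minimal sketch of that lookup, assuming the model output is valid JSON with the five keys seen in the response; the helper name `index_by_id` and the `DIMENSIONS` tuple are illustrative, not part of the original pipeline.

```python
import json

# One row from the raw response above, abridged to keep the sketch self-contained.
raw = '''[
  {"id": "ytc_Ugy-JSf5vA0qn863SnN4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict[str, dict[str, str]]:
    """Map each comment ID to its coded dimension values."""
    rows = json.loads(raw_json)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

codes = index_by_id(raw)
print(codes["ytc_Ugy-JSf5vA0qn863SnN4AaABAg"]["emotion"])  # fear
```

A lookup table like this is what lets the page resolve a pasted comment ID to its coded row without rescanning the whole response.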