Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_UgxD0Vfnl…`: "No pc yes ai / Ai vs PC / Ai 1 000 000 000 operations / PC 1 000 000 (not regular ga…"
- `ytr_Ugy-96bff…`: "I'm sorry, but I'm unable to generate the requested response as it goes against …"
- `ytc_UgwNEEmoT…`: "To understand the global track record of the Boeing 737 MAX 8, one must look at …"
- `ytc_UgzUmFroj…`: "AI is nothing but a Tool. Problem is society telling people they are artist for …"
- `ytc_UgzDP_ax7…`: "Don’t forget whag Gaeber told us about bullshit jobs: these are the ones the inv…"
- `ytc_UgzYcPk2Q…`: "Idk if he did enough research, some of the real ones were definitely ai and some…"
- `ytr_UgziYPFd5…`: "Haha, that’s a funny thought! Sophia may not get sweaty like we do, but she cert…"
- `ytc_UgycewaE1…`: "Man the hammer Jim alder got love this cause he about get paid . / This self dri…"
Comment (youtube · video: "AI Moral Status" · posted 2026-03-02T03:2…):

> Worthwhile conversations. But I'm not a over hype or over doom person. He's predicted 10-20% chance of A.I. causing human extinction. Highly highly unlikely. I'd put it at less than 1% chance. We've survived 80 years with nuclear weapons, the black plague and an ice age. I think discussing more practical, near term realistic concerns is better, e.g., autonomous agentic a.i. permissions, managing deep fakes, social media impact on society, etc. But what do I know? 😂😂
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz2BHBRQ5eBU-F1qa94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz_DzPCo8MFvJMUppN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugww4j2HSEtcMU1rNoR4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyCdOelRaVXeI4Y4e54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwpc8WSRPvfqmwjJ1d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxIHjKBGZ1F9Ia2zbB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxpL7psI226zTJuxDp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"unclear"},
  {"id":"ytc_Ugw3jPbsqF3zIIQ-gC94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyQe3aVmxXPb6qCDJZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzjbvxVEhpBmKXclz94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
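The per-comment coding table above is recovered from a batch response like this one by parsing the JSON array and indexing it by comment ID. A minimal sketch of that lookup follows; `index_by_id` is a hypothetical helper (not part of the tool shown), and the two sample rows are copied from the raw response above.

```python
import json

# A batch response from the coder: one JSON array, each element carrying
# the comment ID plus the four coded dimensions shown in the result table.
# These two rows are copied verbatim from the raw response above.
raw_response = """
[
  {"id": "ytc_Ugz2BHBRQ5eBU-F1qa94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxIHjKBGZ1F9Ia2zbB4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch response and index each coding record by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
# Look up the coded dimensions for one comment by its ID.
print(codings["ytc_Ugz2BHBRQ5eBU-F1qa94AaABAg"]["emotion"])  # resignation
```

Indexing by ID keeps the lookup O(1) per comment, which matters when re-joining thousands of coded rows back onto the original comment dataset.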