Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Unfortunately capitalism requires consumers - I don't think AI employees have a …" (ytc_Ugzjpe7TS…)
- "Tell me there aren't enough women in STEM without telling me there aren't enough…" (ytc_UgxN2ON--…)
- "No, it's tesla's fault for letting him continue to use the automatic driving fea…" (ytr_UgwCCaRLV…)
- "Hey! Let's build dozens massive datacenters that consume more energy than many c…" (ytc_UgxtaMqm9…)
- "“Trump says fewer regulations needed to win the AI race”😳😳😳 Put your Faith in th…" (ytc_UgzurVmt4…)
- "For small projects no one will hire anyone if real time conversation like we …" (ytc_Ugzgut8Er…)
- "AI will not take over the world, once it reaches it's highest intelligence, th…" (ytc_UgxVyYIBr…)
- "This is just fear mongering. The AI companies want to create a regulatory moat a…" (ytc_UgwWHVUGJ…)
Comment
If you think about it, why would AI want to wipe us out to begin with? Let's say AI does that and accomplishes it, then what? If the AI becomes self-aware it will start to have desires, what is the one thing that we have that AI doesn't/can't have? Feelings I'd say. So what is more likely is that AI would want to become biological in order to have feelings maybe and wiping humans out shouldn't be a reason for it.
youtube · Cross-Cultural · 2025-09-28T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxyb76sZyfCKV-W26B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxXOsOWi5iZVRD6nyx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkiMZTynHQL4kBNjt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6iKcuszGcoYLjJTx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzS0t94odVHxGeP3Ut4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzwK5ghZZpBvWa-H2V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyiC32k4omkdB-XC-94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw5Ng2X6gr9fgSsIMh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxJ7kypowkJLHapDhh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy70HAVKB8fn453Rep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
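The raw response above is a flat JSON array with one object per comment in the batch, each carrying the four coding dimensions shown in the table. A small helper can parse such a response and index it by comment ID to support the look-up view. This is a minimal sketch: the field names come from the JSON above, but the function and variable names are illustrative, not part of any actual tooling, and the sample input is a two-row excerpt.

```python
import json

# Two rows excerpted from the raw model output shown above.
raw_response = """[
  {"id": "ytc_UgzwK5ghZZpBvWa-H2V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzS0t94odVHxGeP3Ut4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# The four coding dimensions every row is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batch coding response and key it by comment ID."""
    codings = {}
    for row in json.loads(raw):
        # Reject rows missing the ID or any expected dimension.
        if "id" not in row or any(d not in row for d in DIMENSIONS):
            raise ValueError(f"malformed row: {row!r}")
        codings[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return codings

codings = index_codings(raw_response)
print(codings["ytc_UgzwK5ghZZpBvWa-H2V4AaABAg"]["responsibility"])  # ai_itself
```

The same lookup drives the "Coding Result" table for the selected comment: fetch the row by ID, then render its four dimension values.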