Raw LLM Responses
Inspect the exact model output for any coded comment, or look it up directly by comment ID.
Random samples — click to inspect
omg the amount of times I look something up on google and the ai is so completel…
ytc_Ugx29VNSL…
lil bro artists cry about ai so much more than ai enjoyers ever do lmfao 😭🙏 your…
ytr_Ugwra3Mmk…
I tried to have it give me a pun for a birthday card involving plant diseases fo…
rdc_jhed6jf
I don’t understand people who work on AI I genuinely don’t see why you would mak…
ytc_UgyD945JD…
@sorayaimperial I think what you say shows that rear light area is an issue too.…
ytr_UgxNDzcNR…
Yall find a robot gonna scrub toilets an make beds 😂let me know I'll hire one…
ytc_UgxbwU1hz…
But you are jealous. Writing a ton about and making videos of other people using…
ytr_Ugyb9NQHo…
That's a great aspiration! Becoming a scientist involves dedication, hard work, …
ytr_UgzSJX85V…
Comment
I'm on board with the idea that a super ai's preferences are likely to not be predicted by us. But while I think there's a reasonable chance that those preferences then cause it to do terrible things (not necessarily world ending), but I no way see that as a gaurantee. The preferences could also be relatively harmless. With chatbots we've been talking about unintentional behaviors like hallucinating and flattery that are cause problems. But we've also witnessed preferences that really don't seem to matter, like preferences for certain words.
youtube
AI Moral Status
2025-11-05T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgztBbiG85c-1g2F0xB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgylSPOKDURX-zUOnoB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},{"id":"ytc_UgzdTSPqGcLA4aaEliZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgzJ55MxLLBgB0gLFAV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgzxPXBbQ5FhGPkc1UN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_Ugz3AoAH-hITwsj4_bJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgwxQYCSYHAlHvq3wUt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgwYv7samR-oMnjN3Ll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgxVQVka6UbHPNDRRd14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgyuVlEGR0Zaus7p0wF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]