Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "I think the blanket shaming/hate towards A.i makes it harder to push for ethical…" (ytc_UgxuBbRXk…)
- "Bernie you're done brother. Ai is going to be a top down reconfiguration. The hi…" (ytc_UgwAsR6s8…)
- "Explore Economics with Artificial Intelligence: 1. Ask an AI when PhD economist…" (ytc_UgzqY99AA…)
- "Hey he is right cause I am a dev and I use godot and if u try to just copy and p…" (ytc_UgxRYpkAu…)
- "It is not an autopilot but a drivers aid... but Tesla for some reason is allowed…" (ytc_UgyujZLiK…)
- "Excellent video. Im glad you're putting a spotlight on Stability AI and LAION: t…" (ytc_Ugz7XzvS8…)
- "Employ an AI legal team. Or get real and realise that even people do this any…" (ytc_UgxozpvB2…)
- "I absolutely find nothing good in Ai.. its of no use.. a human creation is eccen…" (ytc_UgweDWw3b…)
Comment
The super AI might destroy us, or it might have the same interest in controlling us/destroying us as we would a random colony of ants.
We could destroy it, we could investigate ant politics, and figure out everything about ants....or we could spend our energy doing 1 billion other things.
The super AI may have bigger fish to fry than worrying about why some barely conscious organic life forms hate each other because of a protein expressed on the exterior of their skin, making some of them different "colors".
It could not only find the idea of "color" irrelevant in general, but the entire sociology and culture we have created over centuries. It could completely and unequivocally not care about us at all the same way we don't care about the nitrogen molecules we breathe.
Source: youtube · AI Moral Status · 2025-11-01T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxpz7mBcwu2pU7krIB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwskrXu9Gvv0qJUaKt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwvyL5MNpNoJn58MNR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx9zcGzFMuHbitH_KR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugw7TnR-xOvM4ryZ4514AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxpYHXahpQ6MMLDHxB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxIOD3BIRkx6ODHoSd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxi7AFVI3Mslat1Z954AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyuwUqi288wQwjicY94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx520150TNoWIH6Wqh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
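The "look up by comment ID" workflow above can be sketched in a few lines of Python: parse the raw LLM response (a JSON array of per-comment codings) and index it by `id`. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the coding scheme shown in this page; the two rows of sample data below are copied from the raw response and are illustrative only, not the full batch.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, abbreviated to
# two rows from the full response shown above.
raw_response = '''
[
  {"id": "ytc_Ugxpz7mBcwu2pU7krIB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwvyL5MNpNoJn58MNR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
'''

codings = json.loads(raw_response)

# Index by comment ID so any coded comment can be inspected directly.
by_id = {row["id"]: row for row in codings}

row = by_id["ytc_UgwvyL5MNpNoJn58MNR4AaABAg"]
print(row["responsibility"], row["emotion"])  # prints: ai_itself indifference
```

Building the dict once makes each subsequent ID lookup O(1), which matters when the response covers thousands of coded comments.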