Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.
Random samples
- "The ppl that think they will gain Power or keep power are stupidly thinking that…" (ytc_UgyDKP8je…)
- "Honestly Ai is weird... Personally I consider myself a 3d Artist, Indie Dev and …" (ytc_UgwOpuibe…)
- "Warnings aren’t predictions of doom—they’re signals to prepare. AI won’t replace…" (ytc_UgyjMjvy5…)
- "I feel like AI artists dont count as artists. First off: you didnt do anything t…" (ytc_Ugzz7xa7t…)
- "Difference between AI and other tech innovations is that others were just tools …" (ytc_Ugw884Q6r…)
- "The thing that alarmed me the most, is whether there was intentionally hidden me…" (ytc_UgwA3bROB…)
- "Thank you for this info! I want to do masters in AI ethics after my bachelors in…" (ytc_UgxczKlH8…)
- "General intelligence, like humans (though probably soon far superior through sel…" (ytr_Ugy4E2Y3N…)
Comment
I think the Fermi Paradox and AI are closely linked.
AI might be the biggest wall that need to be broken for a civilization to reach other Star systems and expand.
But AI itself doesnt need. Creators need it. AI needs energy source. Thats why I think when civilizations fail to control it (because they seek work automation) the AI simply builds a Dyson Sphere around their closest Star, locking that infinity energy source for itself and making it vanish from the night sky
youtube · AI Moral Status · 2026-01-18T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgymWwHngHj1eSjKt1d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZ0z_3uoi70aGCrEt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwse_L3mOImsePev7x4AaABAg","responsibility":"company","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyXfj3SUEFISZE0rTJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzIstEO2jbRfSR1sYN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwmGa9A9yM4BYVFTnd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwaS5QJb2NvZSLFKwx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxoW_CZTjltSBTTjSF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy53B3WvgywWVmnkKd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwFJGqCWLeRALnRNGd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
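As a minimal sketch of how a raw response like the one above can be consumed (assuming the model always returns a JSON array with these four dimension fields; the allowed values below are inferred from the samples shown here, and the full codebook may define more categories), the codings can be parsed, validated, and indexed by comment ID:

```python
import json

# Allowed values per coding dimension (inferred from the responses above;
# this is an illustrative subset, not necessarily the full codebook).
SCHEMA = {
    "responsibility": {"ai_itself", "company", "developer", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "industry_self", "regulate", "liability"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "mixed", "unclear"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    raising on any value outside the known schema."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return by_id

# One record taken from the raw response above:
raw = ('[{"id":"ytc_Ugy53B3WvgywWVmnkKd4AaABAg",'
       '"responsibility":"user","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"mixed"}]')
coded = index_codings(raw)
print(coded["ytc_Ugy53B3WvgywWVmnkKd4AaABAg"]["emotion"])  # mixed
```

Validating against an explicit value set catches the occasional off-schema label an LLM coder can emit before it silently contaminates downstream counts.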