Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- “Potentially manipulates public opinion in a way that is very bad.” Potentially,… (ytc_Ugy8CEOKb…)
- Well someone still need to program AI … AI cannot write its own program and prog… (ytc_UgzJcEzeY…)
- Singularity reminds me of Vicky in I Robot. So if people won't be working, where… (ytc_UgwjWbLw6…)
- Humanity is guaranteed to self-destruct eventually anyway. Probably sooner rathe… (ytc_UgzRhrkDi…)
- I Personally think that using AI images in general isn't bad, like me and my bud… (ytc_Ugy7XLD2K…)
- AI is a tool for scammers. Periot! I don't give ANYONE using genAI even the sli… (ytc_Ugz4Fn1Wn…)
- A prompt where you type in......sounds familiar? Would have thought Clipper 87 w… (ytc_Ugzy-a2Fy…)
- Timestamps (Powered by Merlin AI) 00:03 - Transform from ChatGPT beginner to exp… (ytc_UgwJA-fnz…)
Comment
I don't think anyone is stopping to think that we're placing human traits on AI. Traits like greed, fear, control, dominance, ego. These are traits we developed to survive and are slowly evolving past. Super intelligent beings, whether carbon based or silicon based, would have no need for those traits. I think super intelligent AI is coming whether we like it or not. Lets not be pessimistic about it.
youtube · AI Governance · 2025-06-16T14:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgypYWZZ9kb12gDFn754AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyc0n4OUNAsyyb9S8d4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyeT5Z5aZz_vEoDHdR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwyd6KqjT4JCiaN-Md4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwtwr63qsE5Bcx5XcN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyUuRsWZXIX479vExF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxsN73NdV5NEDuYVmZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXZvX28nLmjP4B0tt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJWRTmdZTS7lB4D_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyx9bB3vAE9EEKlk1F4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
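A batch response like the one above can be parsed and indexed so that any coded comment is retrievable by its ID. The sketch below is illustrative only: the four dimension names and the skip-malformed-records behavior are inferred from the examples on this page, not from any official schema, and the two records in `RAW` are copied from the response above.

```python
import json

# Two records copied from the raw LLM response above, used as sample input.
RAW = '''[
 {"id":"ytc_UgxJWRTmdZTS7lB4D_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugyx9bB3vAE9EEKlk1F4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]'''

# Dimension names as they appear in the coding-result table on this page.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Records missing an ID or any expected dimension are skipped, since
    model output is not guaranteed to be well-formed.
    """
    by_id = {}
    for record in json.loads(raw):
        if "id" not in record or not all(d in record for d in DIMENSIONS):
            continue
        by_id[record["id"]] = {d: record[d] for d in DIMENSIONS}
    return by_id

codings = index_codings(RAW)
print(codings["ytc_UgxJWRTmdZTS7lB4D_p4AaABAg"]["policy"])  # ban
```

Indexing by ID is what makes the page's "inspect the exact model output for any coded comment" lookup a constant-time dictionary access rather than a scan of the batch.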