Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Beautiful video and well pack information, am data management working in a pharm…” (ytc_Ugy82la5u…)
- “So wat if the robot doesn't give back the gun was next? Robot: shoot the white m…” (ytc_UgwyyyR3w…)
- “Fear cannot be a motive to do anything. You can’t cover up truth. They are not w…” (ytc_Ugw8AU16P…)
- “chatGPT can be very helpful as long as you VERIFY the responses & don't just cut…” (ytc_UgwnuakR8…)
- “I mean, 95% of the so called "artists" have their drawings, musics and writings …” (ytc_Ugx9Z4BYe…)
- “Those ppl are seriously sick. What does that even mean? It's not even an argumen…” (ytr_UgyoouaSV…)
- “Oh so that's why white Vision exists in Wandavision he just updated to the lates…” (ytc_Ugy57SbWU…)
- “I had a dream that world nukes where armed to be set of in a world war and it wa…” (ytc_Ugx_YC9Qq…)
Comment

> One solution is to program the “off” button to AI. The developers must add Asimov’s rules into AI, and YES, teach morals. Another solution is for us to become one with AI. “Can’t beat them, join them”. Elon Musk is right. We need the chip in our brains to become one with AI. And later, we need to discard our physical frail bodies to become fully automated to outcompete the AI.

youtube · AI Governance · 2023-07-16T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugw6NK1pVwpidMQ9AYV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxSHRV0ZI9Ltz9Ut_l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzHfh4qlFNIhfJGt-F4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx_widqMwElfjBLYQR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxHQTXQVIGftljdzVt4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgztQ9b0uGJqnSjYh3p4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw60-JN15u8zTcBBKd4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyO-kJxtmMCQlgO4KV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxD4kdihBbmhf8SBxp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxqwFyasHfcx8RH_uB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```