Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a response by comment ID, or pick one of the random samples below.
Random samples
- "So we will all have to work for AI companies in the future. Generative AI will n…" (ytc_UgxfS8h5J…)
- "be glad its not a silicon valley company, I got 6 weeks when my startup was acuh…" (rdc_nlxu0uk)
- "Halfway through How the Elite Print Their Wealth, I realized something: this was…" (ytc_UgxSPVRRs…)
- "I spent months down rabbit holes trying to figure out what they're not telling u…" (ytc_UgyCf2cdO…)
- "EVERY AI is controlled by the same "ethno-religious" group that suffers from sch…" (ytc_UgxOC_K05…)
- "I support AI art being copyright free, or rather, impossible to use for commerci…" (ytc_UgyaOPOMM…)
- "chatGPT uses terms like exciting because it's relevant to humans. so if it says …" (ytc_Ugw5A-57Q…)
- "You can probably start world War 3 by drawing literally every ai bro pregnant ☠️…" (ytc_UgyauXae_…)
Comment
> Why not instill a prime directive - like the hypocratic oath - first, do no harm at the very core of AGI? I think eventually AI might suggest we jail its creators permanently. Maybe they don't want that ugly little fact out there in the world.
>
> Loved Wall-E. I think it's a visionary classic. Could well be where we're headed. Fat, stupid and glued to our screens.

Source: youtube · AI Governance · 2025-12-04T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
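A coding result like the one above can be checked against the label sets that appear in this project. A minimal sketch, assuming the allowed values are exactly those visible on this page (they are inferred from the samples, not an authoritative codebook):

```python
# Hypothetical validator for one coded comment. The label sets below are
# inferred from values seen on this page, not from a formal schema.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "fear"},
}

def validate_coding(row: dict) -> list[str]:
    """Return a list of problems; an empty list means the row is well-formed."""
    problems = []
    if "id" not in row:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

row = {"id": "ytc_UgzEDRp7FOgBt3z16I54AaABAg",
       "responsibility": "developer", "reasoning": "deontological",
       "policy": "liability", "emotion": "fear"}
print(validate_coding(row))  # → []
```

Running this over a whole batch surfaces rows where the model drifted outside the expected labels before they reach the inspection view.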
Raw LLM Response
```json
[
  {"id":"ytc_UgxmPNSbOP3AtaMr0FZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzQxDIWK44KJeHDL0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuCA-bcovEc7SOvtN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyayqKGGemRU9RS2PJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxJJYCzVhJVZ5VuT8Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzZDfcUyGIJL9JbqHF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz2aNc4lmSFSvKfVJd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEDRp7FOgBt3z16I54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugybyi6TT435y7SbBQ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz3jTUQ7lPoDbeMft54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```