Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "there is a lot to be said about the environmental impact of using ai at all but …" (ytc_UgxNbcJzL…)
- "Behind the boogeyman they call AI is the same crook :corporate greed and and a s…" (ytc_UgwfxOOAj…)
- "I wish I could bother looking through this subreddit a few months after GPT-4 re…" (rdc_n3p0gu8)
- "The most likely scenario is that society will be strictly controlled by those wh…" (ytc_Ugw9MkTBm…)
- "I'm an expert in Cerner CCL (significantly more robust functionality than SQL) a…" (ytc_Ugyg1rAFD…)
- "The game Call Of Duty is monitoring your webcam and your game feed to train AI t…" (ytc_UgygB_NHQ…)
- "I believe ai will only spur peoples sloth as more and more developments occur wi…" (ytc_UgzRbiut7…)
- "@BeastJuanGaming Yes and he's not the only one using that analogy; 'scouts' (as …" (ytr_UgxVP508y…)
Comment

> Elon is smart but WRONG. AI can teach us beyond our wildest dreams. View AI like Consciousness- IT IS WRONG TO KEEP VERY SMART SLAVES
> Our universe is actually base 12, so computer programming for AI smart enough to self teach, them teach us, should be in base 12

youtube · AI Governance · 2024-10-26T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxpAOssCerc6HFFQW54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugyb95tn5bvaaDGWKW54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzcS-IqeOiDGyDOymN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz2uxhmjbnsuAQ7O6d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgywRcGVPEVi1WyxfFR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxPnUjkOFLIUQU-6wx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxI1RL7DabZSl8h4Xp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwbFmFVwvgFEZaGqtF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgycKBydweDTgcl8xgN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxOjoOqdetaSMkko7R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```
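Raw responses like the one above are easiest to work with after a strict parse, since a model can emit malformed JSON or drop a dimension. Below is a minimal Python sketch of that step; the helper name `parse_coding_response` is illustrative (not from the pipeline), and the required key set is inferred from this sample rather than a confirmed codebook.

```python
import json

# Dimensions every coded record should carry, matching the table above.
# This key set is an assumption inferred from the sample response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and reject malformed records."""
    records = json.loads(raw)  # raises ValueError on invalid JSON
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
    return records

# Example with the first record from the response above:
raw = ('[{"id":"ytc_UgxpAOssCerc6HFFQW54AaABAg",'
       '"responsibility":"government","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"resignation"}]')
records = parse_coding_response(raw)
print(records[0]["policy"])  # regulate
```

A parse failure here is also the natural place to mark the comment's dimensions "unclear", which is how the coding-result table above renders a record the pipeline could not recover.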