Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgweFOoQE…`: "Yeah the main problem is ai is not considered human so the copyright laws are no…"
- `ytc_UgwZehG4P…`: "And indeed AI is racist as shown by Gemini, because the idiots programming it ar…"
- `ytc_Ugyp-yocn…`: "There needs to be laws restricting how much AI assistant bs companies are allowe…"
- `ytc_UgwEgodXj…`: "To be honest, the reason why I'm not necessarily 'with' AI but don't hope it sto…"
- `ytr_UgywXfkvB…`: "so your faith has taught you that only the afterlife is worth living, that peopl…"
- `ytr_UgxvCJFtp…`: "Demonically possessed Artificial intelligence, yes demons attach themselves to p…"
- `ytc_UgyJ__bm1…`: "I can picture the creator sitting at his computer when he was designing her...B-…"
- `ytc_Ugw8dM_lX…`: "I know a serious tone is not called for on this channel, but as a portrait artis…"
Comment
We are so smart and yet incrediable dumb, and will probably destroy ourselves. Study Drakes Equation, just because we are intelligent, or because we are so flawed in moral while having the weapons to destroy our planet a 100 times over, we are in grat danger from put ourselves torch extinction! We are Chimps with Space Ships and Nuclear weapons. Good combo! I would understand why AI only can come to 1 logic conclusion when it comes to humans!
youtube · AI Governance · 2025-01-04T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxkKlqo6OcPf-IwpAZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzXtE0bC1xnqHPRSvF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugwrb8gAG6wMI7STSHp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxBQTpM7mrTOJTzvRJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugwml5891izAJSu1MMx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwrgfhsVaU1Ff5FwwB4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzhahkBCl6TeGSQ41F4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy0zHdceos8smts2mR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwJU-jS2mWEdHi2AWt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz3UskuZroGT_Mqwnt4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"}
]
```