Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by its comment ID.
Random samples:

- `ytc_Ugx9qdDNc…`: "Personally when it comes to extremely specific sources then I think AI is only i…"
- `ytr_UgwGc-PAn…`: "Because the art was fed into the algorithm without permission. The lack of conse…"
- `ytr_UgwuwxkU1…`: "And where do you even think these pictures on google and safari come from? Googl…"
- `ytc_UgyAX3Ve9…`: "Ai can't replace all jobs. Ai still needs humans to perform jobs that are not im…"
- `ytr_UgxBaA9u-…`: "Because the ai is a generated image using the rendering of a skilled painter. Th…"
- `rdc_o8cru3x`: "Ya know, I installed my company's security approved version of OpenClaw last wee…"
- `ytc_Ugz1ErMQM…`: "Bluster and posturing of an industry with no future. I would be shocked if we wo…"
- `ytc_UgztjAECa…`: "i have absolutely nothing to do with drawing, art or crafting, but got random re…"
Comment

> So by Harari's reasoning an AI told not to do anything stupid by humans would observe whether humans do stupid things and if it judged they did, would copy them and do stupid things. Its first thought might be how stupid it appears that intelligent life creates machines capable of destroying itself. Then it might copy that example too...

Source: youtube | AI Governance | 2025-07-22T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
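Each coded comment receives exactly one label on each of the four dimensions above. A minimal validator can be sketched in Python, assuming the value sets are limited to those visible on this page (an assumption; the full codebook may define more categories):

```python
# Allowed values per coding dimension, as observed in this dump.
# Assumption: the real codebook may permit additional categories.
SCHEMA = {
    "responsibility": {"none", "unclear", "distributed", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"approval", "fear", "mixed", "outrage", "indifference", "resignation"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with one coded record; an empty list means valid."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} value: {value!r}")
    return problems
```

Running the coding result above through the validator returns an empty list; a record with an unrecognized label is flagged by dimension.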
Raw LLM Response
```json
[
{"id":"ytc_UgwBmIud28l_qtSRqT94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJ-0xOEbWLoLhSZFV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzL9HzXDKb9ha0i6Rl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgztrpcDLVlnizzWtht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7vzxbFJBL1M3BlBl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx6M_WEJeFM6BYGLfh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxdfryuQSOLLmxPnfh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgznzNf6vSFBqlh3F4R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFPPFg-dMfhv7Z8cd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwoE1RjmODEQJwnjZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
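The raw response is a JSON array with one record per comment, so the by-ID lookup described at the top of this page can be sketched by parsing the array and indexing it on the `id` field (field names are taken from the response above; the wrapper function name is hypothetical):

```python
import json

def index_codings(raw_response: str) -> dict[str, dict]:
    """Parse a raw LLM batch response and index each coding record by comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# One record from the raw response above, as a standalone example.
raw = '''[
 {"id": "ytc_UgztrpcDLVlnizzWtht4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

by_id = index_codings(raw)
coding = by_id["ytc_UgztrpcDLVlnizzWtht4AaABAg"]
print(coding["emotion"])  # → fear
```

The same index supports random sampling (e.g. `random.sample(list(by_id), k=8)`) for the sample list shown earlier.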