Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Idk, it will happen but last study currently showed that mid level programmers t…
ytc_UgwChwalv…
The more important question is if we provide code for consciousness which belief…
ytc_Ugx1WN2Ut…
I once had a chat ai spit racial slurs at me out of nowhere. I’m pretty sure the…
ytc_UgzvBkFk-…
A.I. created and programmed by idiots. Hence a.i. artificial idiots, No! Intelli…
ytc_UgzUnYUa0…
@markupton1417big difference here the jobs won’t come back. Everyone predicting …
ytr_Ugzo9hCKL…
I was motion artist, usually I fetch adobe stocks vectors images, psds, usually …
ytc_Ugwf5NCTK…
@ed9121You have zero expertise in traffic safety, because if you did you wouldn…
ytr_UgxPNzGFu…
I work in this field, and I can say with confidence that anyone that is touting …
rdc_ktt0gpa
Comment
It's delusional to think that we can control something that's orders of magnitudes smarter. Just ask yourself how much control a fruit fly has over you. It will be the same with superintelligent AI. The entire AI "race" only exists in the heads of CEOs and politicians. You cannot control the world with AI because you cannot control AI at that level. Peak human hubris. We MUST demand a CERN for AI and we MUST protest to FORCE politicians to make it illegal for large AI companies to work on their own. And that CERN for AI MUST be an international collaboration, obviously including China. That's the solution, it's not that complicated!
youtube
AI Governance
2025-12-04T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy4vmTG-8iDRggmGUx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy12WJElNBpRtx6K4V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxYUpsj5jdLHHd-Zct4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxNWIOitL5NssHegwZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxYye2iDkiH6LRG0Wh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxaNXwJPuk4BpY71VF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwK8rWcVdrfE3q0f694AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxRS2GFkVX21qEHEC94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwDt7TRBj84NwsUMqV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwvznqkTnsExAcrqvp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
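The raw response above is a JSON array of per-comment records, each carrying an `id` and the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and looked up by comment ID follows; the allowed dimension values are inferred from the samples shown here and are an assumption, not the project's actual codebook, and `validate_coding` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# ASSUMPTION: the real codebook may define more (or different) categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
}
REQUIRED_KEYS = ("id", "responsibility", "reasoning", "policy", "emotion")

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM response and index well-formed records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        if not all(k in rec for k in REQUIRED_KEYS):
            continue  # skip rows missing a dimension
        if any(rec[dim] not in vals for dim, vals in ALLOWED.items()):
            continue  # skip rows with out-of-codebook values
        coded[rec["id"]] = rec
    return coded

# Example lookup, mirroring the coded comment shown above
raw = ('[{"id":"ytc_UgxRS2GFkVX21qEHEC94AaABAg",'
       '"responsibility":"ai_itself","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = validate_coding(raw)
print(coded["ytc_UgxRS2GFkVX21qEHEC94AaABAg"]["policy"])  # regulate
```

Indexing by `id` matches the "Look up by comment ID" workflow: once validated, a record for any `ytc_…`/`ytr_…`/`rdc_…` identifier can be fetched in constant time.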