Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Comparing makeup to ai generated and reality. The photo looks realistic and men…" (`ytr_Ugwk7MZFi…`)
- "By AI you mean human programming. Humans are scary when they programme a compute…" (`ytc_Ugz6FVk4v…`)
- "ChatGPT is a chatbot, not a supercomputer. If you try to use it for something l…" (`ytc_UgzzB9WPb…`)
- "' I'm just so confused how she talks about art. Think whatever you want about a…" (`ytc_UgxAba_md…`)
- "If you can’t beat them, join them. Invest in the companies that are building out…" (`ytc_UgztK6hPL…`)
- "I would observed the AI along with my conversations have probably become more mo…" (`ytc_Ugw7UhejY…`)
- "I would reccomend not using ai as reference. Ai is not a photograph and will hav…" (`ytr_UgwIa_A-7…`)
- "A couple of hours ago, I tried to explain the threat of AI to my nephew. He kep…" (`ytc_UgyKF0qYd…`)
Comment
> A nuke CAN be used to end humanity, AI WILL be used to end it. It's only a matter of time, because unlike a nuke that requires someone to push the button, 1 AI ran by anyone, anywhere in the world, for any reason, will trigger the irreversible singularity. As soon as AI passes human intelligence, it becomes Russian roulette with every request.

youtube · AI Governance · 2026-03-18T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwg1IREA1Yk8wYD_b94AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzsKmMJL9bT7DUyB8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyfKTCIb1co2g1Glwt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx4VUYQrncQCgnQuDx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyWEt5vd1Nqmy_lbK54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwvQtW_pFMYcQLGKk14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugwn33U0iDSszytjN6J4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2zD7BCpSzRYZzInB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgywuKKpI_OOMyUS8v14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzdKjQ5deBJfbd-gv54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
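A raw response like the one above can be parsed into a lookup table keyed by comment ID, with each record checked against the coding scheme. This is a minimal sketch: the allowed label sets below are inferred from the values visible in this export, not taken from the authoritative codebook, and the `parse_codings` helper is illustrative rather than part of the tool.

```python
import json

# Label sets inferred from the values seen in this export (an assumption,
# not the authoritative codebook).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "user",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none",
               "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation",
                "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into a
    dict keyed by comment ID, rejecting any record with an off-scheme label."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} label {rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

raw = ('[{"id":"ytc_UgzsKmMJL9bT7DUyB8l4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UgzsKmMJL9bT7DUyB8l4AaABAg"]["policy"])  # → ban
```

Validating at parse time catches the common failure mode where the model invents a label outside the scheme; such records should be flagged for re-coding rather than silently stored.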