Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews):
- "Accept it, we have born the child. We teach it. We can't abandon it. Give human …" (ytc_UgzB4y-tJ…)
- "@DroneZ-oN Theoretically, needing humans is a very temporary issue for an ASI. A…" (ytr_UgxXIXg68…)
- "AI takes every fear and threat to mankind/the world and makes them all a real po…" (ytc_UgxIKY7Qg…)
- "@AAAngellStudiosOfficialCHANNEL The irony in your response is golden, Because i…" (ytr_UgyHRWvUb…)
- "Wellp next time he wents missing, im sure someone will be professional torturer …" (ytc_UgwjRnl5u…)
- "They struggled with a word to show caring for the AI.. i think prioritizing woul…" (ytc_Ugwo0YRFt…)
- "Lovely bit of fiction, 'AI' is a tool with very limited controls and is harmfu…" (ytc_UgxsCVjOV…)
- "There's a Theoden related quote about being the \"lesser son of greater sires.\" I…" (ytc_Ugz5WSvb7…)
Comment
I think our only hope is to get AI to understand the worth of equality and diversity of mindsets, and want to cultivate a healthy ecosystem including the value of chaos which comes from humans every now and than, because it would be the only long-term-self-sustaining environment for a self-sustaining AI to not destroy itself in the long run. ... But therefore we humans have to learn VERY QUICKLY how to show this worth to the AI-development nowadays.
youtube · AI Governance · 2025-06-16T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy1CzWqcnev0mLxP3p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwbtKtrPDMguDEejnt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwI7cJoQZnt4yQY93F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgySkAXopmFFlBCOWkN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzOtdZn6pBljK_lc9h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0Xm8t577D35et72x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwBNhdvBpKDUaWwRO94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxAbb01VhYieSz0OfB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzszusyDnhFGRmotwp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxCWbRwKuUoC6J7xeV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
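The raw response is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of how such a response can be parsed and indexed for lookup (using two records copied from the response above; the dict-indexing approach is illustrative, not necessarily how the tool itself does it):

```python
import json

# Two records from the sample raw LLM response shown above.
raw = """[
  {"id":"ytc_UgzOtdZn6pBljK_lc9h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwI7cJoQZnt4yQY93F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

# Parse the array and index each coding by its comment ID,
# so a coded comment can be looked up in O(1).
codings = {row["id"]: row for row in json.loads(raw)}

record = codings["ytc_UgzOtdZn6pBljK_lc9h4AaABAg"]
print(record["emotion"])  # prints "approval"
```

The same index supports recovering the full dimension set (responsibility, reasoning, policy, emotion) for any comment ID returned in the batch.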