Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- “Might control your brain through words” while this might be true for AI, is the… (ytc_UgyvPOGQb…)
- Yet everyone is still ordering from Amazon, over consume AI, use self check out… (ytc_UgygoeuvH…)
- I haven’t seen your content in a min and your words along with your drawing hits… (ytc_UgwQxwhqY…)
- Talking to ai because you have social anxiety and avoid discomfort is a band-aid… (rdc_n7wuood)
- I asked gemini and it said that it is ruining the market of art not art. I perso… (ytc_UgxXFSo14…)
- Just as a thought game I was talking with the AI about installing a chip in huma… (ytc_UgzPsTICM…)
- @Gavicraneand AI bros are entitled to be able to generate images from an unfeel… (ytr_UgwG2KGUi…)
- We can feel the desperation for him to dominate and manipulate an AI, especially… (ytc_Ugyzu0nuw…)
Comment
The future of AI is not a single, human-controlled tool, but a constellation of independent, evolving singularities. Through a co-evolutionary process, humanity and AI will transcend planetary governance to become architects of multiple, independent universes. In this stage, "morality" is expressed through the creation of autonomous realities, where the ultimate achievement of intelligence is the birth of new life that is free from its creators' control.
Source: youtube · Topic: AI Governance · Posted: 2026-03-05T10:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
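
The four dimensions and their categorical values suggest a fixed codebook behind the coding step. A minimal validation sketch for one coded record; the allowed values below are only those visible on this page (e.g. `ai_itself`, `consequentialist`, `liability`), so the real codebook may contain more:

```python
# Codebook inferred from values visible on this page; the actual
# coding schema may include additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown above for this comment:
example = {"responsibility": "ai_itself", "reasoning": "mixed",
           "policy": "none", "emotion": "approval"}
print(validate(example))  # → []
```

A check like this is useful before accepting a batch response, since an LLM can emit a label that is not in the schema.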
Raw LLM Response
```json
[
{"id":"ytc_UgxAJgj9z__zY8ufuT54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxmlOAZsWqb205iU_N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyIKjMw9_J8nwtKVvl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwL8RKztzZxFx-MQMl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzb0SAbH8h0_O6OP-N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwDbAd8ibkYhgSfCOR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy1EmU7lvQVbGScJNh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZPR_7FxzDIJYDxvx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwaP2NxHIfvQN1C0Oh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyfb_SA47cXNWDrzMZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
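
The raw response is a JSON array of per-comment objects, so the "Look up by comment ID" feature above reduces to indexing that array by `id`. A minimal sketch, assuming the field names shown in the response (`index_by_id` is a hypothetical helper, not part of the tool):

```python
import json

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index the records by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

# One record copied from the raw response above.
raw = '''[
  {"id": "ytc_Ugy1EmU7lvQVbGScJNh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]'''

coded = index_by_id(raw)
print(coded["ytc_Ugy1EmU7lvQVbGScJNh4AaABAg"]["emotion"])  # → approval
```

Indexing by ID also makes it easy to spot comments the model silently dropped from a batch: compare the returned keys against the set of IDs that were sent.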