Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgwsKYqsi…: "I like the thoughts expressed in this video, but I'm not sure it fully addresses…"
- rdc_l4qe3u9: "You have to individually tax the bots controlled by AI. This is not a brand new …"
- ytr_UgxSDPL4h…: "Don't let people discourage you. Do not give up! The guy who mocked you is prob…"
- rdc_dwv659a: "I too prefer AI that hides in bushes and attacks in the middle of the night!…"
- ytr_Ugwu9bRtb…: "That sounds like a fascinating prediction! Sophia's constant learning and growth…"
- ytc_UgxWELtlk…: "I'm a mentally disabled artist, and I genuinely dont understand how my art is so…"
- ytc_UgzsT0izW…: "Not a real problem since here is something else: You see a copyrighted image of …"
- ytc_UgwU8w5zS…: "UBI is a leftist pipe dream of how a post-scarcity society could work. Furlough …"
Comment
In my view, all fears regarding AI are rooted in the 'us versus them' principle. However, if we look at it from a different perspective, we can see AI as our continuation—the continuation of our immortal existence. AI is like a child raised by humanity; it is an imprint of our mental human nature because it was created by us. I see no reason to fear or abandon such a revolutionary path of development.
P.S. We fear death at the hands of AI, yet we fail to realize that this is simply a generational shift. The main challenge for a future AI—one that is more perfect in its 'human code'—will be how to facilitate a painless transition for the outdated human 'model' to this new, revolutionary level. For me, this is a fascinating thought experiment, as the solution to this problem will reveal the true essence of our genuine morality.
youtube · AI Governance · 2026-04-24T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwIsIktH4PM-RM6n_V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyPM0bWbmFV4LOZCBZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwXJ8WtkZzACq8FRJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyh1AZIeV7mRjYSaFl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzpDLisQjoF1tecZq94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgziL3JOtIr_pFgwShF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxDLFUDFj3eGq7X6c54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw4075y5pDVgD7C9wp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzOXEb2xqEaItRo0VR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyf_oplWb9lWWbRNfd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
```
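The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such output could be validated before ingestion, assuming the category vocabularies are exactly the values visible on this page (the real coding scheme may include more categories, and the `validate` helper is hypothetical):

```python
import json

# Assumed vocabularies, inferred only from the values seen on this page;
# the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "industry_self"},
    "emotion": {"outrage", "approval", "resignation", "fear", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded comment against SCHEMA."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"missing comment id in {row!r}")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={value!r}")
    return rows

# A well-formed row passes; an out-of-vocabulary value raises ValueError.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]'
rows = validate(raw)
print(len(rows))  # → 1
```

Rejecting malformed rows at parse time keeps out-of-vocabulary codes from silently entering the dashboard's tallies.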