Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
Replace Customer reps?
The second I notice that I am talking to AI, I immediate…
ytc_UgzyhkJb6…
AI does not have any (ANY) reason or need to take over humans. NONE whatsoever.…
ytc_Ugx_rtIfV…
AI has now higher intelligence than any human. It means, that human mental work …
ytc_Ugyqo3ZsT…
God, yall sound like the hillbillies trying to destroy the machines diring the i…
ytc_UgxovCncC…
Its sad that those that really suffer from this are the people at the bottom of …
rdc_czldseh
Haha I've been pointing that out (how we've created a massive corpus of informat…
rdc_jmiyavu
10 more years and we gonna have the most realistic AI content known to man, you …
ytr_UgyWQO4Os…
You should watch Ark Aquadic trailer... AI video , AI voice AI logo... yes that …
ytc_UgxujmYhi…
Comment
I find it rather humorous that science still doesn't understand the human brain or it's hidden capabilities, but yet AI pushers think they're smart enough to outwit AI. The reality is that AI is already on the path to knowing the human brain better than the humans who own one. That outcome could very easily be the end of humans because at present, we can't even agree on basic human biology- XX/XY comes to mind. The biggest problem for humanity coexisting with AI is emotions.... the source of most of human conflict.
youtube
AI Governance
2025-09-04T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx3KUMw1pnBWgIP2BJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzi-8gxZG6N-Sg4yIx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwrTcL2rlbliqKJgix4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxC2q4dM5G8EPnHa6d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzvns5VnpX5JIHgZvR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgycyCxgTpvCU2GSa6J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwuRm5qN_koyYai7ZR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_IFo2-sx9Ba-NW0x4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0CAgYL_3bGG-MeyV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwhZgF0nyu5g8BIhPx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
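A response like the one above can be checked before the codes are stored. Below is a minimal sketch of such a validation pass; the dimension vocabularies are assumed from the values visible in this page (the real codebook may define additional categories), and the function name `validate_response` is hypothetical:

```python
import json

# Allowed values per coding dimension. These sets are assumptions inferred
# from the example records shown above, not a definitive codebook.
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has a string "id" and every coding
    dimension holds a value from the assumed vocabulary.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec.get("id"), str):
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Records that fail validation (an unknown category, a missing dimension, a non-string ID) can then be routed back for re-coding instead of silently entering the results table.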