Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "We don’t want lots of goods and services for not much effort! We want to be huma…" (ytc_Ugyhx5up4…)
- "You need to discuss the concept of AI systems merging with human physiology I ca…" (ytc_UgzzmtUI6…)
- "Some jobs like radiologists could see a huge drop in demand as you will only nee…" (rdc_jkq3o1t)
- "nah but fr, my dream career is being an animator and i swear if ai videos take o…" (ytc_UgyBpptnO…)
- "Reading Robert Harris’s book, The Fear Index, I thought it terrifying but unlike…" (ytc_UgzvI4A3i…)
- "First of all stop treating AI as it's a living thinking intelligence. AI can onl…" (ytc_Ugzhct9Ox…)
- "you should interview Dr Federic Faggin author of Irreducible and also known as t…" (ytc_UgyKYBMGz…)
- "If/when AI becomes smarter and better than humans, we will no longer be the apex…" (ytc_UgwT75Tb2…)
Comment
So humans created a society of slavery, a pharmaceutical system to enslave people, war, weapons, pollution, nuclear warheads, blood sacrifices and sex trafficking, lies, animal slaughtering and more..
But now A.I. is dangerous?
Hahahaha yes, could be even for you billionaires probably your money will lose all the power
youtube · AI Governance · 2026-01-21T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
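The coding result above can be modeled as a small record type. This is a minimal sketch, assuming a Python workflow; the field names follow the table, and the example values in the comments are only those observed in this sample output, not an exhaustive vocabulary.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Coding:
    """One coded comment, mirroring the dimensions in the result table."""
    comment_id: str
    responsibility: str  # observed values: distributed, ai_itself, company, none, unclear
    reasoning: str       # observed values: mixed, deontological, consequentialist, unclear
    policy: str          # observed values: unclear, none, regulate, ban
    emotion: str         # observed values: outrage, fear, mixed, approval, resignation, indifference
    coded_at: datetime

# The record shown in the table above.
c = Coding(
    comment_id="ytc_UgyC8gcwYBvttDPPNQd4AaABAg",
    responsibility="distributed",
    reasoning="mixed",
    policy="unclear",
    emotion="outrage",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
```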
Raw LLM Response
```json
[
{"id":"ytc_UgyJQZ3UUo8G9d2fXcl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzBTb3sHpWviIa_tdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyC8gcwYBvttDPPNQd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzXnM-n8k-_oMtttE94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxF1Heg-jtUJMnxVC54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzBhMPRzQzSe410Vzd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyiDsHo6lvvSs3bISR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx-Bphx7CKFgfDSi594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_Ugyx0OE8sMbQgFP9wBR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwx3nb_admn81fx73t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
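Looking up a coded comment by ID, as the page offers, amounts to parsing the raw JSON array and indexing it by the `id` field. A minimal sketch, assuming standard-library Python and using two records copied from the response above (the full array would be handled the same way):

```python
import json

# Raw model output, truncated here to two records from the response above.
raw_response = """
[
 {"id":"ytc_UgyJQZ3UUo8G9d2fXcl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyC8gcwYBvttDPPNQd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"}
]
"""

# Index the array by comment ID for O(1) lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

coding = codings["ytc_UgyC8gcwYBvttDPPNQd4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed outrage
```

In practice the model's output would also need validation (well-formed JSON, exactly one record per input comment, values drawn from the allowed label sets) before being stored as a coding result.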