Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Kate: "we should make AI beneficial instead of trying to get to AGI" **claps**
…
ytc_UgwJ62VZb…
Wake me up when AI isn't just a more powerful version of predicitive text and ca…
ytc_UgyzNYpNM…
Human AI sounds like an Alien AI invasions lie to use Earth as a people farm for…
ytc_UgzTB2lRt…
Where does an AI get the idea that being switched off is a bad thing to be avoid…
ytc_UgyCLLm0F…
I asked AI: Pretend that you are a ruler of the world and tell me what you will …
ytc_UgyIE3k99…
If you ever used any AI services intensively. You will know that we are far awa…
rdc_nk7nrid
doesn't make sense if ai is supposed to "replace" artists when ai bros are overl…
ytc_UgzWW656R…
These AI shills calling anti-ai people luddites also just doesn't carry the ring…
ytc_UgxVvoe2W…
Comment
Of course he didn't understand the risks... he was pioneering Ai 20 years before personal computers were in every household. There were no jobs for Ai to steal yet haha. They probably thought computers will always be big mainframes.
youtube · AI Governance · 2025-06-18T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw10CfQpRRivjC78Ot4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwNUKNR5xUKXnJBUuR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"concern"},
{"id":"ytc_UgwhpWKLprujAP4zqtp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyPRwVyOEoIe065BFR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzf50-SgGmPFgsLdzd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy3Ad3UpMSzRjvWqMV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3yMuc2uqMU-XRhOF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxFDwYvPXU09X9DLX94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwVq76ygz3zw1n7uhl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSomckjTTGH6siVsR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
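The raw model response above is a plain JSON array, one object per coded comment, with the same four dimensions shown in the Coding Result table. A minimal sketch of how the "look up by comment ID" step can work: parse the array and index it by `id` (the array below is shortened to two entries from the response above for brevity; the parsing itself is an assumption about the tooling, not its actual implementation).

```python
import json

# Two entries copied from the raw LLM response above; the real payload
# is the full ten-element array.
raw_response = """
[
  {"id": "ytc_UgxFDwYvPXU09X9DLX94AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwSomckjTTGH6siVsR4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"}
]
"""

codes = json.loads(raw_response)

# Index the rows by comment ID for O(1) lookup.
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_UgwSomckjTTGH6siVsR4AaABAg"]
print(row["emotion"])  # -> resignation (matches the Coding Result table)
```

The looked-up record matches the Coding Result table shown for the selected comment (responsibility: developer, reasoning: mixed, policy: unclear, emotion: resignation).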