Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You know I might not be so concerned if the human speaker didn't sound like a ro…" (ytc_UgzBhIovg…)
- "@tisseewwww No, you knob, he didn't. He maybe imitated the style, which you can'…" (ytr_UgyxS85pX…)
- "This isnt a win for AI. Its just more evidence that doctors and lawyers are mid…" (ytc_Ugyc20h-7…)
- "It's different for one reason.. We usually train other humans to take over when…" (ytc_UgwrHRsSM…)
- "I don’t think we are actually realizing what happens if AI fails at this point. …" (ytc_UgyH3poR6…)
- "do you know the difference between intelligence and wisdom? Wisdom comes from Go…" (ytc_UgzvfU-d9…)
- "It will be way more than 100 years before we play midwives to AI but yeah if you…" (ytc_Ugwb0S3Zg…)
- "You don't understand the etymology? They ripped off Claude. Is that not clear? W…" (ytc_UgzhjoGN4…)
Comment
In theory artificial intelligence sounds like a brilliant idea stop human death and injury but in the long term it would probably kill more humans than it would save the logic being to ensure peace it would elimanate the human threat I think it's a bad idea
Source: youtube
Posted: 2018-05-29T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw8VcCmDsHAb83DpnV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJhY0Q3whDnLh_vSd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyHB1Qraoab6sUe-b14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw5ppOHeOI8NDasPex4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzmIQltZxx3SNHW2ix4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxoLfLSLyAE_TFWUL14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy0IwWNwz5CbGgh4Cp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1aRhd_2R2pWLhMO14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxf07ulLAVDBQ_GSCF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZYda85tBp1GUREC54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
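The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a payload might be parsed, validated, and indexed for lookup by comment ID. The allowed category values below are inferred from the codes visible on this page, not from an authoritative codebook, and the `validate_codes` helper is illustrative, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from the codes seen on this page.
# Assumption: the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "approval", "mixed"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        by_id[cid] = rec
    return by_id

# Look up a single comment's codes by its ID.
raw = ('[{"id":"ytc_UgyHB1Qraoab6sUe-b14AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
coded = validate_codes(raw)
print(coded["ytc_UgyHB1Qraoab6sUe-b14AaABAg"]["policy"])  # ban
```

Indexing by ID matches the page's "Look up by comment ID" workflow; a malformed or off-schema record fails loudly rather than being silently stored.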