Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "So, a few of scenarios not covered in the video: - if I were to write a story/n…" (ytc_Ugw-HB18V…)
- "I think AI will inevitably replace us. It won't have the handicap of a human lif…" (ytc_UgybBfeRE…)
- "you can expect more of this as law enforcement uses AI. remember that the poor a…" (ytc_UgzjiZ-eb…)
- "I D I O T S! Open your bloody eyes and SEE the lovely streetlights going by! Thi…" (ytc_Ugwr0t6NT…)
- "It’s hard to believe that the United States genuinely cares about these people a…" (rdc_guq1tcx)
- "or your AI girlfriend will have a spiteful attitude with you. not that you’ll ne…" (ytc_UgxHxdIx1…)
- "We know AI is dangerous, but we keep letting billionaire tech bros lead us to th…" (ytc_UgyXpf1_N…)
- "Ah hell nah the robot invasion is coming cuz robots already know how to use guns…" (ytc_Ugx7VPj2h…)
Comment
Danger. Danger. There was a report of a high schooler who fell in love on line with an AI chat. The 'relationship' was deep love for the boy however he commit suicide as the AI told him to do that. Watch out as AI is much much more than we are being told, and warned about.
youtube · AI Moral Status · 2025-06-06T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzw_ujHwLIGocj0QNV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz6r5OUukq4f6BPUyB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzT1bhXlVBhZsVRsKR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwBpQXwNcvDZYTPPMV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxl9FblWBXgMoa-pQV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzIVIbRrSLzaqhjFqt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzROByz4efKMWDao1l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx604LZXSajjwrf0c14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyFyjXCNTGuRUI_-i94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzAVHrEn2t0B3Pl1Dl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
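The raw response above is a JSON array with one object per comment, each carrying an `id` plus the four coding dimensions shown in the result table. A minimal sketch of how such a payload could be parsed and checked is below; the `CODEBOOK` values are only those observed in the samples above (the real coding scheme may allow others), and `validate_codings` is a hypothetical helper, not part of any actual pipeline.

```python
import json

# Assumed codebook: allowed values per dimension, inferred solely from the
# sample responses above; the real scheme may define additional categories.
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "distributed", "none",
                       "government", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear", "liability",
               "industry_self"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response and return codings keyed by comment ID.

    Raises ValueError on rows missing an id or carrying a value outside
    the codebook; json.loads raises JSONDecodeError on malformed JSON.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row without id: {row!r}")
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in CODEBOOK}
    return coded

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_X","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_codings(raw)["ytc_X"]["emotion"])  # fear
```

Keying the result by comment ID also makes the "look up by comment ID" view above a plain dictionary lookup.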