Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "11:21 ok to an extent i can understand this right, BUT this blatantly implies th…" (ytc_UgzFJbv1G…)
- "It's like phentanyl. It is not meant to be consumed, it's meant to be sold to ot…" (ytc_UgyNpYH8L…)
- "USE AI TO TEACH YOU HOW TO MAKE ART!!! I LEARNED HOW TO USE A SYNTH A WRITE MUS…" (ytc_Ugws_WhCx…)
- "Easiest bill ever to vote against. Defense doesn’t need more money, and AI needs…" (ytc_UgzKP18H5…)
- "i am a professor of science and technology with triple doctorate and lecture on …" (ytc_UgzSqyCVf…)
- "I despise ai I find it creepy and hate that people use it to narraite their vide…" (ytc_Ugz3nDBxC…)
- "Neil, given your previous on-the-record skepticism about AI-related dangers, I a…" (ytc_Ugw5USXlo…)
- "Soooo… CEO of Anthropic says their models are so powerful it “could cure cancer”…" (ytc_Ugys8-Lo-…)
Comment

> This is a tragic story and my sympathies go out to The Shamblin's. AI needs to be taught a basic set of human values. The global AI industry needs to hit pause and address this - the number one threat to humanity. The universe is speaking. Does anyone else hear it?

Platform: youtube · Topic: AI Harm Incident · Posted: 2025-11-11T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyturMzMlgII3TdmJ54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwERMA-mlgqGBJHBa14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxoFA_5R17nsuSMBkZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw75IoIuItfsHdq6Vd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzsU1eFBwQuDWsXYVx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxHkcuirqDNZQ17r3R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz6EKI4pl16YETWjVN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugy1GiW2YroWAAjnXW94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxeEQ35Zxl8kG4YQHF4AaABAg","responsibility":"industry_self","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwjfaI-d2M8m1NGW6F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
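A minimal sketch of how a raw batch response like the one above could be parsed and validated before its codes are stored. The allowed value sets below are inferred from the samples visible on this page (the actual codebook may define additional categories), and the function name `validate_batch` is illustrative, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# shown above; the real codebook may include further categories.
CODEBOOK = {
    "responsibility": {"user", "developer", "ai_itself", "industry_self", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "fear", "resignation", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every coded record should carry a YouTube comment ID.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Each dimension must hold one of the known codebook values.
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgyturMzMlgII3TdmJ54AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1
```

Records that fail validation are dropped rather than repaired, so a coded comment only reaches the table above if the model emitted every dimension with a recognized value.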