Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect

- “As a rider with 50 years and 750k miles under my belt and a Tesla driver with 3 …” (ytc_UgwObOlQy…)
- “lol we’re talking about AI gaining more intelligence than humans, removing jobs,…” (ytc_UgwU862zy…)
- “More advanced procedural AI could be used to make large maps like in Daggerfall …” (ytc_UgxjArFXW…)
- “AI prompters don’t even understand AI “art.” If artists stopped drawing, the mac…” (ytc_Ugzv6daeY…)
- “If any developer were allowed to build AI, who's going to buy Microsoft products…” (rdc_jkgkri3)
- “i would rather have million volunteer soldiers die then even one civilian of any…” (ytc_UgzO2rKI7…)
- “Is this actually Steven Fry or is it an AI trained to sound like Steven Fry?…” (ytc_Ugzd8b9_H…)
- “Proverbs 14:4 Where no oxen are, the stall is clean, but much increase comes by …” (ytc_UgwhbrcQm…)
Comment
Thank you for sharing your story! I know it wasn’t easy. I’m battling my oldest about using chatbots right now. Finding her on the computer and sneaking and taking her father and my phone at 3am to try and get on various chatbot websites. We have locked our phones and changed passwords and pause internet at night. Trying to help them understand the dangers is really hard. They just don’t grasp it. So while I try to teach her the dangers we put other things in place like passwords to try and protect her.
youtube · AI Harm Incident · 2026-01-03T20:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwRXcU7IDM1Lf5_nNZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugza3zzFE5R0_LvuOsl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgzPHIEl4o4yEQvQMD54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxbKkKWjUOmP0XNIMd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwpG8DziWFSi4CXUr54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz881Vp_c9hnJ37CmV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx2964fWPe1qFAltcd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwfGzGyISzjW8b53Gp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzDgcEDcPgDQxQ7ZCx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw6sszctNJSc0jXAul4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
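The lookup-by-comment-ID view can be sketched as a small parser over a raw response like the one above: load the JSON array the model returns and key each coding row by its `id`. This is a minimal illustration, not the tool's actual implementation; the function and variable names are hypothetical, and only two sample rows from the response above are inlined.

```python
import json

# Raw LLM response text as returned by the coding model
# (two sample rows from the response above; the full array has ten).
raw_response = '''
[
  {"id":"ytc_UgwRXcU7IDM1Lf5_nNZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugza3zzFE5R0_LvuOsl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each coding row by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_comment_id(raw_response)
row = codes["ytc_Ugza3zzFE5R0_LvuOsl4AaABAg"]
print(row["responsibility"], row["emotion"])  # → user fear
```

In practice a production version would also need to handle malformed model output (e.g. trailing prose around the JSON array), but the happy path shown here matches the response format displayed above.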