Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
there is one fundamental flaw in this video, a human learns and process informat…
ytc_UgygZLBjn…
We need to learn a different term than AI. These are learning algorithms, not tr…
ytc_UgzTyJwU-…
Why are "Artists" so mad about ai? Because it does thier job better and faster? …
ytc_UgwLabMUK…
>Or maybe it's because we don't have the societal support structures in place…
rdc_denk649
dear OpenAI, teach ChatGPT about image transparency. Completely transparent png …
ytc_UgwI6xp4b…
Pointless post from someone who likes riding horses and building soapboxes. Op…
rdc_jht0w02
Most people are very chill or oblivious about this but the dangers are immense, …
ytc_UgzvGXEYc…
The only way to take AI down is to act serious with it while asking very ridicul…
ytc_UgykfwZRl…
Comment
to me self driving cars just muddy the water too much in terms of legal liability for potential fatalities. frankly the idea of being on the road and someone fully trusting their car to drive itself and not paying attention while driving is terrifying to me, although admittedly a lot of people are already so negligent while driving even without autopilot you'd never know they are holding their own and other peoples lives in their hands while behind the wheel
youtube
AI Harm Incident
2025-08-17T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwYfFMHVzVyuhQFdMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxaBDG77cwlFTS-1Ox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzUuC0NA5yU6OFDmF94AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx8sjZXm1A1ydsIgh54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHwWjz_9Kh5zFrLNp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxBScCp17c0Czvh5fR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzAHrislxZ1paCuwUB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy06hydIuwlfoY6EOB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxF9C444nMrHGdc5g14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_dtugAD-9e439fwl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"resignation"}
]
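The raw response is a flat JSON array of per-comment records, so the "look up by comment ID" feature above reduces to parsing the array and keying each record by its `id`. A minimal Python sketch (the record is copied from the response above; the function name is illustrative, not part of the tool):

```python
import json

# Raw LLM coding response: a JSON array of per-comment dicts,
# one record copied verbatim from the response shown above.
raw_response = """
[
  {"id": "ytc_UgxHwWjz_9Kh5zFrLNp4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
coding = codings["ytc_UgxHwWjz_9Kh5zFrLNp4AaABAg"]
# This record corresponds to the "Coding Result" table above
# (distributed / consequentialist / liability / fear).
```

Keying on `id` makes each lookup O(1) after a single parse, which matters when the same response is inspected for many comment IDs.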