Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
@AI_In_Context Don't forget that his "concern" surrounding A.I. was, at least ac…
ytr_UgzZtvGMW…
by the same logic, everyone everywhere is still talking about AI art, so, there …
ytr_UgyuXdxH7…
The apocalypse sells, whereas the equivalent opportunity, call it AI paradise, d…
ytc_UgzxaU3EJ…
Believe me, AI is just a tool for humans to use — not to replace them.
Sure, we …
ytc_UgzTRbugu…
By Jason's logic because I made an AI art program make Robb Stark as a space mar…
ytc_Ugy-Cb6JV…
saw the video title and said "i thought i was the only one doing this with chatg…
ytc_UgwLFUyB5…
A very big part of teaching is relational. The relationships children have with …
ytc_Ugz88M7pV…
"Animals are born with the purpose of survival. It's the most basic instinct. Ea…
ytr_Ugy5DaglR…
Comment
Misleading reporting. She said the car's camera failed to recognize a stop sign. It didn't fail: Autopilot software isn't designed to recognize stop signs, and you can't fail at something you weren't designed to do. For those unaware, in the 2019 incident the driver was basically using Tesla cruise control, not Full Self-Driving. FSD is the option that recognizes stop signs, red lights, etc. Whether the driver knew what the capabilities of Autopilot mode were is unclear. All I can say is that I believe most people driving Teslas know that Autopilot doesn't recognize stop lights and stop signs, so why this guy thought he could take his eyes off the road to look for his phone is nuts.
youtube
AI Harm Incident
2025-10-19T19:1…
♥ 13
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwV5UQXPw2H4R9Uzqt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwKR2UGzjgsCR85Oal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnQ3HCy1z-6qClJq14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXgHJvTtHvLLYiIb54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz-hsjLkvAau0I8DlZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugysg6rwsyzaIoJhnu14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxE0nyxr64eRE3ZhQl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwloWy2D0uUw94JxyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQhUzoOQhvb6E4Uv14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx1jTYI9x8D9xXm5Ml4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
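A response in this shape can be checked before the codes are stored. The sketch below is illustrative, not the tool's actual validation code: the field names come from the response above, but the allowed value sets only list the values that appear on this page, since the full codebooks are not shown here.

```python
import json

# Values observed in the sample response above; the real codebooks
# for each dimension may include more categories.
OBSERVED_VALUES = {
    "responsibility": {"user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "indifference", "resignation", "approval", "outrage"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must carry the ID of the comment it codes.
        if "id" not in rec:
            continue
        # Drop records with a missing or unrecognized value on any dimension.
        ok = all(
            rec.get(dim) in allowed
            for dim, allowed in OBSERVED_VALUES.items()
        )
        if ok:
            valid.append(rec)
    return valid
```

Filtering this way means one malformed record (a missing ID, a value outside the codebook) is skipped rather than failing the whole batch of codes.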