Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
"Comparison of Waymo Rider-Only Crash Rates by Crash Type to Human Bench…
ytr_UgzA6t4BC…
AI is ok if it's assisting human effort, it's not ok if it's doing all the work.…
ytc_Ugyfcbpcj…
Cool but its called supply and demand and some ppl dont wanna pay 800€ for a com…
ytc_UgwDiVQot…
Interesting observation that AI can't provide human experience. But that said, i…
ytc_UgyXdQ5gu…
Im a 3d animation major, and AI model and animation generators are taught, it's …
ytc_Ugy19_ZVv…
As somebody working in healthcare tech, this should NEVER happen. What should ha…
rdc_jtfmlbi
You're missing a bigger picture if you're not even a bot. This, who's crazy abou…
ytr_UgzWJlPTO…
Let's see
- used no prds
- gave vague and too big prompts
- expected one shot …
ytc_UgxuvOikA…
Comment
It is a thought experiment, but self driving cars don't work that way. A safe distance is always kept between moving vehicles. Here's a taught experiment : If you would be in charge of that decision, what would you do? Provided that your reaction time allows for quick decision making.
youtube
AI Harm Incident
2015-12-08T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgizunohajILCHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjjWOUDi8MzcHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjQvTuYsrqOtngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgghF14lWrWg93gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghSobsLJzKwTngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghTsPIeRMcNT3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiGpAhmNNMkf3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjlPAxVCSrTmHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugju10Xr0tXdF3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjc3KGPZNZyqngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}]
```
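A batch response like the one above maps straightforwardly onto the per-comment coding table. As a minimal sketch, here is how such a response could be parsed and validated in Python; the allowed value sets are assumptions inferred only from the values visible on this page (the real codebook may include more categories), and `parse_batch` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Two records excerpted verbatim from the raw response above.
RAW = '''[{"id":"ytc_UgizunohajILCHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiGpAhmNNMkf3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'''

# Dimension values observed in this sample; the full codebook
# presumably defines the authoritative enums.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "mixed", "approval", "fear", "outrage"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw batch response into {comment_id: codes},
    dropping any record with a value outside the allowed sets."""
    out = {}
    for rec in json.loads(raw):
        codes = {dim: rec[dim] for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            out[rec["id"]] = codes
    return out

coded = parse_batch(RAW)
print(coded["ytc_UgiGpAhmNNMkf3gCoAEC"]["emotion"])  # fear
```

Indexing by comment ID is what makes the "Look up by comment ID" view cheap: each coded comment's dimensions can be fetched directly from the parsed batch.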