Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Very glad to see you talking about this Senator. You have some excellent points …
ytc_Ugyxg0fTs…
The only [and I MEAN ONLY] time I KINDA support AI images is when you're "creati…
ytc_UgzNoJ6M9…
this might sound crazy but genuinely the most comforting thing to me as an aspir…
ytc_UgxeNNCiE…
claude in 2001: can't speak
claude in 2004: still cant speak
claude in 2021: sti…
ytc_UgxW06B_v…
SciShow: Repeats tech corporation talking points designed to dupe investors
Eddy…
ytc_Ugxu-1CXZ…
please dont trust chatgpt its has not right knowledge, go read quran and hadees …
ytc_UgxYAReiV…
Exactly why I hate AI.. well not exactly but I just don’t like AI at all..…
ytc_Ugwf6GeTX…
I think it's a fundamental misunderstanding of what art is. People who defend AI…
ytc_UgxVPUwyL…
Comment
Statistically speaking, Self Driving Vehicles have less records of accidents than human driving vehicles. Correct
Interpretations:
1. This means that Self Driving Vehicles are safer than Humans driving vehicles.
2. Or... that you do not have enough self driving vehicle sample population of the particular model, to fairly and realistically compare it with human's driving history.
Note: Realize that cross references are naturally very misleading. its not the same to compare responsible and capable humans with what appears to be Human's NPC in real life.
youtube
AI Harm Incident
2025-12-15T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyhSztiv_TZd8uEF-54AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzBoxYsnuaGB591nnJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy6-JWMgh2ppA5iRkp4AaABAg","responsibility":"government","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx17gZzW9FDASpf7bJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxnfmUUKxNGe89WeWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzl60eMyvccDhvnZ0J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgydO5vV4t_8xi3GQ-B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzqkAHOZVQtHRWETu14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZ9gg-G7XknLeq6iF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzCkxR1f3mz0QAMItN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
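The raw response above is a JSON array of coding records keyed by comment ID. A minimal sketch of how such a response might be parsed and validated before storage; the allowed value sets below are inferred from the samples shown on this page, not from a documented schema:

```python
import json

# Allowed values per dimension, inferred from the responses above
# (an assumption, not a published codebook).
ALLOWED = {
    "responsibility": {"government", "user", "none", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"ban", "none", "unclear", "industry_self", "regulate", "liability"},
    "emotion": {"outrage", "resignation", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Require a comment ID and a recognized value for every dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(len(parse_codings(raw)))  # 1
```

Rows with an out-of-vocabulary value are dropped rather than coerced, so a model that drifts from the coding scheme surfaces as missing rows instead of silently corrupting the results table.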