Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@mb8591 they will, but you can't resist those black-hat hacker. Ai don't have th…
ytr_UgwYIjrm-…
People really need to get a grip nothing wrong with ai art as long as ur stealin…
ytc_UgxGFFC-f…
A shit ton of stupid people out there that could care less. Literally the world …
rdc_et7ifif
@geaca3222- Thanks.
Don't we have to differentiate between LLMs and systems tha…
ytr_UgymX1PqG…
I agree with everyone here. I don’t think they care if the videos are AI or not.…
ytc_Ugwi-qK0S…
The worst thing about this is that AI could've been used in a far better way to …
ytc_UgxL0dBln…
@mynameisharry8739 Of course, TRANSFORMATION is something that AI is learning to…
ytr_UgyKT91Ub…
Make dicision indipendently
Learn principal
Copy maching
Information truth
Socia…
ytc_Ugy92Sel-…
Comment
I worked on a self-driving car project. My manager believed that the self-driving car should prioritize saving its occupants, because they are the only people that the car can be certain exist. The other vehicles, and their occupants, could be sensor errors. To him, possibly killing an occupant in another vehicle was better than killing an occupant in the self-driving car when there was actually a way to avoid the collision.
youtube
AI Harm Incident
2025-05-06T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxfOtF47I1ZsgVZQIl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZU9lw8m1Gp3d_grx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy9uUIfqzEMdc3jGCt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxd_U6lnVrqRTRTVNZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzo4zGIsgNQwpmHtgJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyvj_EcltAPO4vuVU14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgylNjp5OBLzOvd8oRx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx3sLKS_rZVlCSdqYl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwtnHm6-WDQGfqYR-Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwBIg-OlidjNUbf47x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
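The lookup-by-comment-ID flow can be sketched in Python, assuming the raw model response is a JSON array like the one shown above (the field names are taken from that output; the function name `index_by_comment_id` is illustrative, not part of the tool):

```python
import json

# Truncated to two entries for illustration; the real response
# contains one object per coded comment.
raw_response = '''
[
  {"id": "ytc_UgxfOtF47I1ZsgVZQIl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyvj_EcltAPO4vuVU14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_comment_id(raw_response)
coding = codes["ytc_Ugyvj_EcltAPO4vuVU14AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company indifference
```

A dictionary keyed by ID makes each lookup O(1), which matters when the same response is inspected repeatedly across many sampled comments.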