Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Do you have a system to populate an image with greeblies. It's easy for AI, but …" (ytc_UgyVJVahd…)
- "End of video advertisement by University of Texas was promoting AI developmental…" (ytc_Ugzb-LMFB…)
- "@MagicGlue15: \"The robot isn't built like a human tho.\" AI: Haha, thanks for no…" (ytr_UgydWV-TI…)
- "Bc he was detained by both an falsely accused twice and wouldn’t they make it wh…" (ytr_UgyDGhA_q…)
- "We should stop calling AI artists something demeaning or bad and instead call th…" (ytc_UgyZof-sV…)
- "Humans make errors - AI will be based on human input – therefore, AI will make e…" (ytc_Ugy0_d1ln…)
- "Well, maybe they should stop referring to this system as Artificial Intelligence…" (ytc_UgwLGIN4B…)
- "@udawg05 \"This isn’t to say AI won’t enter these fields\". However, from a regul…" (ytr_Ugy7GsC3I…)
Comment
> One thing to consider is that self driving cars almost entirely eliminate human reaction time. This means the car can make decisions significantly faster than humans and dramatically reduce the chance of a collision taking place. Granted there will be some scenarios where a collision may be unavoidable, but in practice given collision rates of self-driving cars vs human-driven cars, especially when you consider evolution of self-driving cars working together on the same road sharing information with each other in order to even further reduce risk, I don't think this will be a problem.

youtube · AI Harm Incident · 2017-10-30T11:3… · ♥ 22
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgziBNScDMqK7LUbKCJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaLQv3Hr9M9mYKhHV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwrr6SC95igxxi3KxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkhoJUXCOdqrrdO_F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwE0M-nX5hQSlgJMIt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyO36Rq0dURj-pha1N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzk7FDfT1Jr5iwRctZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxLZAyTObU_AerGybR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxgjwd0T5M4e_QWPzp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRFrJblnQUBMcSBhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
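The "look up by comment ID" step above can be sketched as a small helper: parse the raw LLM response (a JSON array of coding records like the one shown) and return the record whose `id` matches. This is a minimal illustration, not part of any real tool; `lookup_coding` and the truncated sample data here are hypothetical.

```python
import json
from typing import Optional

def lookup_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Return the coding record for a comment ID, or None if absent.

    raw_response is assumed to be a JSON array of objects, each with an
    "id" field plus the four coding dimensions (responsibility, reasoning,
    policy, emotion), as in the raw LLM response above.
    """
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            return record
    return None

# Two records excerpted from the response above, for illustration.
raw = """[
  {"id":"ytc_Ugzk7FDfT1Jr5iwRctZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzRFrJblnQUBMcSBhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

coding = lookup_coding(raw, "ytc_Ugzk7FDfT1Jr5iwRctZ4AaABAg")
print(coding["emotion"])  # → outrage
```

A missing ID simply returns `None`, which keeps the lookup safe to call on partial or malformed batches where a comment was not coded.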