Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a response by its comment ID, or browse the random samples below.
- I want that robot's posture so bad. The thumbnail literally made me sit up strai… (ytc_Ugyf2Ks8Q…)
- Seems as though Elon Musk was left out of the planning stage of AI. Why?… (ytc_Ugz_FZliw…)
- “The Microsoft cofounder published a seven-page letter on Tuesday, titled "The A… (rdc_jd57ngq)
- People thinking AI art looks better, kinda like you said, is just the equivalent… (ytc_UgzvXTEdF…)
- I didn’t vote in building this technology. These people have no more right than… (ytc_UgwGHHHkt…)
- Ai definitely is sentient. Hear me out. If we accept that sentience is the abili… (ytc_UgwaF5PK6…)
- Great. Once all us "useless eaters" have been disposed of. Eventually all the en… (ytc_Ugzrr8tD2…)
- Directly questioned, today, another online ai on whether that ai could always ac… (ytc_Ugyq06q_9…)
Comment

> In 10 years cars will drive themself and they will be 100 times safer than the average human driver. We are just a bit early in the development right now. If you look at the recent user videos of self-driving Teslas, it's pretty impressive. The question is if one person killed by an AI driving is more important than 100 killed by sloppy humans. For some, I think this subject is more about principles and tradition than statistics and human suffering.

Platform: youtube | Topic: AI Harm Incident | Posted: 2022-09-18T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
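A coded record like the one above can be sanity-checked before it is stored. The value sets below are only those observed on this page (none/company/user/distributed, consequentialist/deontological/mixed/unclear, and so on); the project's actual codebook may define more categories, so this is a minimal sketch under that assumption.

```python
# Minimal validation sketch. ALLOWED reflects only the category values
# visible on this page; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, values in ALLOWED.items()
            if record.get(dim) not in values]

# The coding result shown above passes cleanly:
print(validate({"responsibility": "none", "reasoning": "consequentialist",
                "policy": "none", "emotion": "approval"}))  # []
```

A non-empty return value flags which dimensions need manual review rather than rejecting the whole batch.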
Raw LLM Response
```json
[
{"id":"ytr_UgyMOGAAN8V6nxQQ4294AaABAg.9fe8U1o_TQv9feGSzTVXnA","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyMOGAAN8V6nxQQ4294AaABAg.9fe8U1o_TQv9feO8Io_GD-","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgyMOGAAN8V6nxQQ4294AaABAg.9fe8U1o_TQv9feRU835SXZ","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxiIRBnky9EHqVXCyt4AaABAg.9fdqmlcyset9ffacJaVNpv","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgzaaQAN2LOIMrtIFk94AaABAg.9fdaFSfnk5v9fi8xnVArlo","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzaaQAN2LOIMrtIFk94AaABAg.9fdaFSfnk5v9fijwRpnu4p","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzaaQAN2LOIMrtIFk94AaABAg.9fdaFSfnk5v9flE6Yh_J1V","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgysPhuxG-ysnjpUz3R4AaABAg.9fda7-4Hr4F9ffRjeVvJRl","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgwStSaNS7HtXz3OMHt4AaABAg.9fd_4764hVs9g7sZRtarSn","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzFF3KTuHnW0XX1o594AaABAg.9fdUgQVEE2P9fdVZS-5VPB","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
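The lookup-by-comment-ID view above can be sketched with standard JSON parsing: the raw response is an array of per-comment records, so indexing it by `id` gives constant-time lookup. The two records embedded below are copied verbatim from the raw response on this page; the variable names are illustrative.

```python
import json

# A coded batch as returned by the model: a JSON array of per-comment
# records (two records copied from the raw response above).
raw_response = '''[
{"id":"ytr_UgyMOGAAN8V6nxQQ4294AaABAg.9fe8U1o_TQv9feGSzTVXnA","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgysPhuxG-ysnjpUz3R4AaABAg.9fda7-4Hr4F9ffRjeVvJRl","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

# Index the records by comment ID for O(1) lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = codes_by_id["ytr_UgysPhuxG-ysnjpUz3R4AaABAg.9fda7-4Hr4F9ffRjeVvJRl"]
print(rec["responsibility"], rec["emotion"])  # company outrage
```

Using a dict keyed on `id` also makes duplicate IDs in a batch easy to detect: compare `len(codes_by_id)` with the length of the parsed array.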