Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Trying to align an AI is like getting a person to do what you want to do. You ca…" (ytc_UgyiuDkw2…)
- "Driverless vehicles are not a bad idea. It should mean no road deaths. Just imag…" (ytc_UgzTpeueC…)
- "This is a fucking stupid argument against self driving cars I am tired of seeing…" (rdc_dmp6sw7)
- "Well, technicaly we arent alive outselves. If a robot was programed to evolve an…" (ytc_UgxuPwG4w…)
- "LLM's replacing even half developers for a company seems very...Naive anytime so…" (ytc_UgywFxdQ8…)
- "The problem with A.I replacing humans in the work force, creating things for hum…" (ytc_UgyDaGYtl…)
- "@themorenajay Ahh I see. I didn't mean to come off as hostile, some comments w…" (ytr_UgzL4MtQS…)
- "The AI tends to think it knows better. If you follow along while its thinking.…" (ytc_Ugx8JG3yI…)
Comment

> yes semi-automatous vehicles are deadly when you ignore the semi part
>
> didn't people leave the vehicle to drive itself when cruise control first came out too? some of them are still with us today, but many didn't manage to win the coveted Darwin award.

| Source | Topic | Posted |
|---|---|---|
| youtube | AI Harm Incident | 2024-12-30T05:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugzlilf0kjcmOnrv5xt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxJWIDt6oOiorm5J754AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyIZVw6KZCzKCEbXMx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFs-QA1DH-NXdQTLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyZKKhyCmaofP2AyTB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyRDicqcNmcNWbYGtl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxQ97OxvHySPKgo3n54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugym16jXfrxj8jmWybl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxmw3WUMY0VbzNONA94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_6xhJE7PCeHxJkLt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}]
```
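A raw response like the one above can be parsed into per-comment codes and sanity-checked against the coding schema. The sketch below is illustrative only: the category sets are just the values observed in this batch, and the full codebook may include additional categories.

```python
import json

# Category values observed in this sample batch (assumption: the real
# codebook may allow more values per dimension).
OBSERVED = {
    "responsibility": {"user", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"resignation", "indifference", "outrage", "fear", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    Raises ValueError if a dimension is missing or carries an
    unexpected value, so malformed model output fails loudly.
    """
    coded = {}
    for row in json.loads(raw):
        codes = {}
        for dim, allowed in OBSERVED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"comment {row.get('id')!r}: unexpected {dim} value {value!r}")
            codes[dim] = value
        coded[row["id"]] = codes
    return coded
```

This keeps the lookup-by-ID view above cheap: once parsed, `coded["ytc_…"]` returns that comment's four coded dimensions directly.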