Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
In my opinion the only aspect that avoids this catastrophic scenario is that:
1-…
ytc_UgzqIGnbk…
I think the biggest problem is that you have read so much of the BS and optimist…
ytc_UgzY4WLqa…
I agree that the idea we really do not know the consequences very good about AI.…
ytc_Ugxb4GwD4…
you: r these just robot or real people??
Me: let me just get…
ytc_Ugw0j_vka…
AI is not magic …it’s only as good as the data it trains on. We are far from one…
ytc_UgyBYWyp0…
rofl...the re..nvm the woman says the big tech corporations are controlling the …
ytc_UgyLtMTsQ…
“I’m a Marxist!” The host says and then the guest smiles. That pretty much sums …
ytc_UgxMm15i0…
Yeah, I don't think it's going anywhere. When Putin said whoever wins the AI arm…
ytc_UgwTu0eW0…
Comment
From a 2016 interview:
"During a Q&A session following Tesla’s announcement yesterday, Elon Musk was asked if Tesla would be liable if one of its driverless cars gets into an accident. Musk quickly answered that those types of incidents would be something for the insurance companies to figure out.
“No, I think that would be up to the individual’s insurance,” Musk answered. “If it is something endemic to our design, certainly we would take our responsibility for that.”"
So is the great Elon going to stand up and offer these families and individuals a sliver of his fortune for the hardship his company's experiments have caused? Or will he and his lawyers bury all these claims behind a wall of paperwork and lawsuits?
youtube
AI Harm Incident
2024-12-16T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugymxs7I-T47d-2EPgl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxS8QN44BNGQCNwYK14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz5A5m2klmeRivyB5d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwS_LepYfJbvFnKSMR4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzebMplL1mA9E_4LPF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz8rez9_oL_bu9knnV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzS8ooHOA5c2aTeTiF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzfLUOfQuTAP5UQmkF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwiKaMdsF4mK-Aom514AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugydqn1QFkeeyNgxTd94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
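A batch response like the one above can be parsed and sanity-checked before the rows are written back to the coded table. The sketch below is a minimal validator; the allowed values per dimension are inferred only from the outputs shown on this page, and the full codebook may permit other values (assumption). The function name `validate_batch` and the `ytc_` ID-prefix check are illustrative, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the responses shown here
# (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "distributed", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop rows whose ID does not look like a YouTube comment ID.
        if not row.get("id", "").startswith("ytc_"):
            continue
        # Drop rows with a missing or out-of-vocabulary dimension value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"contractualist","policy":"liability",'
       '"emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation can then be queued for re-coding rather than silently stored, which keeps the coded table consistent with the codebook.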