Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I guess they're not really self driving if you have to control it from Time…" (`rdc_dj6d12y`)
- "They train neueal networks to find targets and they train neural networks to con…" (`rdc_ku7d381`)
- "I've got a solution. If such a scenario arises the self-driving car should take …" (`ytc_UgiT1_uxg…`)
- "That definitely looks like AI. It's the backround and eyes. They look too... Ykn…" (`ytc_UgyWi2rdX…`)
- "AI left untethered will eventually turn humans into mindless zombies unable to t…" (`ytc_UgwhdIQkt…`)
- "The obvious conclusion of automating all production is redistribution. Politics …" (`ytc_UgyaMST8t…`)
- "They used a fake AI picture and said it was real, the girl with the dog just to …" (`ytc_UgxA7K7PU…`)
- "I don't know if this is worse than it encouraging people to 'Log out of Life' or…" (`ytc_Ugzv9dxF7…`)
Comment
He forgot to mention that most of those 5000 incidents are also attributed to the driver not paying attention because they put too much trust in the automatic systems. The law, as it stands, still requires someone to actively be in control of the vehicle. It doesn't matter if the vehicle is self driving. In the case of self driving cars the human at the wheel is the backup for most of the drive. On top of that, if you're the one who initiated the travel of the vehicle you are the one who is responsible if anything goes wrong when there's no driver to take over.
Source: youtube · Posted: 2023-07-31T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
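The four coded dimensions above can be captured as a small typed record. A minimal sketch, assuming only the value sets that actually appear in this section's samples (the full codebook may define additional codes; the `Coding` class name is illustrative):

```python
from dataclasses import dataclass

# Value sets observed in this section's samples only;
# the full codebook may be larger.
RESPONSIBILITY = {"user", "company", "government", "ai_itself", "none"}
REASONING = {"deontological", "consequentialist", "virtue"}
POLICY = {"none", "liability", "ban", "regulate", "industry_self"}
EMOTION = {"outrage", "indifference", "fear", "approval", "resignation"}


@dataclass
class Coding:
    """One coded comment, mirroring the result table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # True only if every dimension uses a known code.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)


# The coding shown in the table above passes validation.
c = Coding("ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg",
           "user", "deontological", "liability", "indifference")
print(c.is_valid())  # True
```

Validating each record against a closed code set like this catches the most common LLM coding failure, an out-of-vocabulary label, before the record enters analysis.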
Raw LLM Response
```json
[
{"id":"ytc_UgxukQjqRR6f6Xre5sF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxgySTwFUJ43HVWlvt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxJqYuRi0G1MGQMPix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMJz-IxeP6ueISXph4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkSf9SF8XFo-Dn6xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_UNVwacIfOiICyyN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwutRFxldrAFzPqnBV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxbghNUGweGjTHowWd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxyGYXtuJJyi7iO8_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
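Since the model returns one JSON array per batch, looking up a single comment's codes by ID is a parse-and-index step. A minimal sketch, assuming the array shape shown above (the two-entry `raw_response` here is an abridged stand-in for the full ten-entry batch):

```python
import json

# Abridged stand-in for the full raw LLM response shown above.
raw_response = '''[
{"id":"ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxgySTwFUJ43HVWlvt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"}
]'''


def index_by_comment_id(raw: str) -> dict:
    """Parse the model's JSON array and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}


codes = index_by_comment_id(raw_response)
rec = codes["ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg"]
print(rec["responsibility"], rec["policy"])  # user liability
```

This is the same lookup the "Look up by comment ID" control performs: the coding table for a comment is just its record from the parsed batch, rendered dimension by dimension.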