Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I feel like this is a generally decent overview but I'm a bit miffed at the apol…
ytc_UgwAzx4nM…
A.I. has no clue of Dutch clothing. It's "creation" show American women.
Every D…
ytc_UgwnVuL2V…
If the AI is better at it than a human, then so be it. I don’t give a fuck if th…
ytc_Ugzehhexj…
@gyperman3751 so why should we care about you? You oviously don't care if our jo…
ytr_Ugy2xbJi_…
LOOOOOL that’s what you get for laying off human beings who rely on your paychec…
ytc_UgxfIu0g7…
@fearedjamespersonally, didnt mention adding ai to the carts. ive never been to…
ytr_Ugz_USuH1…
You've been working on AI for 30 years and now you are warning people about AI.…
ytc_Ugw5ldapK…
Yeah cause it's the fastest way to win.
AI doesn't have a conscious. It only …
rdc_o7ohrwj
Comment
The answers are obvious: The car will do everything in it's power to save it's occupants, but it will never deliberately hurt someone else to do so. Furthermore, as previously mentioned, if the cars could communicate and connect with other self driving cars (within a certain range), they could coordinate to avoid accidents--for example, the car on the left could move and give room to the central car, etc.
youtube
AI Harm Incident
2020-11-30T21:3…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxzrJFHtwQVGAhSoTR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxU5tCZM8HakEDvsFl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMY4iDTn7n18gY6-14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz4ux_V7SfOPVtCZ9V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxScl_CUE54ZUahNwt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxSt_7nWVXXZuuXwvV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwalSdv_7oC0WE64eZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzo1Jhy8gfrjzTAU1Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxdohbhxra2IeZqFE54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugxx9whhaDWkfEJOjy14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"industry_self","emotion":"approval"}]