Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Imagine 10,000 Ai robots sent to colonize mars, a blood test which could analyze…" (ytc_UgyHTfCi8…)
- "The fact I saw a post on Tumblr of a group of ai artists saying they were going …" (ytc_Ugzh8V4Y8…)
- "half of these sound like a programming intention error. the other half sound lik…" (ytc_UgzuoeLK3…)
- "I can't help but shake the feeling that there is an unspoken rule about how musi…" (ytc_UgxHSPh55…)
- ">AI, robots, and more generally automation is the expertise of producing fast…" (rdc_j50lx80)
- "Honestly I'm getting tired of hearing about how scary AI is. The world could be …" (ytc_UgwDizRUk…)
- "@Morttozin because for the AI to work it needs a bank of drawings first. And tha…" (ytr_UgzdWoEwe…)
- "Soon everyone will just sit in a chair and everything will be done for us. Speec…" (ytr_UgzSpw7be…)
Comment (youtube, AI Harm Incident, 2021-03-11T02:0…)

> I disagree I don't think the answer is obvious. The car may do everything it can to save the occupants if thats how its programmed and what about if the car will do everything it can to prevent a major accident however in doing so could cost the occupants their lives? You say that the car will never deliberately hurt someone to save the occupants but you have absolutely no way of knowing that for sure. The video already demonstrates this that what if the only way for the car to protect to you is to purposely slam into someone else? That is deliberate. The cars communicating with each other is a good idea and again what if there is something that that AI cannot do or the AI calculates that if the car moves to the left it could cause another huge accident? Is it more important to protect the occupants or reduce the severity of an accident regardless of risk to occupants?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2Eam8enQJJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2EbmVWX5QB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2F4IIrYzpR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxO_Ujk5rSvOjWRKBB4AaABAg.9qpyK0rFpPmA2EdMqzaPey","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytr_Ugxr6E9-mHJqZTtbMkB4AaABAg.9n3_k2CWmk5A2EdhXW4VlG","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwCHoJvEcws5sbtNMJ4AaABAg.9gAd7y-h4HM9usT1V03sv4","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugxhzhse5PJvVG9QIF54AaABAg.9eKDGjvyiy7A2Eet4FGYqd","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugw4jM93_9cAtGe9wgN4AaABAg.9Wp06dt0zPM9ckPtZde_5N","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyfZZIYhGkJqOVkj3p4AaABAg.9NJbV2HYicl9UBo94mRsqU","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugxx9whhaDWkfEJOjy14AaABAg.9GhF4K2osdN9KjCloO1jaE","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
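A raw response like the one above can be checked programmatically before the codings are stored. The following is a minimal sketch, assuming the label vocabularies implied by the coding table and the JSON rows shown here (`VOCAB` is inferred from the observed values, not a documented schema; rows with out-of-vocabulary labels are simply dropped):

```python
import json

# Label sets inferred from the coding table and raw responses above.
# These vocabularies are assumptions, not a published schema.
VOCAB = {
    "responsibility": {"ai_itself", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "indifference", "approval", "outrage", "unclear"},
}

def validate_codings(raw: str) -> list:
    """Parse a raw LLM response and keep only rows whose labels are in VOCAB."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in labels for dim, labels in VOCAB.items())
    ]

raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"unclear","emotion":"fear"}]')
valid = validate_codings(raw)  # the single row passes validation
```

In practice a row that fails validation would be logged and re-queued for recoding rather than silently discarded, but the filter above is enough to guarantee the stored dimensions always come from the fixed vocabularies.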