Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_Ugw9pSGKS…`: "The only way autonomous autos will ever replace taxi cabs is if they're programm…"
- `ytc_UgxLoWvd5…`: "Having AI relationship isolates you from real people. These AI tech companies ar…"
- `ytr_UgzX-_g8G…`: "Do you really think there isn’t a single job AI can’t replace? It’s still just f…"
- `ytc_UgwDc9oa4…`: "Wonder if he can develop facial recognition with masked people? Now that would b…"
- `ytc_Ugwm5s3Lm…`: "Man, this feels like the laziest hit piece I have ever watched. A few points: 1.…"
- `ytc_Ugz4anTgj…`: "As for Claude 4 having an \"instinct for self-preservation\" I'm not so sure that …"
- `ytr_UgyM5LKiZ…`: "A.I. is not a lookup table. It generates the answer based on the question and it…"
- `ytc_Ugzvpvbr8…`: "How sweet from the godfather of AI that he is only concerned by HIS children’s f…"
Comment

> So far self driving cars have killed a pedestrian. Also, a user was able to find a weakness in self driving Teslas that he knew will cause an accident. He showed it on another video. It had to do with taking an exit on a hill. He could see ahead of time, but the car could not.
> So, even thought they might have 1000 years of driving data, AI can still make mistakes.

youtube · AI Harm Incident · 2024-04-15T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2Eam8enQJJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2EbmVWX5QB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2F4IIrYzpR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxO_Ujk5rSvOjWRKBB4AaABAg.9qpyK0rFpPmA2EdMqzaPey","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_Ugxr6E9-mHJqZTtbMkB4AaABAg.9n3_k2CWmk5A2EdhXW4VlG","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwCHoJvEcws5sbtNMJ4AaABAg.9gAd7y-h4HM9usT1V03sv4","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugxhzhse5PJvVG9QIF54AaABAg.9eKDGjvyiy7A2Eet4FGYqd","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugw4jM93_9cAtGe9wgN4AaABAg.9Wp06dt0zPM9ckPtZde_5N","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyfZZIYhGkJqOVkj3p4AaABAg.9NJbV2HYicl9UBo94mRsqU","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugxx9whhaDWkfEJOjy14AaABAg.9GhF4K2osdN9KjCloO1jaE","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
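A batch response like the one above can be turned into a lookup-by-comment-ID map with a short sketch. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the sample response itself; the dimension vocabularies below are inferred from this one batch and are an assumption, not the tool's actual codebook, and `parse_coded_batch` is a hypothetical helper name:

```python
import json

# Dimension vocabularies observed in the sample batch above.
# Assumption: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "indifference", "approval", "outrage"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, skipping invalid rows."""
    coded = {}
    for row in json.loads(raw):
        # Drop rows missing a dimension or using an out-of-vocabulary value.
        if not all(row.get(dim) in values for dim, values in ALLOWED.items()):
            continue
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a single illustrative row (hypothetical ID):
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
batch = parse_coded_batch(raw)
print(batch["ytr_example"]["emotion"])  # fear
```

Validating values before indexing matters here because a coding run that silently accepts a malformed row would surface as a wrong cell in the Coding Result table rather than as an error.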