Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Wow, this really puts the AI job landscape into perspective. I loved the practic…
ytc_UgwkQ6nsW…
So we got AI pairing with the law before the law placing limits to AI... I do no…
ytc_UgwS9K04t…
That's because consumers are not their customers. Their customers are corporatio…
rdc_ohzkvtv
Lol he killed himself after talking to a robot for five hours... Hilarious 😂😂😂😂 …
ytc_UgzmtSZnC…
It took me waaaay too long to realize that hiring was spelt wrong on the first o…
ytc_UgwnQzDN3…
I'm a developer. My degree was in computer science. I know that the SECOND my co…
ytc_UgwiVUPcd…
The chatbot executed Order 66 but is still working on improvements for the next …
ytc_UgwqbV_xm…
I am happy for AI take my job and still pay me my wages for doing nothing!!…
ytc_Ugy1lNS1e…
Comment
This video is wrong on so many levels. First of all, the scenario with the truck would never happen, because self-driving cars would always follow at a distance that they can safely stop. Second, even if that scenario would occur, the car is not factoring whether to hit a car or motorcycle, it's not even able to determine that there is a car and a motorcycle, only that there are two objects on the left and right. If an object were to fall from the sky and this scenario to occur, the self driving car would always stop. There are no self driving cars that try to swerve to miss accidents, this creates unpredictable scenarios like swerving off a cliff or hitting other cars. I'm not even going to touch on the craziness of a car trying to decide whether or not to hit a motorcycle with or without a helmet. How does it know the driver is wearing a helmet? Are we 100 years in the future and these cars are fully cognizant.
youtube
AI Harm Incident
2017-06-28T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
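A coding result like the table above can be regenerated from a single coded entry. A minimal sketch, assuming the four dimensions shown (the helper name and layout are illustrative, not part of the tool):

```python
def coding_table(entry: dict, coded_at: str) -> str:
    """Render one coded entry as the markdown 'Coding Result' table."""
    rows = [
        ("Responsibility", entry["responsibility"]),
        ("Reasoning", entry["reasoning"]),
        ("Policy", entry["policy"]),
        ("Emotion", entry["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {name} | {value} |" for name, value in rows]
    return "\n".join(lines)

entry = {
    "responsibility": "developer",
    "reasoning": "deontological",
    "policy": "none",
    "emotion": "indifference",
}
print(coding_table(entry, "2026-04-27T06:24:59.937377"))
```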
Raw LLM Response
```json
[
{"id":"ytc_UggQprZepafZ1HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UghRftAajpYgC3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiVNu11IS5PiHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugis3gL-vgXrpHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj3h2tFVAqPSngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj5A0pJm2zcoXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggIGhHRenxDK3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjS60trIUKAvHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggCEcSJA552hHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgitMxhB_OZFhXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
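Before a raw response like this is stored, each entry can be checked against the codebook. A minimal validation sketch, assuming the dimension values visible in the samples above (the real codebook may define more categories, and the function name is illustrative):

```python
import json

# Allowed values per coding dimension — an assumption inferred from the
# sample responses shown above, not the project's authoritative codebook.
SCHEMA = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def validate_codings(raw: str) -> list:
    """Parse a raw LLM response and reject any out-of-schema value."""
    entries = json.loads(raw)
    for entry in entries:
        if not entry.get("id"):
            raise ValueError("entry missing comment id")
        for dim, allowed in SCHEMA.items():
            if entry.get(dim) not in allowed:
                raise ValueError(
                    f"{entry['id']}: invalid {dim}: {entry.get(dim)!r}"
                )
    return entries

raw = ('[{"id":"ytc_UgiVNu11IS5PiHgCoAEC","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = validate_codings(raw)
print(coded[0]["responsibility"])  # -> developer
```

Validating at ingest time catches the common failure mode of LLM coders: a response that is syntactically valid JSON but uses a label outside the codebook.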