Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
- The car wasn't speeding. It was a 45mph zone as clarified by police: http://www.phoenixnewtimes.com/news/cops-uber-self-driving-car-didnt-brake-much-in-arizona-pedestrian-death-10247629
- Dark? Radar, Lidar.... There is absolutely no reason 'dark' is a factor here, except that the technology doesn't work
- The car didn't brake at all. How does a person pushing a bike not trigger the brakes? Half of the bike should be on the street before the person is. A person pushing a bike is not super fast either. Shouldn't a computer with Lidar/Radar sensors be able to trigger the brakes here?
I for one will wait for the official investigation.
I agree that a person would probably not have avoided the accident here. But to me, it seems like the technology still failed. This should have been a poster-child example for why autonomous cars with sensors are better than human drivers.... it wasn't
Source: youtube · 2018-03-21T16:0… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyBwr53QSrFwsgZ6Fx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzFdYKkc3EPrgIw4QB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxyqDNXW822gUmjenp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwILlhd8deTJLVCDhV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzyAX9tJXO7m_YMoOZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwmTB7eP-BaIKS3dZl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyv__Up7HC5I2xbfa14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdjWT7OaG50E2OsJR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw0hEIGcJirB1lT3Ut4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw92ZW-Q11YB0JOI9Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
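The raw response above is a JSON array with one object per coded comment, carrying the four dimensions from the coding-result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and validated before display — the allowed value sets below are inferred only from the sample output shown here and may be incomplete relative to the actual codebook:

```python
import json

# One row of raw model output, in the same shape as the array above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxyqDNXW822gUmjenp4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]
"""

# Allowed values per dimension, inferred from the sample batch above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "indifference"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse one batch of LLM-coded comments, rejecting out-of-schema values."""
    codes = json.loads(raw)
    for row in codes:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r} value {row.get(dim)!r}")
    return codes

codes = parse_codes(RAW_RESPONSE)
print(codes[0]["responsibility"])  # company
```

Validating each dimension against a closed value set catches the common failure mode where the model invents a label outside the codebook, so the row can be flagged rather than silently stored.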