Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I seen the video and the Uber employee was looking at her phone. You can't rely on humans as backup to the car. Who was the brilliant idiot who decided not to put infrared cameras on the car? Doesn't this car have radar? Human eyes are very good at detecting movement. It is entirely possible that had a human driver been driving they would have seen the cyclist. The human eye can see things many cameras can not. The Uber employee didn't have her hands anywhere near the wheel. I real human driver even if they could only see what the camera could would have swerved potentially missing the cyclist. The real problem with self driving cars is maintenance. I am sure all self driving cars are maintained meticulously. The vast majority of cars on the road are not. I been around computers for way to long not to know they can fail in so many unexpected ways. I am not a fan of the self driving craze.

Platform: youtube
Posted: 2018-03-24T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx_cYfqNU3aDhdPlt14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwHuKYnGxVVo6rFHhx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9rO2wNOT-ikBr4B14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzojAAVa671GkDhr9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyKQyZyilGfh0x72Gt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzgX2RvjccjA-YPstx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxJ9c89mNiilIm2Ybp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxEzCpiwo6VeTlj5jd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy63pl_ZifMbW-O96h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzyGRU8kjp5e7zZ_SJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
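The raw response above is a JSON array in which each object carries a comment `id` plus the four coding dimensions shown in the table. A minimal sketch of how such a response could be parsed and validated is below; the `CODEBOOK` sets are an assumption inferred only from the values visible on this page (the actual codebook may define additional categories), and `parse_coding_response` is a hypothetical helper name, not part of any real tool.

```python
import json

# Assumed allowed values per dimension, inferred from the values seen above.
CODEBOOK = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"liability", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index entries by comment ID.

    Entries with a value outside the codebook are skipped, so a malformed
    model output cannot silently enter the coded dataset.
    """
    coded = {}
    for entry in json.loads(raw):
        comment_id = entry.pop("id")
        if all(entry.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            coded[comment_id] = entry
    return coded

# Usage: look up the coding for the Uber comment shown above.
raw = """[
  {"id": "ytc_UgyKQyZyilGfh0x72Gt4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]"""
coded = parse_coding_response(raw)
print(coded["ytc_UgyKQyZyilGfh0x72Gt4AaABAg"]["emotion"])  # outrage
```

Validating against a fixed codebook at parse time is what makes the "Coded at" result reproducible: any entry the model invents outside the scheme is dropped rather than stored.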