Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I, Robot" is a science fiction book by Isaac Asimov that explores themes relate… (ytr_UgwB8NKua…)
- ..... Okay, if you are a commentary/opinion channel that criticizes AI to a cert… (ytc_UgyIzmvMb…)
- I have heard before that LIDAR is important to make a car FSD but never have the… (ytc_Ugzkbi48j…)
- I repeatedly get the feeling that The Paperclip Factory should be mandatory stud… (ytc_UgxgThNi4…)
- Yes, I agree, why did my paid version have glitches? It was instrumental in help… (ytc_UgwlUv5tb…)
- @RedQuill13 can you write a new play using Shakespearean styles and imagery? Be… (ytr_UgwSswaQ7…)
- Don’t take all of the jobs from people for AI. We need more human interaction. J… (ytc_UgzzJ8IoD…)
- Us artist's have already gone through and continue to face AI taking over who we… (ytc_UgwLhmFDv…)
Comment
I like to have fun with these driverless cars when I'm buzzed and going from one bar to another. If you step in front of one, it'll start asking you to move, it'll tell you that you're delaying it, that it needs to move, that you're obstructing the way, etc., etc. After about 5 minutes it gets more aggressive and starts telling you that it's gonna contact the police, that the police are on their way, and so on. The longest I've stood in front of one was about 10 minutes. I'm trying to see how long it'll take before it starts to bump me out of the way.
youtube · AI Harm Incident · 2025-02-02T20:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
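A coding result like the one above can be sanity-checked against the category sets for each dimension. A minimal sketch, assuming the allowed values are only those visible in this section (the real codebook may define more options):

```python
# Category sets inferred solely from values seen in this section's codings;
# the actual codebook may include additional options (loose validation only).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}


def invalid_fields(coding: dict) -> list:
    """Return the dimension names whose value falls outside the known sets."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]


# The coding result shown in the table above.
result = {"responsibility": "ai_itself", "reasoning": "unclear",
          "policy": "unclear", "emotion": "mixed"}
print(invalid_fields(result))  # []
```

An empty list means every dimension carries a recognized value; a non-empty list names the dimensions that need manual review.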
Raw LLM Response
```json
[
  {"id":"ytc_UgzGgqPJD0aSnTBfO1N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzb3EBK5OtMQv5_L-l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyrwVFwy18t0t9hRKZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzUKlw_KwU4wNxJlQh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwhULyBU8Lu6TnCtLJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxwuZbAzmXfkP_pRst4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzGZ0lT4ptSY-jgBCt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxAWgWkUXY6vaZvef14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzNf6ZOq7tl3lWGcW94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzRMAMofn8Z-DniLYZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
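The raw response is a JSON array of per-comment codings, so "look up by comment ID" amounts to parsing the array and indexing it by the `id` field. A minimal sketch, using the field names from the response above (the `index_codings` helper itself is hypothetical, not part of the tool):

```python
import json

# A small excerpt of the raw LLM response shown above: a JSON array of
# per-comment codings, each keyed by a YouTube comment ID.
raw_response = '''
[
  {"id": "ytc_UgxAWgWkUXY6vaZvef14AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwhULyBU8Lu6TnCtLJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
'''


def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}


codings = index_codings(raw_response)
coding = codings["ytc_UgxAWgWkUXY6vaZvef14AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself mixed
```

With the index in hand, inspecting the exact model output for any coded comment is a single dictionary lookup on its ID.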