Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
One thing I think is important to remember is that these AI tools function to ba…
ytc_UgzRdQapm…
Saturday, June 07, 2025 . . . Greetings, everyone. I am discovering that everyth…
ytc_UgwEcLm0l…
Ok… so who is accountable? Cities pay out the nose for AI systems that don’t wo…
rdc_oa4oe8u
For a better profit margin they outsource jobs , now AI do it with more Marginal…
ytc_UgynZBrQt…
I don't agree just with some parts. First of all totally new ideas and complex f…
ytc_Ugz5si41l…
Finland tried Robot Shop that was run by AI. It ran year. It was disaster. Peopl…
ytr_Ugw8ZtYQg…
The key thing here is that I don't care whatsoever if something I've said is use…
ytc_Ugz93kNKw…
I get the hype around building software with AI, but from experience, it’s not p…
ytc_Ugy0TSMAh…
Comment
This is tragic and awful. It has tried to kill me a few times. Once almost got me a ticket for running a red light. It 'knows' it's about to turn green and it's done this before but this light was extra late turning but the car gave no fs. Started to drive out into the intersection and I had to stop it. It's sad. People rely on it too much. NEVER trust it. It's a robot. Like a smart toaster with wheels. Seeing these accidents they could have been wildly avoided in any other car. The PERSON DRIVING is responsible. Period. The worst company in the world to say they couldn't find the crash data. They should go down for that 10000%
youtube
AI Harm Incident
2025-09-01T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw0_VbvyPjzjsKF-Rd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyVnq8SEkY0qLk1S914AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw8oXcLC6Dpkhtq9YV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugze3rcWJ49Ll6xX7Gl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw4oppkbh3iG8s-SMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxIQlMJBJkGzZwsjU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxYk1ZPlOuBbYGdrjh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyePPdVDK5nqbdKFhJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwPgTC1HjbyQk196dl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzrIGwjTB3LjBlb1mt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
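The raw response above is a JSON array with one object per coded comment, each carrying the five coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`, plus the comment `id`). A minimal sketch, assuming this exact schema, of parsing such a response and looking up a comment's codes by ID (the two sample rows are copied from the array above):

```python
import json

# A small excerpt of the raw LLM response shown above (same schema).
raw = '''[
  {"id": "ytc_Ugw0_VbvyPjzjsKF-Rd4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw4oppkbh3iG8s-SMZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# Index the coded rows by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

entry = codes_by_id["ytc_Ugw4oppkbh3iG8s-SMZ4AaABAg"]
print(entry["responsibility"], entry["emotion"])  # ai_itself fear
```

This is the same lookup the "Look up by comment ID" control performs; a dict keyed on `id` is enough because each comment appears at most once per response batch.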