Raw LLM Responses
Inspect the exact model output for any coded comment; individual codings can be looked up by comment ID.
Random samples

- ytc_UgwGGLrw9…: Anyone else just prefers a peaceful life in the nature and have nothing to do wi…
- ytc_UgzOmRCfw…: 6:14 so he’s lazy? Which’s fine, but like, just say you’re lazy instead of this …
- ytc_Ugwwpqpha…: 100% human is too human imo When I write essays without ai I get around 60-70% a…
- ytc_UgzuCxKOQ…: Bruv i can say just one thing "It's not that deep bro" Just the notion that pe…
- rdc_ls5jnec: Back in the days when we were just taking first few steps with interwebs, there …
- ytc_Ugw9mLuvI…: I agree that taking AI too far is a problem but I also see the benefits of it. …
- ytr_UgyOR65Oa…: I think they call this a Hegelian manipulation, invent a problem and some white …
- ytr_UgyV3fOZN…: @me-ry9ee Machines used by humans. It's a tool, just enhanced. Humans created AI…
Comment
As much as I'd like to dump on Elon/Tesla and watch them sink - there's no excuse for being a negligent driver. Even if the car has all these safety guards, I wouldn't trust automatic measures to kick in. People, if they start buying these cars with automatic safety features, auto-driving, etc... should treat it like a normal, dumb car if they still have the ability to drive it like one.
youtube · AI Harm Incident · 2025-08-17T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
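A coding like the one in the table above can be held in a small record and sanity-checked per dimension. This is a minimal sketch; the `Coding` class is hypothetical, and the vocabularies are inferred only from values visible in this sample, so the real codebook may include more options:

```python
from dataclasses import dataclass

# Allowed values observed in this sample's outputs; the actual codebook may differ.
VOCAB = {
    "responsibility": {"user", "company", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "mixed"},
}

@dataclass
class Coding:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        """Return True only if every dimension uses a known vocabulary value."""
        return all(getattr(self, dim) in values for dim, values in VOCAB.items())

# The coding shown in the table above passes the check.
coding = Coding("user", "deontological", "none", "approval")
assert coding.validate()
```

Validation of this kind is useful because a model can emit free text for any field; out-of-vocabulary values can then be flagged for re-coding.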
Raw LLM Response
```json
[
{"id":"ytc_UgzjtBdUVjZgpSy_v2l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwgNQLVX74uS1jDZHd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyxKOcDAQp4OAM9X7V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-dIWP-MlEfbAqoTN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzQ4tgfI_9OnjgpAe94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz9KGlBucQFcBOpFAF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzDT_8PiZib2aHaNNF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyUnfD-Ro_YdEFMdO14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwLZxR4Zcr4IU-JOmx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzrdU1ytm2jxUM3cQZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
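The raw response is a JSON array with one object per coded comment, so looking up a coding by comment ID is a single parse-and-index step. A minimal sketch, assuming the response text is available as a string (the `RAW_RESPONSE` literal below is truncated to two entries from the sample above; `index_codings` is a hypothetical helper):

```python
import json

# Two entries copied from the raw model response shown above.
RAW_RESPONSE = """
[
 {"id":"ytc_UgzjtBdUVjZgpSy_v2l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzQ4tgfI_9OnjgpAe94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index each coding by its comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgzQ4tgfI_9OnjgpAe94AaABAg"]["emotion"])  # -> approval
```

In practice the parse can fail if the model wraps the array in prose or emits malformed JSON, so a production version would catch `json.JSONDecodeError` and queue the batch for retry.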