Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "You can still enjoy both types of art, it's not like AI is killing humans…" (ytr_UgyhZZdNw…)
- "Look, put a lil weed on it and you'll figure out that we'll be iight..check it, …" (ytc_Ugx7yy7Td…)
- "seems to me like AI is way more important than most of us. I want nothing to do…" (ytc_Ugxmj87Ht…)
- "Well they can blame the unions for this...lmao...unions are greedy....make it wh…" (ytc_UgylU0rkq…)
- "A good marker of consciousness is whether AI could ever interrupt/interject a pe…" (ytc_UgyXVUfWB…)
- "AI = computer = mechanical = man made. Well STOP IT! idiots. Someone made arse…" (ytc_UgytW87FH…)
- "Any computer program cannot be “forced” to answer what you want it to. The guy w…" (ytc_UgyWEp7-u…)
- "Why hire a shitty engineer, when you can have AI slop? Seriously, be a good engi…" (ytc_UgyBY6kzi…)
Comment

> Perception is vital. Maybe call it "driver assist" rather than AUTOPILOT. You can have whatever warnings you want, but if you call it autopilot instead of "a suite of helpful things", then people are going to think the car drives itself...esp since cars that actually drive themselves is currently a real thing that happens (have watched those YT videos about self-driving taxis in LA). But, of course, Tesla overpromising and underperforming has become a bit of a trend.

| Field | Value |
|---|---|
| Source | youtube |
| Event | AI Harm Incident |
| Posted | 2025-08-24T23:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwxsi4LvgOLWdmq6yV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxuB9_COvsOWOHi4pp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyG_zkgWvguMGRbJ6B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugz3VmvhqT0vcPO7cP14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzX3ii-B4sSn_qMUIl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwn-B4l3qa6rEWcfiV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy2VssClvjR7dHyRxl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugxq7hioxFAgQM_YvNB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyNF4bEL7sgJ9CghFF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw3JzdPuuGjoeO4Elh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
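A response in this shape can be parsed and sanity-checked before the codings are stored. Below is a minimal sketch; the four dimension names come from the coding table above, but the allowed label sets are only inferred from the values visible in this dump, so a real codebook may be larger:

```python
import json

# Allowed labels per dimension, inferred from the responses shown above
# (assumption: the actual codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    coding dimension carries one of the allowed labels.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Filtering rather than raising keeps one malformed record from discarding a whole batch; rejected records can be logged and re-queued for recoding separately.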