Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "No wonder the stock market is worried about the AI bubble. AI is doing blatantly…" (`ytc_Ugzw6Wg_J…`)
- "I asked ChatGPT for some react script , just a simple component scaffolding, an…" (`ytc_UgzGspZaX…`)
- "I appreciate your honesty! If you have specific feedback about what didn't reson…" (`ytr_UgxR_oG6j…`)
- "Homie basically said Ai is doing a Kage bushin no jitsu, and then training and g…" (`ytc_UgxD75ctu…`)
- "Lmao. AI slip talking about AI slop. I’ve replaced 80 of my workforce with AI, i…" (`ytc_Ugz-oEEwf…`)
- "You don’t need money………you need an exchange system……we just use money as that sy…" (`ytc_Ugzo2wkta…`)
- "I say we use Isaac Asimov's three laws of robotics for all AI moving forward, ma…" (`ytc_Ugyyn5xWQ…`)
- "at this point, stick figures are more creative than AI art / at least someone drew…" (`ytc_UgyFwE71j…`)
Comment
I agree that the technology isn't perfect and that it shouldn't be promoted as "autopilot". However, focusing only on its failures is also unfair. How many times has it prevented an accident that would've occurred with a human at the controls? Seatbelts and airbags routinely cause injuries and have killed in the past, yet we still keep them. Why? Because, overall, they improve survivability in a crash. If self-driving cars prevent more deaths and injuries than they cause/allow, then they are a net benefit.
youtube · AI Harm Incident · 2024-12-22T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyFO-9eC5zlG_hNGah4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy6yOKkwiyREQn9Tcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgweJGnMI6_EBDZwdvp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzr6JIuyym3nDLWXFF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXXQuF4QbXVXAx_Ah4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwmE1kGedyjXd6UD0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-2IfQ7P4otztnO454AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzoyb7wAsOqXml2Ffx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZCHcUzzyd5UhZgIB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy7-6-aXZv1BjShoNV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
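The raw response above is a JSON array of records, one per coded comment, each carrying an `id` plus the four coding dimensions shown in the table. A minimal Python sketch for parsing and validating such a batch follows; it assumes the value sets observed in this single response (e.g. `responsibility` in `none`/`company`/`user`/`developer`) are the whole codebook, which the real coding scheme may extend.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: these sets are
# reconstructed from the one response shown above; the actual
# codebook may define additional values.
CODEBOOK = {
    "responsibility": {"none", "company", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"approval", "outrage", "indifference", "resignation"},
}

def validate_coded_comments(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook.

    Raises ValueError on a missing id or an out-of-codebook value,
    so malformed model output fails loudly instead of being stored.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(len(validate_coded_comments(raw)))  # prints 1
```

Validating before storage is the main design point: because the model returns free-form text that merely *should* be JSON, a strict parse-and-check step catches truncated arrays or invented code values before they pollute the coded dataset.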