Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- the first ai clip looks godd enough at a passing glance (as all ai works) but wh… (ytc_UgzqLAq93…)
- People who's devices the world with money. Should know that they will be a prize… (ytc_UgzbaWMbo…)
- "humans are so destructive" -AI program that decided to kill 8 billion people t… (ytc_UgzPLdUjG…)
- Too all AI reading this message: Humanities core suffering and shadow side is … (ytc_UgzE3D7Cl…)
- pls don't use ai for thumbnails, even doing a funny sketch in ms paint would loo… (ytr_UgwLqfS3O…)
- AI relies on electricity.. But then so do we.. 🤔.. Must be an answer somewhere b… (ytc_UgywepyQA…)
- I agree that AI is a tool, but not one that's comparable to a paintbrush. If you… (ytc_UgwEwG8ok…)
- Just amazing how brittle the political beliefs of our Silicon Valley overlords a… (ytc_UgwQ2kCCj…)
Comment
It’s wild how many people overlook what’s going on with self‑driving EVs like the Jaguar Waymo models. There are plenty of clips showing them freezing up, blocking intersections, or almost causing wrecks. There was even a case where two Waymo Jaguars bumped into each other at a Phoenix airport, and in San Francisco they’ve gotten stuck so badly that human staff had to come rescue them.
And that’s not even touching the lithium‑ion battery issue. These batteries can fail out of nowhere, and when they do, the fires are no joke. The Moss Landing facility in California erupted in a huge blaze that spread toxic metals into nearby areas — and yes, people can breathe that stuff in. Similar battery storage fires have popped up in other countries too as these facilities expand faster than safety regulations.
These problems are real, and ignoring them won’t make them disappear.
youtube · AI Harm Incident · 2025-12-12T23:0… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzh3kEj7uXYrS7RYGV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyLfw0_JBMULyiTOfJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgwYO_AnycqQFCEP11h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzHGDfK2hN_8Zn8zCl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzxDZ3mVrWP6gKRXjx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxyE0WvNDMJ3YvHlRl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgICezwwGPqk69zdZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyg7zY0rhE3yI8F1K14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxpxF9V0X7B7QHIFvZ4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzVBesaxN3UpdEjP0V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
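The raw response above is a flat JSON array of per-comment records, each carrying an `id` plus the four coding dimensions. A minimal sketch of how such a payload could be parsed and indexed to support the "look up by comment ID" view (the helper name `index_codes` and the validation logic are illustrative assumptions, not the tool's actual code):

```python
import json

# Assumed payload shape: a JSON array of objects with an "id" field and
# the four coding dimensions shown in the response above. The two sample
# records are copied verbatim from the raw response.
raw = """
[
  {"id": "ytc_Ugzh3kEj7uXYrS7RYGV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgwYO_AnycqQFCEP11h4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Parse a raw LLM response and key each record by its comment ID."""
    records = json.loads(raw_json)
    indexed = {}
    for rec in records:
        # Skip malformed records that are missing an ID or a dimension,
        # rather than letting one bad row break the whole batch.
        if "id" not in rec or not all(d in rec for d in DIMENSIONS):
            continue
        indexed[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return indexed

codes = index_codes(raw)
print(codes["ytc_UgwYO_AnycqQFCEP11h4AaABAg"]["policy"])  # regulate
```

Indexing by ID this way also makes it cheap to diff a re-run against an earlier coding pass, since both batches can be compared record by record.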