Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The companies that laid off employees replaced by AI must pay the price the most…" (ytc_UgxbUtdN7…)
- "Ngl I’d absolutely blame ai+the way ai is viewed as the main causes / Idk the ch…" (ytc_UgyW5hfPE…)
- "So I know this is probably gonna sound stupid but I'd be curious to hear any of …" (ytc_Ugxrw3g8W…)
- "In the EU region, Europe, copyright laws were updated a few years ago to take in…" (ytc_Ugz2iy-CE…)
- "Been there a few months ago. Great song. Tried to research about the artist. Zer…" (ytc_UgyBxqIyW…)
- "Artists should have the RIGHT to SUE any company or person for stealing their ar…" (ytc_Ugy03RxR9…)
- "and jobs that it won't? You didn't mention anything about jobs that are AI proof…" (ytc_UgwKO3MDX…)
- "@whynotcode how you can say that and then glaze AI in the same breath is beyond…" (ytr_UgzrHCnSi…)
Comment
> Thinking about it robot driven cars that drive themselves would be much better. They would follow the rules of the road and communicate with each other via a set of commands on how to travel. I could imagine we could probably make an AI right now that can react to weather conditions. The have better reaction times, you can use different sensors like infared to see deer or hard to see animals. It would improve things a lot. The driver cannot get angry, do something stupid, be distracted, fall asleep, have a seizure, read, text, or do other things that cause humans to ram into each other while driving. I would imagine when this happens people are going to flip out because you won't be able to have human drivers on the road because they are a variable the computer will not understand.
youtube · AI Harm Incident · 2014-05-26T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
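The table above is one record from the raw LLM response rendered per dimension. A minimal sketch of that rendering step, assuming each coded record is a flat dict with the four dimension keys seen in the JSON below (the function name `coding_result_table` is hypothetical):

```python
def coding_result_table(rec: dict, coded_at: str) -> str:
    """Render one coded record as the markdown 'Coding Result' table."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

rec = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "approval"}
table = coding_result_table(rec, "2026-04-27T06:26:44.938723")
print(table)  # reproduces the table above
```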
Raw LLM Response
```json
[{"id":"ytc_UggClE0QGTufbHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghVv_MI-gLHDHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjsxqmpUtf7YXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg28UmkRD2tpngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugie-lUnu0GZ63gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggrRnRLKHDtQngCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Uggs_LmUfAdFeXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjxPGfWudcBB3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghiDZg6vHR7nngCoAEC","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgjmBLzPv7AehXgCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}]
```
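The "look up by comment ID" view above implies parsing this raw array and indexing records by `id`. A minimal sketch, assuming the response is a JSON array of flat records as shown; the `ALLOWED` value sets below only cover values observed in this response, and the full codebook may define more (an assumption, flagged in the comment):

```python
import json

# Value sets observed in this raw response; the project's codebook
# may permit additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "government", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "mixed", "fear", "outrage"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse the raw LLM JSON array, validate each dimension, index by ID."""
    records = json.loads(raw_response)
    indexed = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed

raw = ('[{"id":"ytc_UggClE0QGTufbHgCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
coded = index_by_id(raw)
print(coded["ytc_UggClE0QGTufbHgCoAEC"]["emotion"])  # approval
```

Validating against the codebook at parse time catches the common failure mode of the model inventing an off-codebook label before it reaches the analysis.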