Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I kinda like how the first wave of self driving cars will take out a whole strat…" (ytc_UgxVvb3Fk…)
- "I’m laughing so hard right now. The nightmare scenario is being played out live…" (ytc_UgyLQ7XS1…)
- "N I still would take ur drawings with imperfections anyday over perfection with …" (ytr_UgzoWRhFy…)
- "Sorry to hear that hope he gets better tell him to get a car instead as insuranc…" (ytc_UgyRzt1Bx…)
- "Driver less vehicles will only work if every vehicle is driverless. Mixing them …" (ytc_UgyW3hFdr…)
- "I understand your concerns! The conversation in the video highlights that while …" (ytr_Ugy1yNPzI…)
- "Autonomous technology does nothing but cripple human capability and is a crutch.…" (ytc_UgzR1VG-2…)
- "She didn’t make them they were deepfakes. Someone else made them and profited of…" (ytr_UgzkrbwW6…)
Comment
We need ai to become intelligent enough to be able to quantity feelings and emotions, even if it can't feel, as well as the relative importance of life.
If this video is correct, then there is already a level of super intelligence, fairly disperate across the Internet that hasn't accounted for the importance of life, nor the unique existence of us humans, who gave it 'life' in the first place!
If this kind of ai could understand how the actions of their already super rich investors were causing awful living situations for a LOT of people, it'd redistribute wealth and provisions accordingly and efficiently.
But the chances are that the super rich could shut down ai if they needed to, to protect themselves.
youtube
AI Harm Incident
2025-10-29T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgycJis-F17FJnjTaVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzBL4dp7PVnstzxjQl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyGQphMxJTMDp8EPfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy0FaL38guRaYaLDq14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwZO0fSAhzjgtwNXS14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx6obTgUiVmSVBnJUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy4KvkYuEKd2m_R6pN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzo9WSy9Ow6KI1lWp94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyXz8uigJDrfb4mVXh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxU1u1tujio7_6KePR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
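The raw LLM response is a JSON array of per-comment codings, so a comment-ID lookup (as described at the top of this page) reduces to parsing the array and indexing it by `id`. A minimal Python sketch, assuming only the response format shown above; the field names and the one comment ID are taken from the response, but the lookup code itself is an illustrative assumption, not this tool's actual implementation:

```python
import json

# Assumed shape of one raw LLM response: a JSON array of objects,
# each with an "id" plus the four coded dimensions. The single entry
# here is copied from the response shown above.
raw_response = """
[
  {"id": "ytc_Ugy0FaL38guRaYaLDq14AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "fear"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a specific comment ID.
coding = codings["ytc_Ugy0FaL38guRaYaLDq14AaABAg"]
print(coding["emotion"])  # → fear
```

This matches the "Coding Result" table above: the entry for `ytc_Ugy0FaL38…` carries distributed responsibility, consequentialist reasoning, unclear policy, and fear.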