Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "if mr. musk chose copilot instead of autopilot then we wouldn't be having to dea…" (ytc_UgxuYlWiA…)
- "If you wanna poison ai, just create more ai art. The ai's will probably start u…" (ytc_Ugwhn71NX…)
- "13:27 it could be awful. Exactly why we should stop because no good can come of …" (ytc_Ugzs3Za8z…)
- "In a world where AI is worth trillions of $ to corporations AI safety is not a p…" (ytc_UgwwQ2nx8…)
- "Very interesting and somewhat alarming. One thing I disagree with is how Hinton …" (ytc_UgwnI4QSk…)
- "Can AI help humans understand differences in cultures & perceptions around the w…" (ytc_UgzqYZ5ZD…)
- "Sam is basically saying, AI will be replacing most of the jobs, driving the econ…" (ytc_UgzUOTzHm…)
- "As a nurse, I'm curious how I will be replaced with AI. I could see maybe the on…" (ytc_UgyDzkmMw…)
Comment
Those machines were used to make work easier, not to replace humans and turn all art and expression into soulless, generated slop.
You want humans to slave away doing work while "AI" replaces the pleasure of creation and takes way our reason for living.
Source: YouTube · Video: "Viral AI Reaction" · Posted: 2024-10-25T17:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytr_Ugx_CL99LdAHOq8nszB4AaABAg.AA0V8nd8fT9AA11Any6826", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_UgwMvXuIW3J5vZmUhkR4AaABAg.AA0KAxbe0JBAAPUsbh3nFs", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgzJxOlBT7FmvROkYWV4AaABAg.AA-rq-b0lOwAA0OJCwAmL_", "responsibility": "none", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytr_UgxLShNGtYewRbgUAxF4AaABAg.A9zxk33MxMfAA-1aKYZH5_", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzR0kNTLzQy3DACb7J4AaABAg.A9zszaxb9T3AA-01WyeXY1", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxJAk7Lhmb38B_aMCl4AaABAg.A9zsBf3eiqSAA-0AvPQsub", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_UgxJAk7Lhmb38B_aMCl4AaABAg.A9zsBf3eiqSAA-JWkoqxPI", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_UgzsZFypfFW2DsPFJFJ4AaABAg.A9zr7LFuZLPAA73pfDNAZM", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_UgzsZFypfFW2DsPFJFJ4AaABAg.A9zr7LFuZLPAA7zAaVORL7", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzsZFypfFW2DsPFJFJ4AaABAg.A9zr7LFuZLPAA8djRF8nmv", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
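A raw response like the one above has to be parsed and checked before the codes can be trusted: each record needs an `id` plus a valid value for every dimension. The sketch below validates such a response in Python. The allowed category sets are inferred only from the sample output shown here; the actual codebook may define additional categories, so treat `SCHEMA` as an assumption.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may include categories not seen in this sample.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "developer", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"ban", "liability", "industry_self", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that carry an id
    and a recognized value for every coding dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop malformed entries rather than failing the batch
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(len(validate_response(raw)))  # 1 valid record
```

Dropping malformed records instead of raising keeps one bad line from discarding a whole batch; the rejected records can be queued for re-coding instead.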