Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
overalll I understand the problem of the whole deepfake but overall this is so d…
ytc_UgxHl1o5x…
AI sucks AF in every way. I'm Not an Artist but programmer and i hate this stuff…
ytc_UgxI3mabX…
Dave, thank you for lending your voice to this issue. Sometimes, I think it is …
ytc_Ugx25xeB6…
@rj-wz7do There are multiple beasts.
"The second beast was given power to give …
ytr_UgwlCL8EH…
Okay, call me an asshole, a sexist, mysogynist, whatever, but i this rhat that r…
ytc_UgyAhSs8f…
You do realize, right, that that would be as easy to include in a deepfake as th…
ytr_UgxoyQapb…
I'm starting as an artist but I'm also a computer science major so I'm in betwee…
ytc_UgyhMoJj9…
We do not have years anymore, anyone who has talked to AI these days knows…
ytc_UgwO8XDRw…
Comment
you have to be really nuts to let the car drive, while you inside.
This is wrong anyway. We cannot not let a "program" drive a car, but we can let a robot that can trully think by itself and not coded (maybe wait another 200 years or more) to make the decisions for driving. We can teach a robot that has a brain, to drive a car. Totally different concept.
But today for the moment, AI means Advanced Intelligence, it is all coded to do something, it is not real thinking.
Until a machine can really think lets just don't let a computer drive your car. Putting people live's in danger
youtube
2025-10-03T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyAstHA0eb-zouY8S54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyZhUjQH5B00MQKry94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5VbCXkEHIBH7BBZd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzV7QdlcBx9xeTOMBJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgytPC62WmYkdXa70U54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRNofsIkixypxMtCV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxMoVONYjQzSUuaF8J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz_7zJaljXYZd9b_3t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy3cSutW1MGy8aeCiN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzpuGiREhmcfmMLUQh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
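The raw response is a JSON array with one object per coded comment, keyed by comment ID and carrying the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed, indexed for lookup by comment ID, and tallied — field names are taken from the sample above; the two rows embedded here are copied from it, and nothing about the full label vocabulary is assumed beyond the values visible on this page:

```python
import json
from collections import Counter

# Two rows copied verbatim from the raw batch response above
# (the real response contains ten such objects).
raw = """
[
  {"id":"ytc_UgyAstHA0eb-zouY8S54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzpuGiREhmcfmMLUQh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

rows = json.loads(raw)

# Index by comment ID so any coded comment can be looked up directly,
# mirroring the "look up by comment ID" view of this page.
by_id = {row["id"]: row for row in rows}
print(by_id["ytc_UgyAstHA0eb-zouY8S54AaABAg"]["emotion"])  # outrage

# Tally each coding dimension across the batch.
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(dim, Counter(row[dim] for row in rows))
```

Indexing by ID also makes it easy to join the coded labels back onto the original comment records before aggregating.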