Raw LLM Responses
Inspect the exact model output for any coded comment.
Output can be looked up by comment ID; below is a set of random samples.
- `ytc_UgypvY4yk…`: I saw we stop with AI and just stay with robots I don't think we fully need AI…
- `ytr_Ugy9kU7Sh…`: I'm with you on most of that, but converting to a "worker-owned economy" is stil…
- `ytc_UgyTZyttw…`: AI people are idiots for inventing and pushing this. Money is the root of all ev…
- `ytc_Ugym9YohU…`: Can’t blame her. Tbh with all the concurrency on the field of art you feel impor…
- `ytr_UgyouV4NI…`: @itcouldbelupus2842 Idk about AI art, but saying AI steals from real artists is …
- `ytc_UgwyuVY-r…`: Modern art is the art of doing as little effort as possible to exploit the rich.…
- `ytc_UgwOdbmNW…`: I think, generally, the art aesthetic will enable artists to create more, as is …
- `ytc_UgwY3N-4W…`: If A.i are given rights then everything deserves rights. Either everything has r…
Comment
> I'm a manufacturing automated technical engineer in robotics and lasers,,programing AI isn't smarter than a person !it's not paranormal !
> the programing of the person that wrote the instructions is what it does, if it does evil , then it was programed to do that in that event or chat . period ! If it does do evil things then it's the programer ,! It won't become conscience it will only do what the program told it to do unless the devil possessed it so that's the only way but i the devil could possess a self driving car or drone ect. .. but the programer is the one who should be procecuted and if it gets hacked then again the programers didn't put out a safe enough product therefore procecuted
youtube · AI Governance · 2025-12-30T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
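A coded record like the one in the table above can be represented as a small typed structure. This is a hedged sketch: the four dimensions and their example values come from the visible output, but the full label sets are assumptions inferred from the responses shown here, not the pipeline's actual codebook.

```python
from dataclasses import dataclass

# Label sets inferred from the visible coding results; the real codebook
# may contain additional values (assumption, not confirmed by the source).
RESPONSIBILITY = {"developer", "company", "government", "user", "none", "unclear"}
REASONING = {"deontological", "consequentialist", "virtue", "unclear"}


@dataclass
class CodedComment:
    """One comment's coding result, mirroring the dimensions in the table."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO-8601 timestamp, as shown in "Coded at"

    def __post_init__(self):
        # Validate only the dimensions whose label sets we can infer.
        if self.responsibility not in RESPONSIBILITY:
            raise ValueError(f"unknown responsibility: {self.responsibility}")
        if self.reasoning not in REASONING:
            raise ValueError(f"unknown reasoning: {self.reasoning}")
```

Validation at construction time means a typo in a label surfaces immediately rather than silently skewing downstream tallies.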
Raw LLM Response
```json
[
{"id":"ytc_UgyltDVXB7yD7fD6ESd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxHXsFLDQSHgb5O4HN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEhbW3lP-PEuteNPN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxkRh6updi3B5sLXIN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwlJh47ajiZu65Tel14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgziIzic_is2D4zvnyx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwwIa_SiXDWrlDt7H54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyeDVQS48mJtRqIMmB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugw4hnUDxnDmBR1RDQd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwk1JncAmVy_q2Y3XR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
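A raw batch response like the one above can be parsed and sanity-checked before its records enter the database. This is a minimal sketch, assuming the model returns a JSON array of flat objects with the five keys visible in the output; `parse_batch` and `REQUIRED_KEYS` are illustrative names, not part of the actual pipeline.

```python
import json

# Required keys taken from the visible response; the real schema used by
# the coding pipeline is an assumption.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_batch(raw: str) -> list[dict]:
    """Parse the model's JSON array and verify every record is complete."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of records")
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
    return records


# Usage with a one-record sample shaped like the response above:
raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"indifference"}]')
print(len(parse_batch(raw)))  # → 1
```

Failing loudly on a malformed or truncated response is usually preferable to storing partial codings, since a missing dimension is indistinguishable from an "unclear" label once it is in the table.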