Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
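The same lookup can be scripted against an export of the codings. A minimal sketch, assuming the results are stored one JSON object per line in a file named `coded_comments.jsonl` (the file name is hypothetical; the fields match the raw response format shown at the bottom of this page):

```python
import json

def lookup_coding(comment_id: str, path: str = "coded_comments.jsonl"):
    """Return the coding record for comment_id, or None if absent.

    Assumes one JSON object per line with the fields seen in the raw
    LLM responses below: id, responsibility, reasoning, policy, emotion.
    The file name is a placeholder for wherever codings are exported.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: full IDs look like the ones in the raw response below.
print(lookup_coding("ytc_UggFuDC5x01ktHgCoAEC"))
```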
Random samples — click to inspect
- "22:09 If there's one thing I can complain about the artists that oppose AI slop,…" (`ytc_UgxSaTjxF…`)
- "As soon as Dan O’Dowd appears in a self driving car video you can just call that…" (`ytc_UgyCSKU1-…`)
- "I sometimes use AI to create a baseline and then spend a good 3-6 hours manually…" (`ytc_UgzTF2UTs…`)
- "I worked on a self-driving car project. My manager believed that the self-drivin…" (`ytc_Ugyvj_Ecl…`)
- "Deep fakes will never go away. As long as we have freedom of expression and comp…" (`ytc_UgwzuXEON…`)
- "@thespirai- Why would someone who is untalented in drawing a picture use a pain…" (`ytr_UgxAQa253…`)
- "God is real Jesus is God and his spirit is in me and without this belief I would…" (`ytc_UgyywptlH…`)
- "Here we go AI over Humans is already starting not long before they have same rig…" (`ytc_UgwmrZeAx…`)
Comment
on giving robots negative "stimulus" to coerce them to work for economic profit:
wouldn't it be much more efficient, and thusly more cost effective, to make a robot that just does work, rather than one that must be convinced to work? besides, you said the only real reason we'd end up with AIs that feel unpleasant emotions is if we make an AI that was capable of making AIs more complex than itself, so why would we use this "inhumane" AI for our labor operations anyway?
Platform: youtube · Video: AI Moral Status · Posted: 2017-02-24T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
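For downstream analysis it helps to model a coding result as a typed record. A minimal Python sketch; the label sets below are inferred only from the codings visible on this page and may not cover the full codebook:

```python
from typing import Literal, TypedDict

# Label sets inferred from observed codings on this page;
# the actual codebook may define more values per dimension.
class CodingResult(TypedDict):
    id: str  # comment ID, e.g. "ytc_..." (comment) or "ytr_..." (reply)
    responsibility: Literal["none", "developer", "distributed"]
    reasoning: Literal["virtue", "unclear", "consequentialist",
                       "deontological", "contractualist"]
    policy: Literal["none", "regulate", "liability"]
    emotion: Literal["indifference", "fear", "approval", "mixed",
                     "resignation"]
```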
Raw LLM Response
[
{"id":"ytc_UghyXzu2XC_913gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgivGeenbgAVsHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj_4LAWchwUNHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjOZFi2KQgtF3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggnIwBEucuEIngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Uggf7zVJ7GJbHHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiC4plFAWxImHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"fear"},
{"id":"ytc_UggzbpDGUt7ibHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggFuDC5x01ktHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgjFdWWtlSXv_XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
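Before a batch response like this is stored, it is worth validating its shape and labels. A minimal sketch using the same inferred label sets as above; `parse_batch` is a hypothetical helper, not part of the pipeline shown here:

```python
import json

# Allowed labels per dimension, inferred from observed codings
# (not necessarily the full codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "distributed"},
    "reasoning": {"virtue", "unclear", "consequentialist",
                  "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "mixed",
                "resignation"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Raises ValueError on malformed JSON, missing IDs, or labels
    outside the allowed sets, so a bad batch can be retried
    instead of silently corrupting the dataset.
    """
    records = json.loads(raw)
    out: dict[str, dict] = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r}: {rec.get(dim)!r}")
        out[cid] = {dim: rec[dim] for dim in ALLOWED}
    return out
```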