Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_Ugxdq6__a…` — "Which customers will buy your products and services once they have mostly been r…"
- `rdc_k9j2beb` — "Do yall think DeNiro cant go ahead and set that up himself? They didnt stop the …"
- `ytc_UgyBcnc9q…` — "5:00 - “So, what remains?” HUSTLE - that’s what remains. People who work hard w…"
- `ytc_UgxBFHv6g…` — "Isn't it more likely that AGI would be a service that could be purchased. For ex…"
- `ytc_Ugz2lhVRw…` — "I dont know if it is just me but looking at ai generated content gives me massiv…"
- `rdc_lzaudj1` — "I’m assuming it would greater impact men since the target demographic probably s…"
- `ytc_UgwQRokMN…` — "So, what you’re saying is this AI is like the AI on Person of Interest in its ab…"
- `ytc_UgwC2MKvr…` — "We already have an example of a technology which is ruining people's lives: vide…"
Comment (source: youtube · video: AI Moral Status · posted: 2017-03-01T23:0…)

> this episode dsnt make sense. robots only feel emotions if they are programmed to do it. so the trick would obviously be programming a robot to be extremely happy when working and really sad when not productive (different programmings could change from robot to robot obviously). in a long future, everything would be done by self motivated robots. they would seek their own happiness working for us and by doing so our comfort would be total with no need to do anything at all, just enjoying life free
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugg9Dqny3LoDQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugjl892grkD1CHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjusG2XXNsQ8ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjdXJQpASsKnXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UggFqHDoWRfrsXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggRQk_shtKMS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgiU0CbkUs7EXngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ughae_Q7RxIYQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Uggczad5RakHtngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgilhY784SZqgHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
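The response above is a JSON array with one object per comment ID and one label per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and validated in Python; note that the allowed value sets below are inferred only from the labels visible in this response, not from an official codebook:

```python
import json

# Allowed labels per coding dimension. These sets are inferred from the
# values visible in this sample response; a real codebook may define more.
SCHEMA = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "fear", "resignation"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the rows by comment id.

    Raises ValueError on missing ids, missing dimensions, or labels
    outside the known value sets.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row}")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# One row copied from the response above.
raw = ('[{"id":"ytc_UgjusG2XXNsQ8ngCoAEC","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"approval"}]')
print(validate_batch(raw)["ytc_UgjusG2XXNsQ8ngCoAEC"]["policy"])  # industry_self
```

Indexing by ID this way also makes it easy to detect rows the model dropped or duplicated relative to the batch of comments that was sent.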