Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
One thing is being overlooked. AI can't become an evil overlord or wipe out hum…
ytc_Ugxux-vlC…
8:47 - Noouw, that's how the rich has been and is herding society. The exact ric…
ytc_Ugw4IWPDn…
Now I used to think Siri is a joke (mainly because I think most things are a jok…
ytc_UgxibPKJJ…
but my gpt has no filter about any of those...Perplexity did, Claude did....I pa…
ytr_UgwPj7Sr4…
Thats demons bro... fallen angels or demons... Only they would be like. Let's h…
ytc_UgwqmdmU9…
I already lost 1 job due to automation. I doubt it will be the last. So im not g…
ytc_Ugy21kjbR…
AI voice being used to tell what was probably AI written about AI going horribly…
ytc_Ugwc410rr…
The hell are you on about? Tesla's self-driving is terrible and barely functions…
ytr_Ugy_P9M9i…
Comment
See it this way, you can take a robot with consciousness and feeling of "pain" and "torture" and everything and feelings, take its parts away, take the robot apart part by part, leave it like that for week, year, century, after that put it all together again and boom the robot works absolutely fine as long as the parts were well preserved, now take me - a normal human being, don't take me fully apart, just take my heart away or my brain away, for five minutes, five hours, five days... five minutes are enough even less and im fucking dead you cant put me together and i cannot suddenly un-die and i will be dead forever. We are evolutionary beings and robots are just machines, we can't be compared
youtube
AI Moral Status
2017-02-23T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ughl60i8UNuZ33gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UggsejZHpbOIVngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiV1E7PukZT4XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggKzW2ciku-zXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghweSfI0nYVWngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg8fO3M_unSGHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugj7RNLCjEBWvHgCoAEC","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh569BJcttDZngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggWTQfrxUnEB3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiDys-YlChdGngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}
]
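The raw response above is a JSON array of per-comment codes along the four dimensions in the coding table. A minimal sketch of parsing and validating such a batch; the allowed value sets below are assumptions inferred from the visible sample output, not a confirmed codebook:

```python
import json

# Allowed values per dimension, inferred from the sample responses above
# (hypothetical; adjust to the actual codebook).
ALLOWED = {
    "responsibility": {"developer", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with an id and
    on-schema values for every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"mixed","policy":"ban","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Validating each batch before ingestion makes it easy to spot responses where the model drifted off the schema (e.g. invented a new emotion label) and re-code only those comments.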