Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If you care about conscious beings feeling pain you should avoid creating them.
I'm conscious, I think, I suffer, I see the suffering of others, and I need drugs to be "normal," to feel good, and to have conversations that both parties enjoy.
I hate life; drugs make me love it. Drugs also make me suffer.
Either way the result is suffering and pain for me and others, and I'm powerless over it.
If you have a choice to make an AI conscious or not, you should choose not to, because making it conscious will result in suffering for itself or others.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugiebb8m3_QKtHgCoAEC","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghMInwGG2smj3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggQQ-HppeJVLHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjeSVyD9oCIeXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj50a9w3EHZ7ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ughcmty2iMMsFHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggkPXUXjIZmTHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugi0w_Bes2bxCHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugjb8u5FsyTpDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugj_FnOm6ZQGQXgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
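The raw response above is a JSON array of per-comment codes, one object per comment ID with four coding dimensions. A minimal sketch of how a pipeline might parse and validate such a batch; the value sets below are only those observed in this sample, and the full codebook likely allows more (an assumption, not confirmed here):

```python
import json

# Allowed values per dimension, as observed in this sample batch.
# The actual codebook may define additional values (assumption).
SCHEMA = {
    "responsibility": {"government", "none", "distributed", "unclear"},
    "reasoning": {"contractualist", "deontological", "consequentialist",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "mixed", "fear", "indifference",
                "outrage", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records that have an id
    and a recognized value for every coding dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # unattributable record: cannot be joined to a comment
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Two records copied from the response above.
raw = '''[
 {"id":"ytc_Ugiebb8m3_QKtHgCoAEC","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugjb8u5FsyTpDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''
batch = parse_batch(raw)
print(len(batch))  # both records conform to the observed schema
```

Validating against a closed value set like this catches the most common failure mode of structured LLM output, a near-miss label (e.g. "deontology" for "deontological"), before it silently enters the coded dataset.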