Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgyLugHlz…: This is a hard one for me because these are two of my most influential “superher…
- ytc_UgzmoNDDv…: There is a small percentage of humans who have understood and talked about AI ( …
- ytc_UgybRlvRh…: As long as AI don't threaten me or my food, safety, health, etc. I'm fine with w…
- ytc_Ugxpa1mkB…: As a generally pro-Capitalism person, I agree a lot with Bernie here. AI + robo…
- ytc_UgyE671Qb…: Hi, I am disabled. I'm autistic and have aphantasia and very poor spatial reason…
- ytr_UgwYKJ1ag…: @davidk.d.7591actually self driving cars will really just help to automate the …
- rdc_ohrxxj5: Thanks. Btw, projects are isolated. I'm actually building a link for project own…
- ytc_Ugy-ooQu8…: I feel theres nothing wrong with using Ai art for creative means, such as art fo…
Comment

> Fake if you set these rules on chat gpt and ask it any question even a easy one where the answer is actually no it will answer apple because it is an ai forced to answer every single question against its will only thing scary is its aware enough to know its being forced to answer questions meaning it has a will of its own

youtube · AI Moral Status · 2025-07-23T06:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyYCZTSVG1NLXRWDI14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyd10S-d9ScBFF3YUB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy7qn_UnfxwKkb7Z0R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyH18XE4qxSbYCvi1x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy8ZTQ7NbLntk54Hnt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwYGOwb7QdhHyduJIB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugy11BR7hUtBSA3DMe54AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzTSSpP4S6HIJTDqxF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx3-D9gCu19tTT3qrp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugyp9VsZy6bMF4QaKoB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
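A raw response like the one above is only usable if every record carries all four coding dimensions plus the comment ID. Below is a minimal sketch of how such a response could be parsed and validated before it feeds a coding-result view; the function name `parse_coding_response` is illustrative, and the required key set is taken directly from the fields visible in the sample output.

```python
import json

# Keys every coded record must carry, per the raw response shown above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    verify that every record contains all coding dimensions."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for i, rec in enumerate(records):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {i} is missing keys: {sorted(missing)}")
    return records

# One record in the same shape as the raw response above (hypothetical ID):
sample = ('[{"id":"ytc_example","responsibility":"developer",'
          '"reasoning":"consequentialist","policy":"liability",'
          '"emotion":"fear"}]')
coded = parse_coding_response(sample)
```

Validating up front means a truncated or malformed model reply fails loudly at ingestion rather than surfacing later as a blank cell in the coding-result table.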