Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- @gabemissouri You mean, architects, engineers and fashon designers? AI isn't go… (ytr_UgxfBxo_J…)
- AI wont have power if nobody use it. If ever people use it then dont patronize t… (ytc_UgxwXjhmv…)
- InsideAi, hello. You said that Ai doesn't crave family, having children, gettin… (ytc_UgxkZPHKZ…)
- Since Chat GPT4o, I've seen several such exchanges and what I got from all of th… (ytc_UgwCbKt2N…)
- Ai will never be built. LLM is not AI. Mind needs evolution, mind can not be cre… (ytc_Ugw02WqV_…)
- It's just upsetting, because if used ethically AI could've been a good tool to h… (ytc_UgyB3pK11…)
- Well if the us dont make ai first then china or russia will,so then the us will … (ytr_Ugw-VS6i8…)
- I would have thought that autonomous driving wouldn't get rid of drivers. It wi… (ytc_UgywlWSnY…)
Comment

> Humans just want someone to talk to. You can make your chats private and not save them. Yes we train AI with our thoughts and ideas but give it the right things and we can train it for good. I think it's ok to use other peoples lives as inspiration.

youtube · AI Moral Status · 2025-06-28T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugy_0Yiq6puBrmX4coB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxh-Bm_YymwWxTI3m54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwqh0rrHhR-qUOSKb14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwtuoLDp_XvJwS16Dx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwheWs0t7QAzBoyBGd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzSC9aKF72CP5bkUXJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzu3Q6gf85weFslm7t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxXTXb7TiPflpK5yw14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy5l8GTGo10PHq-KS94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy5QNFi_Jqybf2iU5V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
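The raw response above is a JSON array of per-comment codings, one object per comment ID, with the four dimensions from the coding table. A minimal sketch of the "look up by comment ID" step: parse the batch, validate each record against the category vocabularies, and index by ID. The `VOCAB` sets are inferred only from the values visible in this response; the full codebook may include more categories, and `index_codings` is a hypothetical helper, not part of the tool shown.

```python
import json

# Allowed values per dimension, inferred from the coded output shown
# above (assumption: the real codebook may define additional categories).
VOCAB = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "industry_self", "regulate"},
    "emotion": {"approval", "indifference", "outrage", "fear", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    rejecting any value outside the known vocabulary."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in VOCAB.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in VOCAB}
    return by_id

# The record matching the coding table above (user / virtue /
# industry_self / approval), used here as sample input.
raw = ('[{"id":"ytc_UgzSC9aKF72CP5bkUXJ4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"industry_self","emotion":"approval"}]')
codings = index_codings(raw)
print(codings["ytc_UgzSC9aKF72CP5bkUXJ4AaABAg"]["reasoning"])  # virtue
```

Validating against a closed vocabulary at parse time catches the most common LLM batch-coding failure, an out-of-schema label, before it silently enters the dataset.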