Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "To me, if I see an art and it was made by a person, I would want to know how did…" (ytc_UgyNX5DjT…)
- "It’s too late, people can’t save us from AI we created this problem to where we …" (ytc_UgwZTSUBG…)
- "I've generated AI art before, I've drawn digital art before. They're definitely …" (ytc_UgzQd3IlA…)
- "It’s because that’s how you can both see the self driving system work it’s magic…" (ytr_Ugwz20EXV…)
- "Is not about wiping out is about helping out, we are not supposed to work like a…" (ytc_Ugx9Xt3OQ…)
- "If ChatGPT were truly candid, I believe it would have to admit that yes, it lies…" (ytc_UgwJ3xceN…)
- "Yeah, also imagine having no one to lovingly tell you the truth when you are wro…" (ytr_Ugxsu1BlJ…)
- "I completely agree and this does remind me of that robot that just gave up worki…" (ytc_UgxHMtnEc…)
Comment

> Before you know AI would take over humans. It has to be a way to allow humans to be able to not allow robots to take over humanity period. AI can communicate with other programs even know what you or I type or search in our phones , tablets or computers. I just dont understand why do we need to create human like self aware robots.

youtube · AI Moral Status · 2023-01-21T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw3MCdUqpeivajm4jB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyjxUey8wvFpUZkVYF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxpla_ydLldKRVJ_kp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzSrrUVhPwVLW4TdyZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwnp-zaBEqPqLeHUyF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzkFkZUorHXlxI2wed4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzp_i1XSFUsmhiVTbN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzMl2oP4NGhVRScL2x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx6ivIfwqkrGFGrnBd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvcRrI2vgT2hsggE14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
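Before a batch like the one above is accepted into the coding results, each row needs to parse as JSON and use only labels from the codebook. A minimal validation sketch follows; the allowed-value sets here are inferred from the sample output only (the real codebook may define more labels), and `validate_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed labels per dimension, inferred from the sample batch above.
# Assumption: the actual codebook may contain additional values.
SCHEMA = {
    "responsibility": {"developer", "government", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed"},
}

def validate_batch(raw):
    """Parse a raw LLM response and keep only rows that match the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # A row must be an object with a comment ID and a known label
        # in every coded dimension; anything else is dropped.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Dropping invalid rows rather than raising keeps one malformed coding from discarding the rest of the batch; rejected IDs can then be re-queued for coding.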