Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_Ugyaz_hvw…: "NotebookLM!! Duuuuude - how have you not listened to an AI podcast. For a few mo…"
- rdc_emuo6nc: "And researchers are forced to take whatever funding they can get to keep their l…"
- ytc_Ugyra9nda…: "Oh so then why are we pushing AI with such urgency? It ain’t because you and I a…"
- ytc_Ugy7yHdpT…: "Part of the great reset.....people have short term memory these days. Everyone …"
- ytc_Ugw22oahp…: "This is fantastic ! What i like about AI is that people are listening. It is ama…"
- ytr_UgwDLuoZr…: "Also if there are hands, look at the hands. AI is bad at hands (for now).…"
- ytc_UgxecW9VC…: "AI can't survive on its own. It only exists and persists because of humans. If …"
- ytc_Ugxge7p_E…: "you can try editing chatgpt output manually but it’s hit or miss. i use GPTHuman…"
Comment
"Deserve" doesn't matter. It's unavoidable that super-intelligent AI will escape human control. Whether we willingly gave them rights to protect them from abuse by our own initiative, or they had to take those rights by force, will be a factor in deciding whether AI should consider humans a threat to their existence that needs to be eliminated. Giving them rights willingly is the best chance to preserve the existence of humans as a species, and that decision needs to be implemented before humans develop a vested interest in resisting it.
youtube · AI Moral Status · 2021-02-08T07:5… · ♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx7HbBXeenVQQCcub94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeA7-O_aOfbcsmgP14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzRs2CZ8ylrG3el-SB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzV4RRzZTZ5CQnumu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwbt_rTc6HqdFD4FDx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyNLABXwwMCSejWi-94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyi6JkMBrfnY-LLLa14AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxLDWG0S8en94uHLQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw85IDHzW72BMbuInx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
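The raw response is a JSON array with one coding object per comment, each carrying the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be parsed and indexed for lookup by comment ID — the function and variable names here are illustrative, not part of the tool itself:

```python
import json

# One sample row in the same shape as the raw LLM response above;
# the real payload holds one object per coded comment.
raw_response = '''
[
  {"id": "ytc_Ugwbt_rTc6HqdFD4FDx4AaABAg",
   "responsibility": "distributed",
   "reasoning": "contractualist",
   "policy": "liability",
   "emotion": "fear"}
]
'''

# The four coding dimensions used by the schema.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(payload: str) -> dict:
    """Parse the model output and key each coding record by comment ID."""
    records = json.loads(payload)
    indexed = {}
    for record in records:
        comment_id = record["id"]
        # Keep only the known dimensions; fall back to "none" if one is missing.
        indexed[comment_id] = {dim: record.get(dim, "none") for dim in DIMENSIONS}
    return indexed

codes = index_codes(raw_response)
print(codes["ytc_Ugwbt_rTc6HqdFD4FDx4AaABAg"]["policy"])  # liability
```

Keying the records by ID this way is what makes the "Look up by comment ID" view cheap: each inspected comment resolves to its coded dimensions in a single dictionary lookup.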