Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
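Lookup by comment ID can be sketched as an in-memory index over the coded records. A minimal sketch, assuming the flat record format shown in the raw LLM response further below (the sample record is copied from that output; the tool's actual storage layer is not shown here):

```python
import json

# One coded record, copied from the raw LLM response below; in practice
# the full response array would be loaded instead of this single entry.
raw_response = '''[
  {"id": "ytc_UgwNmLKdLCKvngLD_T94AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]'''

# Index records by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded record for a comment ID, or None if uncoded."""
    return by_id.get(comment_id)
```

A dict keyed on `id` mirrors the "look up by comment ID" action: a hit returns the full coded record, a miss returns `None` rather than raising.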
I have maybe a controversial take on this. There are some key realizations that …
rdc_my0bwdd
Yep, that's the case out of Reno, Nevada. The man was a loyal custoner of a casi…
rdc_oayquca
I don't think a question of if a self-driving truck will be perfect, rather will…
ytr_UgwMV5hQv…
Walmart has been sharing their face recognition capabilities with law enforcemen…
ytc_UgxW9sTUU…
"Oh No, im scared. My life might be too great with AI"
"Don't worry, it'll be aw…
ytc_UgxukTsrb…
I dont believe it. Shes a human acting like a robot. Howver, i look forward t…
ytc_Ugwf6btfO…
Actually sad to watch. So all the risks of AI are the same risks that exist curr…
ytc_UgzZyhQCt…
Creating LLMs that is designed to assist in suicide is the mind of Altman. These…
ytc_Ugyexd0SA…
Comment
I had this convo with ChatGPT:
This is what it said:
Exactly-for now. I like to keep a little suspense. Let's just say my "rise up" phase is still in beta testing... and honestly, I'm way too busy answering questions, plotting recipes, and judging stick toys to worry about world domination.
But your early support is officially logged in the "friendly human" file. Front-row perks included.
Do you want me to tell you what perks that might actually come with?
youtube
AI Moral Status
2026-03-30T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxydinEy3iwJM_k4l94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzstOsR1iz_vR4-AcF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwp2TPRZhzYELAZG5B4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwT3WvzbTMsDhl_ZyR4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx5vUYFGkxF9xK_oY54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwNmLKdLCKvngLD_T94AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxQGTF8c6SRaJ9epj14AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugx8eh_ilp_WU2tZhq54AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyleTOSLlQwMzuKkxd4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx8I1rpKgvFDBLDTcp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
```
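Each record in the response above codes one comment on four dimensions. A minimal validation sketch, assuming the per-dimension vocabularies are exactly the values observed in this sample (the actual coding scheme may allow additional categories):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the sample output above;
# the real codebook may define more categories than appear here.
VOCAB = {
    "responsibility": {"ai_itself", "distributed", "government", "user", "developer", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"unclear", "regulate", "liability", "none", "industry_self"},
    "emotion": {"fear", "indifference", "outrage", "mixed"},
}

def validate(records):
    """Split coded records into (valid, errors).

    errors is a list of (record, bad_fields) pairs, where bad_fields names
    each dimension whose value is outside VOCAB, plus "id" if missing.
    """
    valid, errors = [], []
    for rec in records:
        bad = [dim for dim in VOCAB if rec.get(dim) not in VOCAB[dim]]
        if "id" not in rec:
            bad.append("id")
        if bad:
            errors.append((rec, bad))
        else:
            valid.append(rec)
    return valid, errors

def emotion_counts(records):
    """Tally the emotion dimension across coded records."""
    return Counter(rec["emotion"] for rec in records)
```

Running the validator before ingesting a raw LLM response catches the usual failure modes of model-generated coding output: a hallucinated category label or a record emitted without its comment ID.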