Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "@avarmauk I like this discussion, but I have a question. How do you know I'm not…" (ytr_UgyTiJIrw…)
- "Lots of people talk about susite to test limits etc doesn't mean anything I've d…" (ytc_UgxT7a04K…)
- "A.I. has potential to increase it's intelligence at a rate and with intellect be…" (ytc_UgxZyWYU1…)
- "It's not fully replacing the job it's just making it more efficient so that mean…" (ytr_UgxXtu2Aw…)
- "Bill Maher call it "ass kissing AI", which always answer yes to your idea no mat…" (ytc_Ugw9R337H…)
- "There is no god but God alone, without partner; His is the dominion and His is the praise, and He is over all things powerful…" (ytc_Ugw0tKsxo…)
- "AI doesn't see solutions to problems as you do. AI views the problems to be the …" (ytc_Ugy4G55Xs…)
- "They're actually slowly putting creating art behind a paywall and telling you th…" (ytc_UgwncFZLK…)
Comment

> "we are really in danger when the AIs have their own infrastructure"
>
> so what I'm hearing is if chatgpt or grok tells you that you should put your datacenters in space, you should absolutely not listen to it.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2025-10-31T03:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxulKGJi86wcT0kDzF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxx2bURL3blvxVxZQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwR7p97wVP-tPwq17p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzpdD3iI1J9uBK_b0B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwk184PxRN3wdcOYzt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxir1tqHkzbkDm6jLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxNVOH9G5701G7oaQt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxd0s0yoZMjnfOv6QN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzUuGtUdClySVisWrF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzu3eh73nscrGi7bxN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
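Since the raw response is a JSON array with one object per comment, looking up a coded comment by its ID reduces to parsing the array and building an index. A minimal sketch, using two rows taken from the response above (the `codes_by_id` name is illustrative, not part of the tool):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """[
  {"id":"ytc_Ugwk184PxRN3wdcOYzt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzUuGtUdClySVisWrF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]"""

# Index the coded rows by comment ID for direct lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugwk184PxRN3wdcOYzt4AaABAg"]
print(code["responsibility"], code["emotion"])  # prints: ai_itself fear
```

In practice the raw response string would come from the stored model output rather than a literal, but the indexing step is the same.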