Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up with its comment ID.
Random samples:

- `ytc_UgySshx2L…`: Replit was a useful tool for learning web app components and how they all work t…
- `ytc_Ugy7119a-…`: If we make new laws and regulations we can get rid big data centers, robots, Tru…
- `ytc_UgzJAO6Gf…`: haha...is the ai acting different? or have trained it to act different when we t…
- `ytc_Ugy-L9NHr…`: personally i use ai to make stock images and i check to make sure its not stolen…
- `ytc_UgzNOMhia…`: "Adapt, or get left behind." Ai is happening. There is no stopping agi. The bes…
- `ytc_UgxdMadz0…`: Disgusting you're not reminding me of a movie never movie about predictive polic…
- `ytc_UgyVA8yVr…`: Wait till it gets to heavy robot goes crazy back to work. Pickers stowers no cha…
- `ytc_UgyE3P6jQ…`: My argument has always been "this is the worst AI art will be". People don't get…
Comment
Wow. What a pleasure to be a fly on THIS wall.
Just a fascinating conversation and yes, I was talking back to my phone the entire time. So I guess I was also participating!
It’s hard to remain positive on this subject, particularly when at the end the discussion turned to the “value” of humans deciding how we might best coexist with AI.
My best analogy for this is a similar struggle with how humankind is deciding how to coexist with nature. And it ain’t looking so good. 😔
Thanks, Neil for putting this kind of real content in front of us. THAT does give me hope.
Source: youtube · Video: AI Moral Status · Posted: 2026-03-07T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzs9zSfS5uGXFVUA6d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzK71h7YHSgb0HUE3x4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzX-cL4oJjigJgSeIh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyEl2Od0pjUIemK4_t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx5FXsnRx8eHGqMrhJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwD3nzUI__8xdc-2WB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxr4f7PkSQxQXGsGH54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2LZm1LvQt356Lxrt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyFHzFrHqXnPiEZD0B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzg10KRr6shQAsW7t94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
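Since the raw response is a plain JSON array, looking up a coding by comment ID reduces to parsing the array and indexing it. A minimal Python sketch of that lookup (the helper name `index_codings` is ours; the two entries are copied from the response above, with the dimension names matching the Coding Result table):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Two entries taken verbatim from the response shown above.
raw_response = """
[
  {"id": "ytc_UgzK71h7YHSgb0HUE3x4AaABAg",
   "responsibility": "unclear", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzX-cL4oJjigJgSeIh4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "unclear", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the model output and key each coding by its comment ID."""
    codings = json.loads(raw)
    return {entry["id"]: entry for entry in codings}

by_id = index_codings(raw_response)
print(by_id["ytc_UgzK71h7YHSgb0HUE3x4AaABAg"]["emotion"])  # prints "mixed"
```

The same dictionary can back a "look up by comment ID" search box directly, since each coding carries its `id` field.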