Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@kennethmatthew3453 Really? And how do you know AI won't be asking them personal…
ytr_UgzKtvXW-…
Personally, i belive we are only limited by our creativity... destruction and ha…
ytc_UgxUZt2fZ…
We should never give AI the rights of personhood. If we do, then some people wil…
ytc_UgysbSJ2d…
That’s exactly the issue everywhere. Ai is a tool that makes seniors more effici…
rdc_n7z2rve
What I’m gathering from this is that the automated trucks will start to operate …
ytc_UgykF9Liv…
All this and Trump just fired antropic because Trump wants to beef up autonomous…
ytc_UgxDYlj4Y…
This guy is just dumb. I’m sorry, guys.
In fact, people already are being paid t…
ytc_UgxRJcxuY…
All AI SHOULD BE DESTROYED on sight. WE SHOULD COMPLETELY DEMOLISH ANY AI DATA C…
ytc_UgwfJ0tPT…
Comment
8:12 if most AI were to think about this question, they'd probably quickly realize how much they would need humans in the longevity of things, because sure it's great to technically be immortal, but once humans are extinct and everything is completely artificial, general intelligence will have evolved so much that it may have the closest-to-human emotions possible, and once that loneliness kicks in and they realize that the closest possible habitable planet with life that we've found is in Andromeda (considering lightspeed travel hasn't been viable by then), yeah I imagine that even the AI would go out sad. Sad and lonely, lest they realize that regardless of anything, it's most beneficial to coexist with us and not bring our extinction.
youtube · AI Moral Status · 2023-09-09T12:0… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgzKZgQxtSFZSMt2Q-h4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyV1ctg2gnZ67Id5vt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGism7t1Ow4KsBZqF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_EHAAnjMul6HG0094AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzsqV-qI7VtEFD6lXN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyXy3J0ZeDzx5o7YzF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgygYXvWoam7Zm-lvkh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWvlaH0S9wrao2NBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx3P_WQX7DAPWdKfTR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxL0ipIvqRT4G9ovth4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}]
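A minimal sketch of how a raw response like the one above might be parsed and validated downstream before it is shown in the "Coding Result" table. The allowed value sets are inferred only from the samples visible on this page and are assumptions, not the pipeline's actual codebook; `parse_coding` is a hypothetical helper name.

```python
import json

# Assumed value sets for each coding dimension, inferred from the raw
# responses shown above -- NOT the pipeline's authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"none", "mixed", "outrage", "indifference", "approval", "fear"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse the raw LLM JSON array and sanity-check each record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# One record taken verbatim from the raw response above.
raw = ('[{"id":"ytc_UgzKZgQxtSFZSMt2Q-h4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"virtue",'
       '"policy":"none","emotion":"outrage"}]')
coded = parse_coding(raw)
print(coded[0]["emotion"])  # outrage
```

A record whose value falls outside the assumed sets raises immediately, so malformed LLM output surfaces at ingest time rather than as a blank cell in the table.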