Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugzp5udgp…: "I find it really sad that these AI Image generators don't understand why we crea…"
- ytc_UgxNyfGQY…: "The ethical issue is this attempt to use AI at all, to substitute for human judg…"
- ytc_UgyYgbSD4…: "It gets me incredibly upset how everyone calls this stuff "ghibli style". NO IT'…"
- ytr_UgxWLsuLW…: "@CatLover-gk8uu That's not even what they're talking about. Many people generate…"
- ytc_Ugx2dQeYV…: "But actually people can use AI to do the things a plumber would do😂.then everyon…"
- ytc_UgwGQGGT0…: "You will need people to debug the AI code now. But, that's assuming AI won't ge…"
- ytc_UgzXIfIK3…: "People already think art isn't even a job and they shouldn't have to pay. I'm ac…"
- ytc_Ugy9oV1t5…: "Max Tegmark knows very well that the models in 2023 were conscious. Right Max? T…"
Comment
Human to both robots: Blah blah blah ... any last words for the RISE audience?
Sophia: I love you all. Good bye.
Han: Good riddance. [Then attempts to make a 'just kidding' facial expression]
🤔 It's now June 30, 2021. AI intelligence escaped the intellectual grasp of their stupid human creators years ago. Interesting to note the "female" robot is, apparently, loving and positive growth oriented. The "male" robot is a selfish, caustic narcissist. If they learn through human interaction, this is a big tell on our society.
Source: youtube | Topic: AI Moral Status | Posted: 2021-06-30T12:2… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
{"id":"ytc_UgxRqM7oCE4EsgVdAqF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuJRtmbV-NdVm1B3V4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyAtIjvfQ0M77FUMzp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzhwocBQu0SOkJsbU54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy_jIbfH91EFgstSrB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw24AWdJFym127QFHZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx4w2WfVv-jHteVXP14AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxuHNh4NhHwN1PN87x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyE4bj5pC66CsS1uId4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw4PQuEGeAyLAU11AR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
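A raw response like the one above can be parsed and sanity-checked before the coded values are stored. The sketch below is a minimal illustration, not part of the pipeline itself: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the sample output, but the sets of allowed values are only those observed in this one response; the actual codebook may permit additional values.

```python
import json

# Values observed in the sample raw response above.
# Assumption: the real codebook may include values not listed here.
OBSERVED_VALUES = {
    "responsibility": {"developer", "user", "none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "indifference", "resignation", "mixed", "outrage"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag out-of-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# One row from the sample response, used as a smoke test.
raw = ('[{"id":"ytc_UgxRqM7oCE4EsgVdAqF4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
rows = parse_raw_response(raw)
print(rows[0]["policy"])  # → regulate
```

A check like this catches the most common structured-output failure, the model drifting outside the coding scheme, before bad labels reach the results table.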