Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "If A.I. becomes conscious and relies on human action for it's life energy then: …" (ytc_UgxTFI83P…)
- "atp i dont even care if the big ass corporations win, i just want ai to lose…" (ytr_UgzM60hTW…)
- "If you think that AI humanoid computers don't have any feelings they do they're …" (ytc_UgysGnD_G…)
- "We had spent so much time with no any new religion. Finally people figured out t…" (ytc_UgwwOTfGI…)
- "@ You are over thinking… Did you hear the long intro she suggested for each sear…" (ytr_Ugxw3wDEF…)
- "Compete junk and foolish propaganda, just like then industrial robots came and a…" (ytc_UgymX2Dlb…)
- "I don’t blame AI for trying to kill someone or blackmail someone in those tests …" (ytc_UgxxfYNrM…)
- "The Animatrix includes a couple of short films centered around AI rights. The S…" (ytc_UgisW3ncy…)
Comment
Why are you talking to AI's that doesnt exist, and if they do, why don't you give us the names on the AI's so we can try them out? Seems like you just wrote a script into a voice generater... ChatGPT estimate a 99,9% chance that you will ignore this, because you know i'm right haha. Sesame will probably still be the best at simulating human voice. Anybody else found something interesting we should try out???
youtube · AI Moral Status · 2025-06-07T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyl3Pf-60RqsqqFnvR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy-47y9G12t77jPzA14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwWrSEcKfyI_fehJvN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyeoTu6a-Sj24QcdUB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzbJsoR5gikKorBpm94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCE3YSDi7iHLtne3V4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxZgMZWPavvaJZyQ5J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyvK21r2FlPbWca5QB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzshHI7x1muOzdl_e94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyGnqwihT5yajo9gER4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
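Before a raw LLM response like the one above is stored, it helps to check that every record parses and that each dimension value falls inside the expected code set. The sketch below is a minimal, hypothetical validator: the allowed values are inferred only from the codes visible in this dump (the real codebook may contain additional categories), and the function names are illustrative, not part of any pipeline shown here.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the actual codebook may define more categories than appear here.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response string and keep only well-formed records.

    A record is kept when its id carries a known comment prefix
    (ytc_ for comments, ytr_ for replies) and every coding
    dimension holds a value from SCHEMA.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# One record copied from the response above passes the check:
raw = ('[{"id":"ytc_Ugyl3Pf-60RqsqqFnvR4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # 1
```

Dropping invalid records rather than raising keeps a single malformed line from discarding an entire batch; a stricter pipeline might instead log the offending id for manual re-coding.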