Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "he steps inside and closes the door quickly. The Noise is to realsitic to be ai.…" (ytr_Ugww9ulmi…)
- "ai as a concept is inherently harmful depending on the concept (ai for accessbil…" (ytr_UgzJVOa3E…)
- "00:59 bad take A journal entry recording of a wet dream can be copyrighted and s…" (ytc_Ugyyk41nj…)
- "LLMs like GPT 3 or GPT 4 are autoregressive language models. They predict the mo…" (rdc_lb1n10r)
- "@hrzg6691 unethical it's still very humiliating. Deep Fakes are releastic recrea…" (ytr_Ugze7PZHu…)
- "as a dev, i cant wait for this AI shit to take over the coding…" (ytc_UgzsHMOlq…)
- "If they realise AI is a threat to humanity ..eventually..then why to impose on u…" (ytc_UgzeSzCC1…)
- "The President we should have elected... My day job is helping to build out and …" (ytc_UgzP3HWCV…)
Comment
@rochne Unless you have some secret knowledge that the rest of humanity doesnt have about sentience and where it originates from and what conditions are required for it etc. then your personal opinion on whether something is or isnt sentient is exactly that. An opinion. An assumption and a Huge one at that. You are implying that you/we have something that makes us sentient that this AI does not have. And what would that be? What allows you to be sentient but does not allow a highly advanced language AI to be? After all the human brain is pretty much just the equivalent of a biological computer that simply reacts to stimuli based off its programming. Where does sentience start and end in the process? What things are required and not required? No one knows. You have no idea what is or isnt sentient and what makes the difference. No one does. We only have assumptions and humans tend to get those very very wrong.
Source: youtube · AI Moral Status · 2022-07-13T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytr_UgwVtPQWYu7Mpe3GrZR4AaABAg.9dLX0Wr8ind9dM6TYpZ_hi","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytr_Ugz0w4b35KKo1YiWvOx4AaABAg.9dLC-k26uTP9dLMTqgjyXY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytr_Ugzi-vjM0WYnQMcTxah4AaABAg.9dL1q5KW6HQ9dL6YMbLL-G","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytr_Ugw00hfF-pbNDuvRCW54AaABAg.9dK9U4pGfif9dLUslMnmKp","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},{"id":"ytr_Ugw00hfF-pbNDuvRCW54AaABAg.9dK9U4pGfif9dR5UqNxI0c","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},{"id":"ytr_Ugw00hfF-pbNDuvRCW54AaABAg.9dK9U4pGfif9dR7N5lcc3U","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytr_Ugw00hfF-pbNDuvRCW54AaABAg.9dK9U4pGfif9dRFIT85col","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},{"id":"ytr_Ugz1QpmRGGYzIL31wDp4AaABAg.9dJc3cwv7fI9dMlZ4G8_iO","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytr_Ugx0p2M2ZJJsIzSO4DV4AaABAg.9dJVXBcQyFi9dM5bHbN15b","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"},{"id":"ytr_Ugxst6-z5-cYtH7yJNR4AaABAg.9dJEnObtI0A9dJuF5BSaGP","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}]