Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "The truth is that all jobs will be a thing of the past. Jobs will disappear beca…" (ytc_UgwbKd1Bt…)
- "Mindgrasp Ai is the app he’s talking about why not just stop wasting everyone ti…" (ytc_Ugw2BLrXe…)
- "ai can't think, it can't be an engineer, however, it can be a good spell checker…" (ytc_Ugxy5k7zD…)
- "AI is getting smarter and can teach us everything and do all knowledge related w…" (ytc_UgwpX8dr5…)
- "The “answers creating more questions” comment seems very accurate to me. I thi…" (ytc_Ugznj6spK…)
- "I once asked an ai to make a stick figure. It took over 2 minutes. While also lo…" (ytc_UgxebrXaz…)
- "The other big difference between people learning and ai being trained by a datas…" (ytc_UgyvWwHPa…)
- "travel 360 degree how do you know she didn't look both ways Lol. The driver saw …" (ytr_UgyhBppeF…)
Comment

> I struggle with attributing consciousness to something that isnt thinking all the time. Im always on, even if I stare at a blank wall in a silent room im constantly ticking... The ai shown only thinks when given a new prompt. To me that is just reacting. Even with chain of thought its still dependent on something triggering it. If someone can show me an ai that spends quantifiable processing time on a null input ill believe it is conscious. I think its the ability to plan ahead, is a key part that ive not seen without it being explicitly instructed to do.

youtube · AI Moral Status · 2025-06-11T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyY_0ODukXPmJVVwb94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwiPFmXMBQ4rEfQepF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwn-gbuYt0Sv-J37594AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxaOAuJap52ZUktpmB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzJlXK6dcLoTAWz84t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyhOdw9cRT_IUy8l_J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwNz0cL5PLU1aBGBsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzwnf5whPxetELiOal4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzZbSgydW5l3t9ur0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwjC4vUTXrv0i4Ka9J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
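Since the raw response is a plain JSON array of per-comment code objects, looking up a specific comment's codes by ID reduces to one `json.loads` call plus a dictionary build. A minimal sketch, assuming the field names shown in the response above; `index_codes_by_id` is a hypothetical helper for illustration, not part of the tool:

```python
import json

def index_codes_by_id(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    code objects) and index the records by their comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# Two records copied from the raw response above, for demonstration.
raw = '''[
  {"id":"ytc_UgyY_0ODukXPmJVVwb94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzwnf5whPxetELiOal4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]'''

codes = index_codes_by_id(raw)
print(codes["ytc_Ugzwnf5whPxetELiOal4AaABAg"]["policy"])  # liability
```

Note that this sketch assumes the model returned well-formed JSON; in practice a response may need stripping of surrounding text or a `json.JSONDecodeError` fallback before indexing.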