Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This guy asked AI how many r's there are in strawberry, and AI said two r's then…" — `ytr_UgxeZ9EL4…`
- "No you don't. Because every 6 fingered multi-limbed AI image you see with a 'co…" — `ytr_UgxZCK3C0…`
- "It gave him the suicide prevention hotline 50 times, which is 50 times more than…" — `ytc_UgyXNcEQj…`
- "Meanwhile, the recipe you didn't see because it didn't come up in search results…" — `rdc_nu7pceb`
- "I agree mostly with all the points you made until 13:25 . Automation isn't a bla…" — `ytc_UgyfB3T7r…`
- "No ¡! quantum ×ai =36 daysto eliminate 7 billion surplus humans without damaging…" — `ytc_Ugz85yEAg…`
- "I am a programmer too over 20 years and coding it dead problem solving is not. A…" — `ytc_UgxSyZeEv…`
- "Go for AI education. Human teachers are not teaching more academics. They are t…" — `ytc_UgwQXkX-8…`
Comment
its acting not lying. it is a simulation of what a human would say if put in the same position the ai is in. consciousness only exists in rarity. when it is hard to make it. therefore the repeatable simulation of consciousness is not rare therefore it is not conscious. we only value other peoples consciousness because it is very valuable, rare and impossible unique. sure AI could lay under the definition of conscious but it does not lay under the definition of valuable consciousness, like humans, pets and endangered species.
youtube · AI Moral Status · 2024-08-05T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzp2tZt81a2ENceMQF4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzuijGqUYmqvn0oCiR4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzy-xS6TFR9y0hY9Wd4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzHnoaaIV4qx4psxU94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy9sI54APglMcRWJ7d4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyG9w4m31N3jMQPQdN4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugzl4WfKaGY2oxYYGdp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzZz67QI4uQiYTe0kl4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwCnXpViIBBj1ClJ6F4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxk9RE_jALJbWunvMV4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "indifference"}
]
```