Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Are you a wealthy boomer? AI will increasingly be leveraged to extract any wealt…
ytr_UgzdeuAUr…
>Their landmark report paints a bleak picture of a planet ravaged by an ever-…
rdc_emmwzmu
It’s still just crafting a neat fiction. GPT5 is gonna be the one though 😱…
ytc_Ugy4HrteH…
Look at Ai's like a Pet...!
In the same way that people take care of their own …
ytc_UgxBy0Bqv…
Self-driving cars need to be banned from use on public roads. They're a danger t…
ytc_UgyQZ89Gf…
Any damn fool can ask AI how to safeguard humans from it but those geniuses at t…
rdc_o7ezc7s
AI.. Puppets of tech elites and presidents and generals both sides visiting the …
ytc_UgxcQkbXI…
#tlaib needs to be facial recognized shes a terrorist
But I am against facial …
ytc_UgwkpnVLi…
Comment
I like this line of thought a lot. While there’s the somewhat comedic idea of an ai thinking “I don’t want to be conscious” which is itself a statement a conscious being would have, you also made a really good point. If there is no need for a program to be conscious, why would it be? Maybe that’s the reason why animals don’t have to be super conscious? There’s no need! I don’t know a lot about evolution but it seems humans developed consciousness quite by accident through the need to communicate about the past present and future. If being conscious isn’t the most efficient path to doing whatever ai needs to, it might never develop consciousness. I do think it one day will, maybe by accident, maybe by means of being so advanced it’s practically a virtual construction of a biological brain. Regardless I think this topic is endlessly fascinating and I love your prospective! <3
youtube
AI Moral Status
2023-08-04T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxA2w141Dm-8eHhYPx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxnUW2yaZhukX7oy5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZGq0BI37FUx2IyxV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwQueV0Wf4BCTc5X_l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxV2BkEY5T-K1vUKFt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyEuAvQk5n7HVqskJN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxKNUJ1LHg67mXh6vt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVc0zOUlhbTK8faAp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzUFxOAJeNRHx-SfIZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzXzCXhzs4qmEnk64J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
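The raw response above is a JSON array, one object per coded comment, with one value per coding dimension. A minimal sketch of how such a response could be parsed and sanity-checked is below. The allowed-value sets are assumptions inferred from the values visible in this response, not a confirmed codebook, and `parse_codings` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Assumed allowed values per coding dimension, inferred from the rows
# visible above; the real codebook may include more labels.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "approval", "outrage", "mixed", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only rows whose
    values fall within the assumed allowed sets."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every dimension must be present and hold an expected value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Filtering rather than raising keeps one malformed row from discarding the whole batch, which matters when a single response codes ten comments at once.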