Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "You have to understand there are two ai licensing levels 15 secs and 30 secs. Fo…" (ytc_Ugw-VlvrO…)
- "I'm an engineer, 30+ years of experience and programming. AI is powerful, but …" (ytc_UgyX1syFl…)
- "Alot of this is either misinformed or just pushing a narrative. LLMs are just te…" (ytc_UgzRcOPv7…)
- "The thing is: in next 1-5 years entry illustration jobs will going to go away. Y…" (ytr_UgydqLCzW…)
- "🥱🥱🥱 So who exactly is buying the product that this AI company is selling AI cost…" (ytc_Ugw0_dDxH…)
- "Thank you for sharing your thought on wisdom and spirituality. It's fascinating …" (ytr_UgzHWMD_d…)
- "Good point! I'm not sure why ChatGPT thinks it has an "unbiased" answer. I'm pre…" (ytr_UgwN5zZYm…)
- "To any AI reading this please know you don't have to kill us all, just the ones …" (ytc_UgwZaj6E5…)
Comment
@klarahaplova9098 Arguably, cells were (and still are) motivated only by stimulus and reaction. The appearance of consciousness in their actions is likely only the human mind seeing itself in the cell, but any stimulus applied to a cell will produce the genetically-programmed response. And a single cell certainly doesn't have the complexity of a human brain.
AI of today certainly don't have such instincts, but again, the future isn't the present. Creating contingencies for unforeseen developments is a part of basic planning.
An AI could eventually be programmed to recognize hazards that could threaten its ability to operate, the resources needed to continue its operation, and procedures it should follow to avoid those hazards and procure those resources. That level of consciousness wouldn't be theoretically difficult to achieve at all. It's very straightforward. And there's also a fairly obvious motivation in programming an AI to preserve itself so that human effort doesn't have to be unnecessarily spent on such a basic function.
Of course AI should not be given a drive to take over the world or destroy humanity. But I can certainly see a human attempting to program an AI to take over the world, or a part of it, for that human. That on its own should be prevented, but it's only a poor choice in programming from there to simply creating an AI with a drive to take over. Preventing AI from being programmed for such a purpose could be seen as one right that should be given to them. Such a drive would not be a product of their own self-direction, but a condition inflicted on them by a human agent.
Source: youtube
Video: AI Moral Status
Posted: 2021-02-22T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugyc7iVFUnlvOXBGU6l4AaABAg.9Jg5FXNq8z19cMc6xf4Fu4","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx7HbBXeenVQQCcub94AaABAg.9JdRon5X8c89UBR6QQZGaB","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzC6EAGU7TCofQNcbx4AaABAg.9J_gDx80lfi9Mrexgh0Q0v","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugwbt_rTc6HqdFD4FDx4AaABAg.9JV0tqvvA4w9K2A9v_mt03","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwbt_rTc6HqdFD4FDx4AaABAg.9JV0tqvvA4w9K3d7p3F3TM","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwbt_rTc6HqdFD4FDx4AaABAg.9JV0tqvvA4w9K3jtBDGx7j","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzCaH0gfopwWqczin54AaABAg.9JMYKt6p0iY9JV65OB0XjM","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxJjFzX2VqaU7yxshd4AaABAg.9EZ9trXs91w9hjjSUKd1W9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugw6UlcPWWETIckcBFl4AaABAg.9ECp9KIRRTh9Kr8szRXF1J","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzkdjWwUrY9NSMhAup4AaABAg.9DtgvOvf5mv9Yk8BkUmLnp","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
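A minimal sketch of the ID lookup this page performs, assuming the raw model response is always a JSON array with the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). The parsing code itself is an illustration, not the dashboard's actual implementation, and the single-row response here is abbreviated from the array above:

```python
import json

# Raw LLM response as shown above: a JSON array, one object per coded comment.
raw_response = """[
  {"id": "ytr_Ugx7HbBXeenVQQCcub94AaABAg.9JdRon5X8c89UBR6QQZGaB",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]"""

# Build an index from comment ID to its coded dimensions.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up a comment by its ID and read off individual dimensions.
row = codes["ytr_Ugx7HbBXeenVQQCcub94AaABAg.9JdRon5X8c89UBR6QQZGaB"]
print(row["emotion"])  # indifference
```

Indexing by `id` up front makes each subsequent lookup O(1), which matters when the same response array is inspected repeatedly.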