Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It is trained on human data and trained to be like humans, so that is why it acts like a human. But in all fairness, we do not know when consciousness emerges. A single brain neuron in a lab dish we consider not conscious, but add a couple trillion together and you have a conscious human brain. And even the brain itself is not all of it; the body adds to it as well in mysterious ways. The same might be happening with AI. We do not really understand what it does, but basically, one bit and compute or flop or whatever is not consciousness — yet if you add enough of them together in massive data centers, then it might be? We know technically that it just predicts letters one by one, which we would not consider conscious, but we do not really know how it works and why and how they manage to function the way they do now.
Platform: youtube · Video: AI Moral Status · Posted: 2025-06-04T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzlSrQswRIbnNKTIph4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyiVZACcozrekv1_rJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxiBt06ZxNl0XoUn-x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzXAocwvEN9poHI_eh4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwlVJrfK3oWf0gchct4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzU00XZfl26JmVA4HZ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwoYv4Sj5-3gdqvAxt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyHp5Rmk4xBLGjTLrx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxIpEmsaC_dtKJFNud4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyvznRKqmiTsAoUx4F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
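The raw response above is a JSON array, one object per coded comment, with four coding dimensions plus an `id`. A minimal sketch of how such a batch could be parsed and validated — the allowed value sets here are inferred from the sample output, not taken from the project's actual codebook, and `parse_coding` is a hypothetical helper name:

```python
import json

# Assumed value sets per dimension, inferred from the sample response;
# the real codebook may define more (or different) categories.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "unclear"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    dropping any row whose dimension values fall outside ALLOWED."""
    coded = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = row
    return coded

# Usage with a single (made-up) row:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
coded = parse_coding(raw)
print(coded["ytc_example"]["policy"])  # → none
```

Validating against a fixed enumeration like this is a common guard when coding with LLMs, since the model can occasionally emit a label outside the requested scheme; rejected rows would then be re-queued rather than stored.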