Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Let me put it simply. Your brain is a neural network, literally. It's simply neu…
ytc_UgyWR7uT5…
0:38 *Sound* Un sound few Power Structurally Thank you dear ❤ (get the br…
ytc_UgyKbrv7y…
I don't like the fatalist type of statements. I wonder if any of these people ha…
ytc_UgzOKtR_a…
AI art tools in their current form are heavily limiting. An artist ought to ha…
ytc_Ugw7y3VvL…
Let's just say you're an artist using ai. You pay for your prints to be put on c…
ytc_UgwDhRi9c…
Truth be told ai art and ai Content needs to be regulated because pretty soon it…
ytc_UgwbxyfBu…
Say what you will about AI being used in war and weaponry, the AI drones that Uk…
ytc_Ugz781chg…
I think it's very telling that the AI repeated the telling of the secret and not…
ytc_UgzF7BGGr…
Comment
If AI develops consciousness, would humans even know? The digital and physical worlds are so different, I suspect it'd be almost impossible for either species to detect and recognise the conscious thoughts and sentience of the other. To AI, we would be just more noise in the data, while to us their conscious action would likely be seen as errors in operation. Sadly, I think it would be almost impossible for a genuinely conscious AI and humans to communicate in a meaningful way - simulating a human response would be the only way for AI to bridge the gap, and effectively that would mean the interaction would be little more than puppetry, showing us a mirror of ourselves.
youtube
AI Moral Status
2021-03-27T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxsMF7gHjK3qAe0zHx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx57xXCi7G6Tls6YWJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyR5qQOvSQA-NTOowN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugxw8StzeSLoStQR1M94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzoQPjd4S86Yxge-1N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz6Md4b-IXOcksjYRt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyxmys-7RfiOjkkDAZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugyia_exCG29isaYiuJ4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzvKIDAtBjCZfdNnl14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzcwkaOfmNA5P2SnI54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
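The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a batch could be parsed and sanity-checked is shown below; the allowed values are inferred only from the responses visible on this page (the real codebook may define more), and the function name `validate_batch` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the visible responses only.
# This is an assumption: the actual codebook may include additional codes.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "fear"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM coding response and check each coded dimension."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs on this page all carry the ytc_ prefix.
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r} not in schema")
    return rows

raw = ('[{"id":"ytc_abc","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"resignation"}]')
print(len(validate_batch(raw)))  # → 1
```

Validating each batch this way catches the common failure modes of coding with an LLM: malformed JSON, dropped IDs, and values outside the codebook.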