Raw LLM Responses

Inspect the exact model output for any coded comment: look up a comment by its ID, or browse random samples.
Random samples — click to inspect:

- “Don’t forget that Microsoft is the company behind the OpenAI company… nothing to…” (`ytc_UgyAQdyqK…`)
- “I think that it's far more nuanced than that. For example I have been generating…” (`ytr_UgwmP99kU…`)
- “Positronic Man by Asimov is an excelent book that discusses this in depth. Its t…” (`ytc_Ugg3Gwx2P…`)
- “I’m involved with decorative arts and can see the value of my work increasing wi…” (`ytr_UgwYbwuSO…`)
- “I actually find these chatbots and chat-type AI uncomfortable to use and will av…” (`ytc_UgxiA3ASq…`)
- “Soon or later robots will take over the world if man give a robot a gun, they wi…” (`ytc_UgzAG6ZBM…`)
- “The US is going all in on unregulated AI and crypto and I guarantee this ends in…” (`rdc_melszfv`)
- “don't worry more people will end up in hospitals because they think they found p…” (`ytc_UgzcAWmS-…`)
Comment

> I feel like this and the related topics have been discussed quite extensively in SciFi media. Do Machines have emotions, do Machines want emotions, do machines want anything or are they indifferent to it all whether it's being destroyed, not destroyed, doing something or nothing. How would the interactions between a sentient machine and people look like etc etc etc.
> For completion sake: substitute machine with AI, Robot or whatever.

Source: youtube · Video: “AI Moral Status” · Posted: 2017-02-23T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiyjzCTc8g_oXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiKV5roAM8drngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghulkD-qy2L3HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Uggnize15yoAyHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgjOlPQd5Ca5sHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UggcGK52nAlrHngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjJbiJBPUbWdXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi26oYgcaYTAHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiPFrZsBn3iMXgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugg9RewNiCIchXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
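The raw response is a JSON array with one object per comment ID and one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated and then indexed for lookup by comment ID — note that `parse_response` is a hypothetical helper, and the allowed-value sets are inferred only from the values visible in the sample above, not from the tool's actual codebook:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the sample
# response above; the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"approval", "indifference", "outrage", "fear", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record carries a value outside the known
    vocabulary, so malformed model output fails loudly instead of
    silently entering the dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim!r}")
        # Keep only the coding dimensions, keyed by comment ID.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With a response parsed this way, the "look up by comment ID" view is a plain dictionary lookup, e.g. `parse_response(raw)["ytc_UgiyjzCTc8g_oXgCoAEC"]["emotion"]`.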