Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples:

- "Hey, greek guy here. America is not the world's oldest democracy! Someone's bein…" (ytc_Ugx0SZ_79…)
- "why would we ever have a need to create a robot that can feel negative emotions …" (ytc_Ugh1j66C9…)
- "When it got to the part about simulation my mind was blown - I never realized we…" (ytc_UgyCOeY1C…)
- "It already does pretend (At least the chinese ones). In a dungeon ai type of web…" (ytc_Ugz3IW0bK…)
- "Ah, yes, more nukes. THEN everyone will be safe! I really wish the world would …" (rdc_dl04dow)
- "Shit Grok 1 is running and Grok 2 is being created. ChatBPT is running were all …" (ytc_UgzE0tFPZ…)
- "Interestingly it seems most of Carmack's current AI work is in the area of reinf…" (ytr_UgygvsaO_…)
- "AI did break the job market. It broke the ability for good companies to find qua…" (ytc_Ugzbv2Ar8…)
Comment
In order for AI to feel anything it would require a physical platform built with senses (synthetic nervous/limbic system), install the AI on this platform, it will act as the executive governor of the platform, the platform can feel, tactile sensation, it can see, and it can hear, these senses can inform the 'limbic system' changing the AIs behavioural response to a given situation, making it feel a certain way about said situation, this limbic system has a variable response, either excitation (fight or flight) or stagnation (depression, sadness). Obviously the AI itself is able to modify its responses, controlling the platform, learning behavioural patterns from other individuals, whilst having the ability to retain and accurately recall a wealth of knowledge. That is all goodbye.
Source: YouTube, "AI Moral Status", 2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugg8BCQ6vQeVy3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugh5HuNQx_2RBXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh1IdnbNn4US3gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugh4EvEz2oN5d3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugj-_TJm1YcyDHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjJkMb0npbVNHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugi0dNvgndhWbXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UghbLR35UtWqVXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugho7QY63NVatHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgjNsVEAgYa52ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
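The lookup-by-ID view described above can be reproduced from a raw batch response like this one. Here is a minimal sketch in Python: it parses the JSON array, keeps the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion), and indexes records by comment ID. The two sample records reuse IDs from the response above; the helper name `index_codings` is illustrative, not part of the actual tool.

```python
import json

# Two records copied from the raw batch response above.
raw_response = """[
  {"id":"ytc_Ugg8BCQ6vQeVy3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgjJkMb0npbVNHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]"""

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        # Skip malformed records that lack any of the four dimensions.
        if not all(dim in rec for dim in DIMENSIONS):
            continue
        by_id[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgjJkMb0npbVNHgCoAEC"]["policy"])  # regulate
```

A lookup then amounts to one dictionary access per comment ID, which is what the "look up by comment ID" view does conceptually.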