Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below.
- "This convinces me that COMPAS doesn't just need to be revised or "fixed", it nee…" (ytc_Ugz3bDTAP…)
- "Never give a robot a gun without keeping one yourself and the other finger on th…" (ytc_UgwtPUG6q…)
- "@LG4452-t1eit's a tracker in the battery bruh, turned off or completely dead do…" (ytr_UgycjNXGm…)
- "What rights? If you put something out in public, it's a publicly viewable work. …" (ytc_UgzWd8pd3…)
- "I don't see how a union leader can say they aren't 'anti-corporation'. You are l…" (ytc_UgzsZZaQM…)
- "The late a organization embraces ai the more regret they have . The advantages a…" (ytr_UgxArmF3y…)
- "This is terrifying. It makes me want to lean more conservative, even, because I …" (ytc_UgyrxHc0X…)
- "If people don't have jobs/careers we don't have an economy. If we don't have an …" (ytc_UgzSfKPM_…)
Comment
Very interesting and somewhat alarming. One thing I disagree with is how Hinton grapples with - sentience or consciousness and emotions. I question whether even neural networks can spontaneously generate - feelings or emotions equivalent to what we mean by those terms. I do not believe we humans are ‘programmed’ in the same way a computer is - to only do as we’re instructed. I can’t see an ai developing emotional attachment as Hinton expressed in his regret at not spending more time with his children or a spouse. This is not relationship to achieve some materialist ‘objective’ - rather it’s an expression of desire to reconnect with joy & love. I just don’t see ‘machines’ getting beyond anything other than mimicry of that feeling. Of spontaneous generation of sentimental attachment to the point of self sacrifice for the good of the beloved.
youtube | AI Governance | 2025-06-17T16:2… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz6rvXVPpE2qUY6hTh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz8P5NpObGnTdRivDN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwnI4QSkpY2wr0b_CR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgztFH2N_HtSrHmd13R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXor0_U87hN8ow23F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyhD2jQHDJu93g9zjh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyptBvb-xnYzaZKjH94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxe9c3Ddex8HlrjM8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyvw-r7ZMpDnmSZDnN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0aeun5kprBxSnt7B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
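The "look up by comment ID" step can be sketched as below: parse the raw model output (a JSON array of coded records, as shown above) and index it by comment ID. This is a minimal sketch, not the tool's actual implementation; `RAW_RESPONSE` is a two-record excerpt of the array above, and `index_by_id` is a hypothetical helper name.

```python
import json

# Excerpt of the raw LLM response shown above (assumed format: a JSON
# array of records, one per coded comment).
RAW_RESPONSE = """
[
  {"id": "ytc_UgwnI4QSkpY2wr0b_CR4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz8P5NpObGnTdRivDN4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model output and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(RAW_RESPONSE)
# Looking up the comment coded above retrieves its four dimensions:
print(coded["ytc_UgwnI4QSkpY2wr0b_CR4AaABAg"]["emotion"])  # mixed
```

In practice the dashboard's coding-result table is just a rendering of one such record, so an ID lookup and a per-dimension display are all it needs.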