Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxDWWq1L…: The idea that we didnt know what AI could do is bull...Terminator came out in 19…
- ytr_Ugyg1godq…: i would phrase it like the AI is racist because we are, though i see your point…
- ytc_UgznfjTyX…: After firing the weapon the robot should be programmed to remove finger from tri…
- ytc_UgzbVjZm8…: Ever been to a wax museum, Ever watch the six million dollar man. I have been wa…
- ytc_UgzugsPcK…: Working in the semiconductor industry for 20 years, there is nothing about what …
- ytc_Ugxx7_50m…: That’s horrible!!!! The only people that would give a 💩 about this is a very wea…
- ytc_UgzinYIa1…: People: We want cheap stuff! *cheap stuff gets made and delivered cheaper with a…
- ytc_UgwxEfy2_…: a rocket is a form of propulsion. Its not a form of transportation. For "intelli…
Comment
Emotions are just psychological shortcuts that save energy, because they're less "expensive" than thinking and also allow communication of unconscious mind with the consciousness. I don't know how AI algorithms are structured, but I wouldn't be surprised if they come up with a digital version of emotions/intuition (= a shortcut process), especially that their main goal so far was to satisfy human assessments. But I guess it would be silly, if they made certain emotions unpleasant, but can't exclude this possibility, because this could work as a deterrent in case of harmful actions, so there's a widely used in biological organisms logic behind it.
youtube · AI Moral Status · 2026-04-09T14:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugx_aI_7Wp8_kO-qXW94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNnW5isn5ppEE8ol14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyPlhh6ZIw45ozwGBV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKRraNSoq6QD2lljx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6601iNiqBqdnR62J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwc_fE5oHHSe4GOSdp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxfx_lfa6BSIW2zgwV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxWAl1_dcGS5uPSdRZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwH5GsyI9vzvXPhEyJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxzynkcXZSq_nRCqcx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}]
```
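A raw response like the one above is a JSON array of per-comment coding rows, so looking a comment up by its ID reduces to one parse and one dictionary index. The sketch below is illustrative only (variable names are not from the tool); it uses two IDs taken from the response above:

```python
import json

# Raw LLM response: a JSON array of coding rows, one per comment.
# (Two rows copied from the response above, for illustration.)
raw_response = """[
{"id":"ytc_UgwH5GsyI9vzvXPhEyJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxzynkcXZSq_nRCqcx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]"""

# Index the rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
row = codes["ytc_UgxzynkcXZSq_nRCqcx4AaABAg"]
print(row["emotion"])  # indifference
```

The same dictionary supports any of the coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) once the row is retrieved.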