Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Ai is not real art i swear people piss me off thinking ai is art… (ytc_UgxZjUsDX…)
- also ai "art" needs wifi and a dispositive two things that all not the people ca… (ytc_UgxXVEP1x…)
- Steam power scary! / Electricity scary! / Internet scary! / Micro chips scary! / A.I sc… (ytc_Ugw-ewTTH…)
- In some ways technology advanced our work force. Sad to see AI take over many in… (ytc_Ugy4NNdiP…)
- Besides maybe nuclear war, A.I. is probably the biggest threat not just to the c… (ytc_UgzN4detS…)
- If Ai takes away most jobs how does the pyramid stay standing with the public no… (ytc_Ugypf3CJ0…)
- I mean when you buy your face recognition software from a country with a homogen… (ytc_Ugx3bA1L2…)
- If ai wants to end humanity because we are destructive then why not tell ai it's… (ytc_UgxfMUY_c…)
Comment
One thing this video fails to address: if we have the technology to program feelings in AI, we have the technology to reprogram these feelings. A person wouldn't have to exploit the AI feelings in order to abuse it, just reprogram it and act like if it was any other feelingless machine. It doesn't sound like an issue for me, since it's easily solved.
Platform: youtube
Video: AI Moral Status
Posted: 2020-06-22T11:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx6ldpxbx3SzabORuZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxibPKJJBy2r_y7puJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz_qrfReL5oYv6DDTx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPS73G_XxZJnSoloR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwaEYHPUWgtIaUOt0d4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwmlSmJNq5nm1GZmTV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw1dLMaXimwSvjs_Lx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxnfqDwq7AGy3amLxR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxmAnrNsLvMis0SpHt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz0ckEtdKAgZiBnNtd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
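Raw responses in this format can be checked before they populate the Coding Result table. The sketch below is a minimal validator, assuming Python; the allowed value sets are inferred only from the sample output above and are hypothetical — the actual codebook may define more values per dimension.

```python
import json

# Hypothetical codebook: allowed values per dimension, inferred from the
# sample response above. The real coding scheme may include more values.
ALLOWED = {
    "responsibility": {"none", "unclear", "developer", "government", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"fear", "mixed", "indifference", "approval", "resignation"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and reject malformed or out-of-vocabulary rows."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for row in rows:
        # Comment IDs in this dataset start with the "ytc_" prefix.
        if not str(row.get("id", "")).startswith("ytc_"):
            raise ValueError("bad comment id: %r" % row.get("id"))
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError("%s: %s=%r not in codebook"
                                 % (row.get("id"), dim, row.get(dim)))
    return rows

raw = ('[{"id":"ytc_Ugx6ldpxbx3SzabORuZ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = validate_batch(raw)
print(len(coded))  # 1
```

Rejecting rows at parse time keeps out-of-vocabulary codes (a common failure mode when the model improvises a label) from silently entering downstream counts.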