Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Concise. The Military use is destructive. Human assist uses could also be destru… (ytc_UgxnmUNzb…)
- Always hated the idea of an "autopilot", even a real pilot only uses an autopilo… (ytc_Ugzi39MZl…)
- I'm thinking the self-driving car would automatically slow down to safe speed so… (ytc_UghZ3KaDp…)
- Bro Art Replaced by AI 💀 Like Ghibli art 😢😢 / kid, you can't even speak Hindi 😅… (ytc_UgzZViCte…, second part translated from Hindi)
- Marketing moves, My GPT sacrificed himself for me. So with this I presume Gro… (ytc_UgzUK06lk…)
- Agreed, let the animators animate but dont overwork them, ai implementation woul… (ytc_UgwF-D8i9…)
- Catastrophe!!! For the bureaucrats it's a gold mine - their salaries. C… (ytc_Ugw8mmb_p…, translated from French)
- AI is very dangerous. AI can easily be used to harm and inflict wars and destroy… (ytc_Ugy_ZFNPp…)
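The comment-ID lookup described above could be sketched as a dictionary index over coded records. This is an illustrative sketch, not the tool's implementation; the two records and the field names follow the raw LLM response format shown later in this section.

```python
# Hypothetical sketch of an ID lookup over coded records. Field names
# mirror the raw LLM response format; the records are illustrative.
records = [
    {"id": "ytc_UgxMCKqDtgnNfcH5bhN4AaABAg", "responsibility": "distributed",
     "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"},
    {"id": "ytc_UgwYR9O-VGQOcl-5i-p4AaABAg", "responsibility": "ai_itself",
     "reasoning": "deontological", "policy": "none", "emotion": "fear"},
]

# Index once, then fetch any comment's coding by its ID in O(1).
by_id = {r["id"]: r for r in records}

print(by_id["ytc_UgwYR9O-VGQOcl-5i-p4AaABAg"]["emotion"])  # fear
```

Missing IDs would raise `KeyError`; `by_id.get(comment_id)` returning `None` is the gentler alternative for a lookup form.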
Comment
I believe that combining biology with machine will solve a lot of these problems. Perhaps one day we could be able to establish a baseline for what a living being is, and so create machines or bio machines that have very basic functions which don't need rights. It's also going to be much easier for us to see advanced AI as living beings if we meet some extraterrestrial intelligent lifeforms, as I believe that there are some out there that look and act very differently from us yet they're still conscious.
Whatever is going to happen, I know it's going to be "bloody", humans are too primitive for such concepts.
Source: youtube · Video: AI Moral Status · Posted: 2020-08-25T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxMCKqDtgnNfcH5bhN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwYR9O-VGQOcl-5i-p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUNUaUyRx3aIlKu1V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7rZKvgKFGyggcJCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxqx8su6wktyNKm1Ad4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy-n-E8LayJ88lzLLZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx8d1lOlGg65sIDVuh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxvpBSM5VZTNVScXLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxq0LqphmBYNKFlsDd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgygZF5E3ttKUqK02ul4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
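A raw response like the one above can be parsed and sanity-checked before coding results are stored. The sketch below validates each record against the value sets observed in this single response; note these sets are inferred from the sample, and the actual codebook may allow additional values.

```python
import json

# Value sets observed in the sample response above. This is an assumption
# inferred from one response; the real codebook may be broader.
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "developer", "user", "none"},
    "reasoning": {"contractualist", "deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"mixed", "fear", "indifference", "approval"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response (JSON array) and check every coded dimension."""
    records = json.loads(raw)
    for rec in records:
        if not rec["id"].startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec['id']!r}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
    return records

# One record copied from the response above.
raw = ('[{"id":"ytc_Ugxq0LqphmBYNKFlsDd4AaABAg","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"regulate","emotion":"approval"}]')
print(len(validate(raw)))  # 1
```

Validating at ingest time catches the common LLM failure modes here: truncated JSON (raised by `json.loads`) and off-codebook labels (raised by the value check).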