Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below:
- "I appreciate your feedback! Sophia definitely has her own unique style, but it's…" (`ytr_Ugx6kicev…`)
- "If AI is to help humans means why tech companies like Google, Microsoft, Facebook …" (`ytc_UgzEdq1QN…`)
- "@LaurineVKit's been doing that since virtually the beginning with every AI progr…" (`ytr_UgyODJjXy…`)
- "I like your videos keep up the good work also, I've had a feeling for a long tim…" (`ytc_Ugz5cU4g7…`)
- "He doesn't even know the definition of autopilot. Which company payed him? Waymo…" (`ytc_UgxMNJ9Zv…`)
- "5:06 I'm interested how that argument goes. It certainly is not self evident. I …" (`ytc_Ugy1ISGTt…`)
- "If I were him, I'd try arguing that the 'art piece' is 'unique' since you don't …" (`ytc_UgzrEzi39…`)
- "I like the one ai bro in this comment section with his real life name as his use…" (`ytc_Ugx-k-qHY…`)
Comment
Honestly if AI ever evolves to the point where it can evolve *itself* and multiply we're talking about actual 'artificial' evolution and we'll basically have created a completely new race of beings - probably sentient. Think about that for a second. Let it sink in. That's a mind fuck isn't it? However if they actually, for some reason, develop a 'will to live or exist' we might start running into some issues when they notice we don't really share our planet(s) with others that well, not even ourselves.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2021-02-04T07:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx7HbBXeenVQQCcub94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeA7-O_aOfbcsmgP14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzRs2CZ8ylrG3el-SB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzV4RRzZTZ5CQnumu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwbt_rTc6HqdFD4FDx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyNLABXwwMCSejWi-94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyi6JkMBrfnY-LLLa14AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxLDWG0S8en94uHLQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw85IDHzW72BMbuInx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
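A raw batch response like the one above can be parsed and indexed so that any record is retrievable by its comment ID. A minimal sketch, assuming the field names shown in the JSON above (the `index_by_comment_id` helper and the truncated sample payload are illustrative, not part of the actual pipeline):

```python
import json

# A raw model output in the batch format shown above,
# truncated to two entries for brevity.
raw_response = '''
[
  {"id": "ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw85IDHzW72BMbuInx4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_Ugw85IDHzW72BMbuInx4AaABAg"]["emotion"])  # fear
```

Indexing by ID up front makes each lookup O(1), which matters when cross-referencing many coded comments against their raw responses.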