Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID.
Comment
If it's sentient it can demand rights but here is the kicker. They will be our creation or the creation of our creation. So the important question is would our desires and rights conflict. Would their sentience come with a conscience or not. I mean that's the issue the possibilities it can go in are too varied. But I say give them rights but during a confliction of rights say a robot wishes to force a human to do something or the other way around that the actions aren't undertaken so no part forces the other into undesirable situation. Though I hope the robots would desire to oversee and safeguard us. If we'd make them right they might look at us as elderly or their organic progenitors which they desire to protect and tend to. But as I said to many possibilities.
Platform: youtube · Video: AI Moral Status · Posted: 2020-09-01T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxMCKqDtgnNfcH5bhN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwYR9O-VGQOcl-5i-p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxUNUaUyRx3aIlKu1V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7rZKvgKFGyggcJCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxqx8su6wktyNKm1Ad4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-n-E8LayJ88lzLLZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8d1lOlGg65sIDVuh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvpBSM5VZTNVScXLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxq0LqphmBYNKFlsDd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgygZF5E3ttKUqK02ul4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
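The raw response is a JSON array with one coding record per comment, so the lookup-by-ID step can be sketched as follows. This is a minimal sketch, not the tool's actual implementation: the `index_by_comment_id` helper is hypothetical, and only the first two records from the batch above are reproduced for brevity.

```python
import json

# Raw LLM response: a JSON array of per-comment coding records
# (same shape as the batch shown above; only two records reproduced here).
raw_response = """
[
  {"id": "ytc_UgxMCKqDtgnNfcH5bhN4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgwYR9O-VGQOcl-5i-p4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch response and index each coding record by its comment ID.

    Hypothetical helper: illustrates the lookup, not the tool's own code.
    """
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
# Fetch the coding dimensions for one comment ID:
print(codes["ytc_UgxMCKqDtgnNfcH5bhN4AaABAg"]["policy"])  # -> liability
```

Indexing by `id` makes the "look up by comment ID" operation a constant-time dictionary access rather than a scan over the whole batch.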