Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples — click to inspect:

- "Mathematicians do more than addition and subtraction...so yes, calculator's were…" (ytc_UgygwmgFG…)
- "@imthinkingthoughtstake the example given & remove the bits about the verifier, …" (ytr_UgwePVVbM…)
- "When he asked about the time flies like an arrow thing, i was looking away from …" (ytc_UgyycBUTS…)
- "S0, humans are taking the robot's jobs, which would be nice if it wasn't a sca…" (ytc_UgyOmqYQV…)
- "I would point out that Luddite isn't the insult people think it is in THIS SPECI…" (ytc_UgzlC5WmW…)
- "Nah, its actually pretty awesome. AI is a beautiful thing. And it should be left…" (ytc_UgzoI6v6q…)
- "The most secure human-built systems in the world can still be hacked by humans. …" (ytc_UgymAA_2j…)
- "AI gives wealth access to skill while, at the same time, denying the truly skill…" (ytc_Ugx6qJUYh…)
Comment

> What we call consciousness is just inputs triggering sensations which we call emotions. It's just a pattern matching machine on top of a set of evolutionary priorities (which often act against each other). If an AI does take of (and it will) we don't want it feeling the same way we do. It is our inability to look past our first emotional response that is killing us. We have people who create solutions to problems and the world tears them down. Why would we take the advice of an AI even though we know it would have far more power at its disposal.

youtube · AI Moral Status · 2023-08-22T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwnMBuSmTms7aPDmz54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzJlPvCuvJZP5rL8kV4AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzu_xGtkZCOzInafot4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy_ss8pfsNbKC_xM5F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVQROburLMOyrIhxd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgwqP4ym4sNzupuCJzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_jInzTho4GS9gmDh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy5ReD7qnnMzqlGJfZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFNxcoaB8F5vopMvV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyA8jnIOvbkaV-nmE14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
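The lookup flow above can be sketched in a few lines: the model returns one JSON array per batch, and indexing that array by `id` recovers the coded dimensions for any single comment. A minimal sketch, assuming the raw response parses as JSON (the two entries here are copied from the response above; a real payload would carry the full batch):

```python
import json

# Raw LLM response for a batch of coded comments (abridged to two entries).
raw_response = """[
  {"id": "ytc_UgwVQROburLMOyrIhxd4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgwnMBuSmTms7aPDmz54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Index the codings by comment ID so any single comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment shown in the detail view above.
coding = codings["ytc_UgwVQROburLMOyrIhxd4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # -> developer resignation
```

Keying the lookup on the model-echoed `id` rather than list position also makes it easy to detect comments the model silently dropped from a batch: any submitted ID missing from `codings` went uncoded.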