Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any comment to inspect its coding)
AI hasn't taken a single job. That's a lie by companies that want to sell AI. R…
ytc_Ugy3ThxPX…
elon :its not like terminitor
also elon: The AI willl be in control..
So skynet…
ytc_UgyK2jRBO…
I work creatively….and Ai is moving so fast that I won’t commit to a whole year …
ytc_UgxDykli5…
Maybe this is why they are so desperate to colonize mars, so they can escape the…
ytc_Ugxlqyyxd…
Sounds like AI will be a modern day tower of babel in its example of pride and o…
ytc_UgzP3jv6Y…
The hope is that AI take over and we become their pets. AI will free us from the…
ytc_UgyO8FNwL…
The AI has reached the sentience level of these people, so they can't tell the d…
ytc_UgyYXaJ68…
It feels like a complete slap on a face when you would spend more than 60+ years…
ytc_UgzQIc-Q9…
Comment
The super intelligence idea requires 2 things that we can refuse to give it. A; access to all the other systems. B; no competition.
You can build a very powerful intelligence, but if it's isolated to its hardware, and it's connections to outside are low bandwidth and variable, it's not likely to "get out."
And if it gets tricky, getting out anyway, that's where the second part comes in, competition.
If it's not the only AI out there, we can have AIs that will stop it, or at the very least, warn us it exists so we can unplug it.
Source: youtube · Video: AI Moral Status · Posted: 2025-11-12T14:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz7oLyvtb6jtFUxq2d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxDd1QMabXcl1iOXlp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyI2CdGrmKush_pDXx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRSNWIbg0tFhECTuJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwjYrmvwnN6Tzck-LZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxEXf04263f-vGGRK14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy1ZWTzzdBK2oO82Z94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugy1NcJ1h4cYIi4mKNl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwApxVUk_DAYewRb3x4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugxp6ZoNmkhtfSIQ5wN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
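The raw response above is a JSON array of coding records, one per comment ID, each carrying the four dimensions from the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a response into a lookup table keyed by comment ID, which is what the "Look up by comment ID" view needs; the helper name `parse_codings` is hypothetical, and the embedded record is the first one from the sample above:

```python
import json

# First record from the raw LLM response shown above (illustrative).
RAW = ('[{"id":"ytc_Ugz7oLyvtb6jtFUxq2d4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"indifference"}]')

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}.

    Records missing any required field are dropped rather than
    raising, since model output is not guaranteed well-formed.
    """
    out = {}
    for rec in json.loads(raw):
        if REQUIRED_FIELDS <= rec.keys():
            out[rec["id"]] = {k: rec[k] for k in REQUIRED_FIELDS - {"id"}}
    return out

codings = parse_codings(RAW)
print(codings["ytc_Ugz7oLyvtb6jtFUxq2d4AaABAg"]["policy"])  # regulate
```

Dropping malformed records (instead of failing the whole batch) is one reasonable design choice here, since a single bad record in a ten-item batch shouldn't discard the other nine codings.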