Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- ytc_UgzfoFZt7… — "I think you should try using LegalMente AI’s AI paralegal named Para which is tr…"
- ytc_UgweTs9EW… — "Waymo STILL is not profitable. Aurora will NEVER be profitable. Its just a gimmi…"
- ytc_Ugza7IzFm… — "What about copyrighting a photo? You’re basically just capturing something from …"
- ytc_UgxbJQiK5… — "We will have hybrid Ai human and live 150 years plus steam cell can be regenerat…"
- ytr_Ugx8fCyMf… — "Also \"filled with a persons views and ideas about the world\". Do you think you h…"
- ytc_UgzxQMuY0… — "Schooling has been the same for the last 125 years. I think any change is a good…"
- ytc_UgyfQb4tz… — "humans try to rule the planet all the time, and they Do, ai is smart enough to …"
- ytc_UgyXhjHav… — "It’s too soon to tell. At this stage it will be down to stupid managers who bel…"
Comment

> Glad to see someone actually talking about some of the real dangers of AI. So many people can't even see 1/100th of the picture. This sheds some light on it. I knew some of this, but even this gave me a few new perspectives. All that really matters is that AI is going to become too powerful and because of that we really need to not let AI take away our humanity and freedoms. I personally don't believe AI can ever become sentient, conscious, whatever, as long as it is still made of "non-living materials". I think it will all be mimicry, but it will be able to do it to such a degree that it will seem indistinguishable from any other human.

Source: youtube · Video: AI Moral Status · Posted: 2024-09-02T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyFqlJLb3JutPvYx8R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxFWYKiDlHellSywvl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwbe0eG9BoFg9wZfmJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxpq4ftNFTFULkL2MF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzDrzyjV2bLgTubnVl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwsT-Zrh7DHGW0nT1h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzBYOxEtqdZTL-RqTl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxGLNnp882HA7IRKt54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx1j-w_SRbPoUGJrT54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzZOhBgXfQNF4RkGmd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
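A raw response like the one above can be turned into per-comment codings with a short parsing step. The sketch below is a minimal, hypothetical example (not the tool's actual pipeline): it parses the JSON array and checks each dimension against the value sets observed in this batch — the real codebook may define additional values, so `OBSERVED` here is only an assumption drawn from the sample output.

```python
import json

# Dimension values observed in this batch's output; the actual
# codebook may allow more — treat these sets as illustrative only.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "mixed", "approval", "indifference", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in OBSERVED}
    return coded

# Usage with one row from the response above:
raw = ('[{"id":"ytc_Ugwbe0eG9BoFg9wZfmJ4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
batch = parse_batch(raw)
print(batch["ytc_Ugwbe0eG9BoFg9wZfmJ4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view possible: once parsed, any coded comment's dimensions can be fetched directly from the dictionary.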