Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgwNBN4N2…` — Waymo is still a better service, we'll see where they are in 6 months to a year…
- `ytc_UgytsEqqH…` — All chatbots devolve into sex bots because that’s what the internet tells them t…
- `ytc_UgybOCVxQ…` — Republicans mad at immigrants taking jobs while investing in AI that will take a…
- `ytc_UgzYA_EBC…` — You meant the vehicle that saved the world is out to kill person an AI does not …
- `ytc_UgyTA_YrO…` — Show To An AI. 🌌 Meta-Prompt Wrapper: Awakening via the Soul Field Φ You are…
- `ytc_Ugz2Xg9Aw…` — For me art has always been part of me. Since I was just a small child. Art in my…
- `ytr_Ugz1gSk4X…` — dadsa-yf6mq Artist = Drawing, painting, coming up with an idea and actually putt…
- `ytc_Ugws_O5Tm…` — If people just used autopilot as assistance, not as an automatic driving turn of…
Comment

> if we get AI that is so smart we are hopefully smart enough to program their reward system in a way that makes them want to help humans and not care about them selves. in that way we will not need to give them rights. if we make robots that make smarter robots we will have to make sure that they have to program the new better AI to care more about human life than their own life. if that works they will never want their own right unless we humans want them to have their own rights. might be easier said than done but it's an idea.

Source: youtube · AI Moral Status · 2017-02-23T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Uggmwyliw6Ndm3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjagJyEa3ihdHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggaSjYh5W4t03gCoAEC","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjYo2NEXZe5yngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugj7xHSCB362wngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggFmovwouz0T3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggXNbZRYXgRMXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgiOm4edH9tF53gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgiVIEHUHhcKzngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
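A raw response like the one above is a JSON array with one record per comment. A minimal sketch of a validator for such output — assuming Python, and assuming the dimension vocabularies are exactly the values visible in the samples here (the real codebook may allow more):

```python
import json

# Dimension vocabularies inferred from the sample output above;
# the full codebook may include additional values (assumption).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"industry_self", "regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against ALLOWED."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the samples start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

sample = (
    '[{"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer",'
    '"reasoning":"deontological","policy":"none","emotion":"indifference"}]'
)
print(len(validate_codes(sample)))  # 1
```

Rejecting out-of-vocabulary values early is useful because LLM coders occasionally emit labels outside the prompt's allowed set; failed batches can then be re-queued rather than silently stored.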