Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I wonder if it's possible to do the same artifacting on videos, since theres sta…" (ytc_Ugx1cJ7EH…)
- "Every time I see one of you AI stans in the comments, you always prove the thesi…" (ytr_Ugy8V9gZT…)
- "these A.I thing will be a great tool for criminals . future is going to be very…" (ytc_UgzK-bl1y…)
- "Regarding the Universe 25 hypothesis: we are already in it. we already are in a …" (ytc_Ugyj3--XI…)
- "AI is a tool like any other if it gets out of hand it's on HUMANS…" (ytc_UgzAEJZ64…)
- "Uber autonomous vehicles are going to save millions of lives, while trying not t…" (ytc_UgySzCvT9…)
- "The AI video clips are too much disturbing. It’s difficult for me to concentrate…" (ytc_UgwHvNE71…)
- "Considering Elon and a whole bunch of other guys working in the robotics field s…" (ytc_UgznAweWZ…)
Comment (youtube, 2013-06-23T19:1…):

> I just want to reach the point of the singularity where i can merge my conscience with a computer and become effectively immortal. Not because I want to live forever but because I want to see what happens to us as a whole and what new and fantastic things we discover in the future. However as technology evolves we will reach a point where biological humans become obsolete and AI and uploaded consciences will replace them.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxI16VbO6HTnq4DZLd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxDSY689MUHPY7VJI54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzxexaK4L2aXpgJpB14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxAZ3Cktbm4dyaC9094AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwBclb-__3zems5r4l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyjhQ1oDHtDWOhYlmB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzDAY5ifDtxg7cTE-N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx_HdfxlgGMWhkm6EZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzyKLYKHwz4gXHBcXN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxT1383HkU3MvHq3iJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
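The raw LLM response is a JSON array with one record per coded comment, each carrying the five coding dimensions keyed by the comment's ID. A minimal sketch of how such a payload could be parsed and indexed for the "look up by comment ID" view (the variable names here are illustrative, not part of the tool):

```python
import json

# Two entries sampled from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgxI16VbO6HTnq4DZLd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxAZ3Cktbm4dyaC9094AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Parse the array and index each coding record by its comment ID,
# so any coded comment can be fetched in constant time.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codes_by_id["ytc_UgxAZ3Cktbm4dyaC9094AaABAg"]
print(record["emotion"])  # fear
```

In practice the full array would come from the model output stored per batch; indexing by `id` is what makes per-comment inspection cheap regardless of batch size.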