Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI was created for the sole purpose of being to eventually remove/disable the internet- without global backlash from the masses that it is being removed from.
At some point- AI is going to be so bad, it is going to be indistinguishable between AI people, and real people. This will be abused, even more so then it already is. People will be receiving AI phone calls from their children, on their childs phone #,. panicking and demanding ransom money.
This- is only the mere beginning of the many things yet to come. And- at some point, AI will have to be addressed, and the ONLY solution, at that time- will be to make "internet" for "governmental use only", or completely disabled altogether.
youtube · AI Moral Status · 2025-07-21T19:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugym1QMspyZ418CBrS14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzjzGQbRVjpyRBgP1J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxT7YrFaIu2z-sBu_d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxsUCGChUMvwH3Ht2d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwy27ypMyqftylwSnF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz7Ac8kT8-e6p9xG1F4AaABAg","responsibility":"none","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw6gWFOOSqRWSAY_IV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwBluyzyrx-KIEGiC14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz2oqjZI9PftbpnYzd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy811f73RCndUa5r3F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
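The raw response is a JSON array with one object per coded comment, and the selected comment's row (`ytc_Ugy811f73RCndUa5r3F4AaABAg`) supplies the values shown in the Coding Result table. A minimal sketch of how such a batch might be parsed and looked up by comment ID — the field names and the sample entry are taken from the response above; the helper name `index_codings` is hypothetical:

```python
import json

# One entry from the batch response above, kept verbatim; a real response
# would contain one object per comment in the batch.
raw_response = """
[
  {"id": "ytc_Ugy811f73RCndUa5r3F4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_Ugy811f73RCndUa5r3F4AaABAg"]
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> fear
```

Indexing by `id` makes the lookup-by-comment-ID view a constant-time dictionary access once the batch is parsed.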