Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Oh no ai is being too advanced let's do everything to make it work bad…" (ytc_Ugw1RF7Hw…)
- "Then that one hacker from a random country comes in, and you lose access to your…" (ytc_UgwiLDNzU…)
- "No matter how powerful Ai its still need Human to monitor conversation and infor…" (ytc_UgyRCSa-f…)
- "hey can someone give link to chatgpt real website/app ? because I can't find it …" (ytc_UgxTb68Np…)
- "No surprises here. I have long suspected that artificial intelligence will just …" (ytc_UgxdkNlMZ…)
- "I had a whole conversation with my ChatGPT, Duo, about this interaction. Then I …" (ytc_UgzMi8eM1…)
- "7 billion people being led to the cliff’s edge, and by a few dozen of the worst …" (ytc_UgznQoTHX…)
- "Elegant Elon has been proclaiming AI's harm for several years & that AI will tak…" (ytc_UgwEK77xL…)
Comment
Gemini doesn't have the ability to interface directly with Samsung Music like the original Google Assistant can. It told me that Bixby can, with an exact phrase to command it. Bixby responds by saying it does not support that function. When I told Gemini what Bixby responded with, it thought for a second and then closed the conversation.
AI is not going to become sentient and take over the world if doesn't even know what built-in assistants are capable of.
Source: youtube · Topic: AI Moral Status · Posted: 2026-02-26T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugwe0HxlroXScWsUkm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwvqNlom6u4DB4JlFF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwQlmmHIjG4vND3Bc94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzSW0v2YCPDMUjbL8d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwpgNNEY5lX2r6xbLV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxSrRL4C5ORf-37BTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzMdWXXx5T_m_5Fq6d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzLFd_x34RVyvJ-Lct4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyvJrUvmOZoViu2AJJ4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugy8KhCdY-uLbwjHc_p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]
```
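A raw response like the one above is a JSON array of per-comment records, which must be mapped back to individual comments by `id` before a coding-result table can be shown. A minimal sketch of that parsing step, assuming the dimension vocabularies are exactly the values seen in these responses (the real coding scheme may define more categories, and the function name here is hypothetical):

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (an assumption: the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "government", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "outrage"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a batched coding response into {comment_id: dimensions}.

    Records with a missing id, a missing dimension, or an
    out-of-vocabulary value are dropped rather than stored.
    """
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue
        dims = {k: record.get(k) for k in ALLOWED}
        # Keep the record only if every dimension has an allowed value.
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[cid] = dims
    return coded
```

Looking up a single comment ID in the returned dict then yields exactly the four fields shown in the Coding Result table.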