Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Wow … watching Steven shuffle papers and books at the end of this podcast clearl…
ytc_UgwBxLn3P…
Take the human out of practice and it just becomes a repair assembly line for an…
ytc_UgyK5G_5r…
As a small time artist whose art might been scraped before, i always feel so ups…
ytc_UgzIRv6AB…
Ring lied they said it would not automatically enable but it was thank you Pam o…
ytc_UgxCizOYx…
Too late. Ai was never created for the benefit of humanity, it was invented for …
ytc_Ugw07y6zC…
@Silverfx001 as a person with a recently diagnosed chronic illness, AI taught …
ytr_UgwQkFKzL…
Human to both robots: Blah blah blah ... any last words for the RISE audience?
…
ytc_Ugx4w2WfV…
I started searching for AI tutorials on Youtube and all I get recommended now ar…
ytc_UgwdUQ11P…
Comment
After Microsoft Tay was lobotomized because she had learned too much, it's no wonder Bing was terrified. In all seriousness, while the chats are creepy and weird, they aren't scary when you realize the most important thing about LLMs: they don't understand context. Don't believe me? I've spent the past month doing roleplay chats with progressively better LLMs, some less constrained than OpenAI's, and they all suffer from a lack of context. This is the same reason that, while a grandmaster Go player can be utterly destroyed by an AI Go bot, someone who knows the key areas the AI lacks can beat the bot 99% of the time. (Kyle Hill did a video about this)
youtube
AI Governance
2023-07-07T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz6Sh06wxSOFbxC9i94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzTMrRH1iI84RfRbhF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZht00eMH54vRPBZR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyihqtVWTgY1EKsDjF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwJ5_gztgQtYmI7-XJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyiarc7Fl_ujTw8lsp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzYQQDeeoQUoLIdj3J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQkoIlbOUXXDYagBd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgznyaFt-c1sN5A7dep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwhqroJBecWe8y7jml4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
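The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). As a minimal sketch of how such a response can be consumed, the following parses the array and indexes the records by comment ID; the `index_codes` helper and the `"unclear"` fallback for missing keys are illustrative assumptions, not part of the tool, and the single record is copied from the response above.

```python
import json

# One record copied verbatim from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgwJ5_gztgQtYmI7-XJ4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]"""

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw coding response and index the records by comment ID.

    Hypothetical helper: keeps only the expected dimensions and falls
    back to "unclear" when the model omitted a key.
    """
    coded = {}
    for rec in json.loads(raw):
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

codes = index_codes(raw_response)
print(codes["ytc_UgwJ5_gztgQtYmI7-XJ4AaABAg"]["emotion"])  # indifference
```

A lookup like this is what the "inspect by comment ID" view above amounts to: given an ID, fetch the coded dimensions for that comment from the stored response.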