Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "AI-Prompters are just people, who commission art and refuse to pay for it. Jus…" (ytc_UgyhizK__…)
- "If anything needs a rehab it’s the public school system. It’s archaic and toxic.…" (ytc_UgyFv39dg…)
- "This idiot is already making me want to get so far out of the market and anythin…" (ytc_UgxrVBs3J…)
- "ChatGPT doesn’t use the word “uh” in between words, this was a recording of a hu…" (ytc_UgwnLJ10Z…)
- "We're still at the phase of generative Ai is a system/tool that can be very help…" (ytc_Ugy9TOIqz…)
- "AI image generation is actually an outgrowth of image recognition software resea…" (ytc_UgyN7QnRm…)
- "To fellow AI researchers and students. I am sure you are reacting the very same …" (ytc_Ugz5at2qW…)
- "This is hilarious. So the overriding fear of AI and, more broadly, the “technolo…" (ytc_UgxGr0f9q…)
Comment (youtube · AI Governance · 2024-05-08T20:0…)

When ppl dreamed of the miraculous uses for the computer, then AI, it was purely sold as the thing that could improve the lives of the severely disabled. In 30 plus years of development and coding, we have only one disabled human who recently began to coexist with AI. So wtf is the true source of their motivation? They're not trying to simply create AI. They are building a god if you define one as something that can answer every question in an instant. The outcome of which could destroy our desire to learn, improve, compete, and grow. This would be more devastating to humanity than war.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxMcNwx2fFt5M5NOjB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzw2V7_R4BdVVArXNV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxxJjF7Y9o8wMMFg8F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxlcjHdc1arOhAuyFZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxAGGsv58Tjla2Nsn54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzk11GKj06G3vS6zrh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugy_8Lk3fax7VwXO0IB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz1gK4FcwUPp2GDlaB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgybRlvRh1uGamF3-dp4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwP6qUND4yj5xkfm794AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
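The raw model output is a JSON array of per-comment codings keyed by `id`. A minimal sketch of how a viewer like this might parse the batch and look up one coded comment — the helper name and the required-key check are assumptions; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above:

```python
import json

# A few rows copied from the raw response above (truncated batch for illustration).
raw_response = """
[
  {"id":"ytc_UgxMcNwx2fFt5M5NOjB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxxJjF7Y9o8wMMFg8F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzk11GKj06G3vS6zrh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
"""

# The four coding dimensions plus the comment ID; rows missing any are dropped.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a batch response and index codings by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows if REQUIRED_KEYS <= row.keys()}


codings = index_codings(raw_response)
print(codings["ytc_UgxxJjF7Y9o8wMMFg8F4AaABAg"]["reasoning"])  # virtue
```

Indexing by ID mirrors the page's own lookup flow: the detail view for a comment only needs one dictionary access once the batch is parsed.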