Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Its ok guys. We can ALL become influencers. And we can ALL repair the machines..…" (ytc_Ugyy1Ev7p…)
- "Is AI going to drive the school buses / Fight our fires / Delivery the bread and mi…" (ytc_Ugw2uBq5h…)
- "I wonder how would the perpetrator react if they were to be given the same treat…" (ytc_UgzwG5IQE…)
- "@DefaultFlame Alright let me put you in a way you understand since you were so c…" (ytr_UgwERnr1K…)
- "Please start creating "Oracles": ChatBots trained on specific people's books and…" (ytc_UgyoVPZk-…)
- "Its NOT AI that they are speaking about its, those who control AI. Its HUMANS/pe…" (ytc_Ugwsmi9qt…)
- "I think it would be unethical to continue progress into A.I. Think of it this wa…" (ytc_UgzZyZyoc…)
- "So one day I'll be teaching my robot AI plumbing apprentices how to plumb a buil…" (ytc_Ugz01eLzy…)
Comment
> Another analogy I've heard:
> Consider this. Ai is burning forests, water, and resources faster than ever before. It is preventing people from using what they need and only giving people money. In this world, It is currently like a villain.
> Now, imagine this. You see a beautiful peace, it looks so very pretty and you can't look away. Then someone told you that a terrorist made the art or som'..
> .
> Would you... still want to support the art??? There are far better analogies but this is one of many to cover more moral reasoning I guess idk.
Source: youtube · Video: "Viral AI Reaction" · Posted: 2025-10-24T05:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
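The table above is a straightforward rendering of one coding record. As a minimal sketch (the helper `coding_to_markdown` is hypothetical, not the dashboard's actual code), one record with the four coded dimensions could be turned into that two-column markdown table like this:

```python
def coding_to_markdown(coding: dict) -> str:
    """Render one coding record as a two-column markdown table.

    Expects the four dimension keys seen in the coding result above.
    """
    labels = {
        "responsibility": "Responsibility",
        "reasoning": "Reasoning",
        "policy": "Policy",
        "emotion": "Emotion",
    }
    rows = ["| Dimension | Value |", "|---|---|"]
    for key, label in labels.items():
        rows.append(f"| {label} | {coding[key]} |")
    return "\n".join(rows)

print(coding_to_markdown({
    "responsibility": "ai_itself",
    "reasoning": "deontological",
    "policy": "ban",
    "emotion": "outrage",
}))
```

The "Coded at" timestamp would be stored alongside the record rather than inside it, which is why it is not part of this sketch.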
Raw LLM Response
```json
[
  {"id":"ytc_UgwvpSAIkBu1Av0Fc6N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOlNBicnTKL-raeQF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxzQxKg8bDbxdtWLmR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwkY_hZDe5YUwcaMSd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzhVRcMBcaHJ_vFKix4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzaXobIFHXpFB9XxAt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxskXMYYvvkoO3I--Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwGMx7g9LdIdSwC3Bt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzGLACbdEH4M5cArNp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxByW3TZNiA66XF8dd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
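The raw response is a JSON array in which each element carries a comment `id` plus the four coded dimensions, so looking up a coding by comment ID reduces to parsing the array and indexing it. A minimal sketch, assuming only the JSON shape shown above (`index_codings` and the two inlined sample records are illustrative, not part of the pipeline):

```python
import json

# Two records copied from the raw LLM response above, standing in for
# the full batch; in practice `raw_response` would be the model output.
raw_response = """
[
  {"id": "ytc_UgwvpSAIkBu1Av0Fc6N4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxskXMYYvvkoO3I--Z4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a batch response and index each coding record by its comment ID."""
    codings = json.loads(raw)
    return {item["id"]: item for item in codings}

by_id = index_codings(raw_response)
coding = by_id["ytc_UgxskXMYYvvkoO3I--Z4AaABAg"]
print(coding["policy"], coding["emotion"])  # → ban outrage
```

A real pipeline would also want to validate that every requested comment ID appears exactly once in the response before trusting the batch, since LLMs can drop or duplicate items in long arrays.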