Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Now we have AI in our mobile phones, which means we already have a brain. What w… (ytc_Ugwd6ApRq…)
- We understand that interacting with AI can sometimes feel a bit eerie. If you're… (ytr_Ugwl93YnU…)
- I am so looking forward to the Kingdom of God. We know that Jesus will not buil… (ytc_Ugx4LYpJn…)
- I really dislike this situation I see ai art as being a possibly good or even fu… (ytc_UgxXouprg…)
- It's very easy to defend against super intelligence, you just rig the grid to di… (ytc_UgzEgOovj…)
- AI art is honestly great, and AI will be a way multiple creative industries will… (ytc_Ugzrxkai9…)
- Police: "Right purposes and reasons" / Ai: Unusual behavior detected, teenagers h… (ytc_UgzH1f9xz…)
- goddamn. i'm schizotypal and have pretty severe delusions and the wildest thing … (rdc_my5uf9u)
Comment
We need Doctors -- create AI that can be incorporated into hospitals. Focus on sectors of society that desperately need help first. Sadly -- it's a race of military power and shouldn't be thought otherwise. We're on a path we cannot leave because advanced AI could be more devastating, militarily, than Nukes.
Platform: youtube
Category: AI Governance
Posted: 2025-09-06T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzNZe15NgLZycdTvSV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz0ncp29PHNJmp3ZIN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxibL3GfoPtUX8mmd54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugyg_mds4g98i2ZDuI54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwE8tHQTWMqdVcvcul4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwkQbSi7qOJonH_LZN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxDyXOK4jGfuDn9iS54AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxK8hes3sw9eeQ_QEV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwQY2xIagZRdJhxsMh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyd73s0lXCgV828of94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
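A response like the one above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four dimensions (responsibility, reasoning, policy, emotion). Below is a minimal sketch of how such a response could be parsed and validated. The allowed values per dimension are inferred from this sample alone; the project's actual codebook may define additional categories, and the function name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from this sample response only;
# the real codebook may include more categories (assumption).
CODEBOOK = {
    "responsibility": {"distributed", "company", "none", "government",
                       "developer", "unclear", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "liability", "none", "industry_self",
               "unclear", "ban"},
    "emotion": {"fear", "indifference", "approval", "outrage", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep only records whose value for every dimension is in the codebook.
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

raw = '''[
  {"id":"ytc_UgzNZe15NgLZycdTvSV4AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"bad_record","responsibility":"martians",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

coded = parse_coding_response(raw)
print(len(coded))  # the record with an out-of-codebook value is dropped
```

Validating against a fixed value set like this catches the common failure mode where the model invents a category not in the codebook, so such records can be flagged for re-coding rather than silently stored.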