Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
As an acclaimed AI Artist, I must say in certain ways it's harder than tradition…
ytc_Ugyh9z6m4…
27:54 "The research suggests that people working to help us avoid the worst risk…
ytc_UgzdP-mYV…
@Meee22222no look at her comments page she a bigot who claims bullsht without a…
ytr_UgyHuWlYd…
Evidence from LLMs shows an AGI trained on human data would copy human behaviour…
rdc_kqt27w7
I think what people don’t really understand when it comes to AI is that there’s …
ytc_Ugyw23krC…
I mean... improving those communities wasn’t necessarily a bad thing. It just di…
rdc_eracukv
This isn't true. The AIs start out with a generic LLM, but at least for nomi, a…
ytr_UgxzBXR6e…
Hypothetically, what if.... they created AI to destroy humanity. Could of sworn …
ytc_Ugz-_IE6r…
Comment
AI should be illegal and abandoned. The world does not need it, humanity does not need it. The day that the machines we build start believing that they have rights is the day we have lost control over our evolution to a microchip. Futurists having been warning against the dangers of artificial intelligence for more than 75 years, apparently no one was bright enough to listen.
youtube
AI Governance
2024-01-03T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyR3gdPZJ6SjTbUJAB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwcaqKke_VXYlbISdV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx-Fv14CyzJepDRTEt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSwDDIOGCqBb1Eqnt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzY39UOrmtntB8vu8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSNBLP9nhTEMBIb1Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwSf7omFfCaoGODW7x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzgcFabi4wm5gGLEi94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzvgLR3Aq3OORGYZS54AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx6R0G1DeOqaaQpWip4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
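A downstream consumer has to parse this raw response and reject rows whose labels fall outside the coding scheme. The sketch below shows one minimal way to do that, assuming the four dimensions shown in the table above; the allowed label sets are inferred from the values visible on this page and may be incomplete, and `parse_codings` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Label sets observed in the samples on this page — an assumption, not the
# tool's authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "none", "distributed"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with valid labels."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example: one well-formed row passes the filter.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(len(parse_codings(raw)))  # → 1
```

Filtering rather than raising keeps a single malformed row from discarding the whole batch; rejected rows could instead be logged and re-queued for recoding.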