Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
At what point will we resolve that Ai should have a purpose in mind? A singularity of all human intelligence is a laudable goal… but to what end? The idea that Ai should strive to emulate humanity is the scary part. Humanity has rationalized war from its inception! Survival, contentment, happiness, greed, power… so many laudable goals… with so much destructive history. How can the aggressive gene be bred out of models. Of equal or greater concern, how do you minimize the profit motive which is essentially a cancerous contagion without rival. So long as GROWTH is unconstrained, the results are pre determined- the patient will perish!
| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2026-03-24T19:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugzuu0SmtunxJTvWcpx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwhulBWF0xWl01AJ154AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgySAnpjtTp--SGOznF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxV2cqojuwo5D_CnQ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwMVVBDeJnknjK55bV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwEFeJQ831_SSxgs_B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxUNnhc9YngfsHhsNF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxFFuy6S4DzpvJ5OgN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzZ0aXRZ6SdMseyU3x4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzbJuDoZjIyf0AwWL14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
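A batch response like the one above can be turned into per-comment codings with a small parsing step. The sketch below is a minimal illustration, not the pipeline's actual code: the allowed values per dimension are inferred only from the labels visible in this sample, so the real codebook may include categories not listed here.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "distributed",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def parse_codings(raw_json: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    dropping any entry with a missing id or an out-of-codebook value."""
    codings = {}
    for entry in json.loads(raw_json):
        cid = entry.get("id")
        dims = {k: v for k, v in entry.items() if k != "id"}
        if cid and all(dims.get(dim) in values for dim, values in ALLOWED.items()):
            codings[cid] = dims
    return codings

# Example with one entry from the response above:
raw = ('[{"id":"ytc_UgwhulBWF0xWl01AJ154AaABAg",'
       '"responsibility":"distributed","reasoning":"mixed",'
       '"policy":"unclear","emotion":"fear"}]')
result = parse_codings(raw)
print(result["ytc_UgwhulBWF0xWl01AJ154AaABAg"]["emotion"])  # fear
```

Validating against a fixed label set catches the common failure mode where the model invents a category outside the codebook; rejected entries can then be re-queued for recoding rather than silently stored.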