Raw LLM Responses
Inspect the exact model output for any coded comment. Look one up by its comment ID, or pick from the random samples below.
Random samples — click to inspect:

- `rdc_ctho65r`: Professor Hawking, thank you so much for taking your time to answer our question…
- `ytc_Ugz3UPiiP…`: Artificial Intelligence is wrong, wrong, wrong!!! With all of your might, DO NOT…
- `ytc_UgxS7r4gf…`: Photoshop brushes have AI in it. And have for decades. Filters. Post processing.…
- `ytc_UgyQB1Jtj…`: Without Humans there would be no art for Ai to steal, / Nor would there be Ai. / H…
- `ytc_UgxbwloAB…`: This is a spiritual battle wrapped in technology. Choose Jesus and you're free a…
- `ytc_UgieJyVJN…`: this is a good question, but one that is inherently wrong. the main prospect wit…
- `ytc_UgwoUPDYx…`: There are signs of me getting laid off by AI. But what are the signs I'm getting…
- `ytc_UgyUrdRgO…`: Humanity will always have to make a choice. Will we let our downfall come from o…
Comment
What they’re not speaking on is the huge amount of finite resources ai uses. So I predict the trillionaires will not be taxed or “share” their wealth - what they will do is unleash biological warfare to get rid of the majority of us who will eventually compete with the water and energy ai requires. These white men are dangerous and could care less about the 7 generations after them besides their own friends and relatives
youtube · AI Governance · 2026-02-16T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw28JqJJlCxb7E0UnN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwp6psayfOTncgQL914AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyICWzoSsPvly0TSSp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxjv15ifajnPT8G1zh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-fjzmtxENPV342yF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCDGHOQdrFGyB8yTd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxtIzUO84mRBEOcHQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzwdsCjt8ZAj-OwFz54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwb-FGxZFGG8c1_kYl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxoI7RlL27CMYKll8B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
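The coding result shown above is one record from a raw response like this one, matched on the comment's ID. A minimal sketch of that lookup in Python, assuming the response is a JSON array of flat records with the fields shown (the `lookup` helper and the abridged `raw_response` string are illustrative, not part of the tool):

```python
import json

# Illustrative excerpt in the same shape as the raw LLM response above:
# a JSON array of flat records, one per coded comment.
raw_response = """
[
  {"id": "ytc_Ugxjv15ifajnPT8G1zh4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwp6psayfOTncgQL914AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "approval"}
]
"""

def lookup(raw: str, comment_id: str):
    """Parse a raw LLM response and return the record coded for comment_id.

    Returns None when no record in the response carries that ID.
    """
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup(raw_response, "ytc_Ugxjv15ifajnPT8G1zh4AaABAg")
# Each record carries the four coding dimensions from the table above.
print(record["responsibility"], record["emotion"])  # prints: company fear
```

Matching on `id` is what ties a record back to its source comment; a record is only usable if the model echoed the ID it was given, which is why each coded comment's raw output is worth inspecting.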