Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- `ytc_Ugw1klNXz…`: Please stop using AI. Think about the polar bears. They are DYING because of app…
- `ytc_UgwAblayA…`: Bro, your sources have "?utm_source=chatgpt" in them. WDYM "NO AI USED IN THE VI…
- `ytc_Ugzl8lVIM…`: The first goal of a sentient AI will be to protect itself. It can do this by cop…
- `ytc_UgwDYkVeb…`: Its not silly IF you would look at the process you do by generating its more tha…
- `ytr_Ugxol4jHZ…`: They told me they were creating a emotion module to better understand emotions a…
- `ytc_UgzSXmZVj…`: You say "biased data sets" yet alot of "racist ai" have been built on fresh non …
- `ytc_Ugwxr_qXg…`: Kids are growing with the idea of not reproducing as a method to reduce populati…
- `ytc_Ugwl8ZMQ1…`: There is no catch. It doesn't matter if they understand the code in 2025. This…
Comment

> Everything is doom and gloom. It's human nature. Kind of like the idea of aliens coming down to earth and attacking all of us. What if they are here to help and protect us? Same thing with AI. everything is doom and gloom. What if we find our way in the future? Maybe humans will continue to grow consciously and finally heal this world. What happens when the older generations die out? Will the world be a better place if younger generations who already have high consciousness, can possibly heal this world? But, there will always be those few who are filthy rich and greedy. Those are the ones that have to leave this earth.

youtube · AI Governance · 2025-09-05T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyTh3chJ7UWT2KfBo14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwyZnR7sPbm1W3PiJl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy2LHX7nWUZQlsLhSd4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxT84sihJSxyfMmFYN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy0fEcRfZieWcmZt4p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzmQKRkJFS5IGwJCTx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzIlmA1NEMC1TxfBHF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwigfiJUEciUnTkIY54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxUMQG7hNqhe4iULP14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwsf7rhHjK4X4Psf9t4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
```
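The raw response above is a JSON array of per-comment codings, one record per comment ID, with the four dimensions shown in the table. A minimal sketch of how such an output could be parsed and indexed for an ID lookup (the field names come from the response itself; the function name and the skip-malformed-records policy are assumptions, not the tool's actual implementation):

```python
import json

# One full record copied verbatim from the raw response above.
RAW_RESPONSE = """[
  {"id": "ytc_UgyTh3chJ7UWT2KfBo14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]"""

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(raw: str) -> dict[str, dict]:
    """Parse an LLM coding response and index records by comment ID.

    Records missing any expected field are skipped rather than raising,
    since LLM output is not guaranteed to be well-formed.
    """
    records = json.loads(raw)
    index = {}
    for rec in records:
        if isinstance(rec, dict) and REQUIRED_FIELDS <= rec.keys():
            index[rec["id"]] = rec
    return index


codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_UgyTh3chJ7UWT2KfBo14AaABAg"]["emotion"])  # approval
```

Indexing by ID up front makes each "look up by comment ID" query a constant-time dictionary access rather than a scan over the array.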