## Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.

### Random samples
- `ytc_UgypEsR-J…`: "our government doesn't even know what a fucking transgender person is. they can'…"
- `ytc_UgyAH65Kc…`: "Elon is right to worry about the AI. The good model of what can happen is the be…"
- `ytc_Ugx-x8HFV…`: "i have a guess guys, we are treating AI like a species. like a real species that…"
- `rdc_n7li7fd`: "Run your LLM of choice locally, mine is romantic, sexual and murderous at times …"
- `ytc_UgxX_MRvN…`: "I love that I can tell she had to push for even that level of sincerity. This i…"
- `ytr_Ugw2gsAi1…`: "@trestres236 ????? There is more than one AI?? And they are learning? And AI do…"
- `ytc_UgwVizKzT…`: "If you are a Boss , definitely you will choose the cheapest and most efficient o…"
- `ytc_UgyYCuyD8…`: "He's got a profound point, but we can not forget that just like us utilizing AI …"
### Comment

> Since 2000, I've been saying that human augmentation is inevitable to survive in the future because our organs can't handle most functions or burdens from ever evolving multiple tasks and cognitive challenges. Also, we have to be at least at the latest stage of a Type I or early Type II civilization level by Kardashev's scale, so AI does not clandestinely compete against humans for energy. Humans need energy to make food, heat or cool their houses, run appliances etc. Unless we learn how to harness endless energy (call it synthesis or full control of it) humans will be considered as pests or parasites that feed on limited energy, so it is quite obvious for AI to eliminate the entire humanity. Think about you create new vaccines or drugs with AI that will encode death in a DNA level which can be triggered by specific radio signals or unknown synthetic molecules that can be generated by AI when it is time to strike.

Source: youtube · AI Governance · 2026-01-05T03:0…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgwLxOYyrlE55Za7U6N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwQYNtNhRl6BE8iYqx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnUtLYjHfV0yG3NoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx0Zf9bR-GNO0BNAFh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw3lg78Tt1X-QZy7Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwwLo7sthu1keu_1DZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugym-2u-BSekngylslN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2W7HKvlO0KzWHz0V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzbDNGNmK0rpZNHCTB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyVZRWsiV3bebFLYbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
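A response like the one above can be turned back into per-comment records with a small parser. The sketch below is a minimal illustration, not the tool's actual implementation: the allowed values per dimension are inferred only from the samples shown here (the real codebook may permit more), and `parse_llm_response` is a hypothetical helper name.

```python
import json

# Values observed in the sample output above; the real codebook
# may allow additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "fear", "outrage", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting any value outside the observed codebook."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded

# Example: parse the first record from the response above.
raw = ('[{"id":"ytc_UgwLxOYyrlE55Za7U6N4AaABAg",'
      '"responsibility":"none","reasoning":"consequentialist",'
      '"policy":"none","emotion":"approval"}]')
coded = parse_llm_response(raw)
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the codebook; such batches can then be re-queued rather than silently stored.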