Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
My fear with ai as a robotic engineer, I’m concerned about humanities future.
A…
ytc_UgylDlBtt…
I really like and admire B Sanders. But he, like so many, contradicts himself. O…
ytc_UgzqjYbqC…
Mustafa is one of my favourite ai leaders. I just love the way he thinks…
ytc_UgzVMIk9f…
I Robot made me terrified of AI as a kid, and seeing it so close to reality make…
ytc_UgxueOv2Z…
I think this all depends on the intent behind any image or even writing and gene…
ytc_UgzMszzwo…
AI "artists" are literally NO SUCH THING, they're only lazy people that don't ac…
ytc_UgypUQqbV…
What terrifies me more than AI with possible sentience and machine gun is the o…
ytc_UgwASZvTP…
I wonder with enough examples of contradiction if ai can end up solving examples…
ytc_UgxxIvqTL…
Comment
I recognize my ignorance towards this topic. I wonder if we could come up with something that can "attach" LLM by altering its code to the point that it dismantles it's foundation. Like a failsafe trigger that will take down any LLM that gets in touch with.
youtube
AI Governance
2025-09-08T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxv3IrsPaUE9nh4ht54AaABAg","responsibility":"elite","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8O0veniKSI4eferZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx4EqclHLIQKfPKwwx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyGkoX9UfiEGypctYd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzIe80JAbcoEKIl27R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzXKCYZqsDwRw2-wBh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwHBu-8qwDqXQKNPZR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEvHvMCdYK9vRJnUR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzSt7HNlfaUCw553I14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy553FhxvPqD_nDO2F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
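A minimal sketch of how a raw response like the one above could be parsed and indexed for the ID lookup described earlier. This assumes the model returns a JSON array of objects with an `id` field, as shown; the variable names here are illustrative, not part of any actual pipeline.

```python
import json

# Raw LLM response: a JSON array of coded comments (two entries from the
# example above, abbreviated for illustration).
raw_response = """
[
  {"id": "ytc_Ugxv3IrsPaUE9nh4ht54AaABAg", "responsibility": "elite",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzIe80JAbcoEKIl27R4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index the coding results by comment ID so any comment's codes can be
# looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up by comment ID.
record = codes_by_id["ytc_UgzIe80JAbcoEKIl27R4AaABAg"]
print(record["policy"], record["emotion"])  # liability fear
```

In practice a model may wrap the array in extra text or emit malformed JSON, so a real pipeline would want to validate the parse and the expected keys before indexing.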