Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I swear that top exec thinks he's f****ing Mr. Greer from Person of Interest the… (ytc_UgyJL4g1x…)
- From experience... I've seen plenty of younger people rely heavily on AI in orde… (ytc_UgzaAJgwT…)
- As a disabled person who's hands REGULARLY flare up to the point of pain and inu… (ytc_UgzZYppGh…)
- This was men. Thanks for the insight and yeah crazy crazy crazy. PS she invested… (ytc_Ugx0jEbud…)
- Then stop using and relying on AI? I realize that AI is important but it won’t m… (ytc_Ugy8D96sf…)
- The police stuff sounds like the premise of Psycho Pass an animated series. The … (ytc_UgwEGgltf…)
- @AlexW1495 You must know nothing about how people use AI tools, to think that co… (ytr_UgwSzoM6t…)
- The AI model wouldn't go to jail, just like a car wouldn't go to jail for vehicu… (ytc_UgyxJn9rC…)
Comment
It is 100% vital for vital states to not be networked. Things such as military assets, power plants, and vital infrastructure should not be vulnerable to AI control as barriers such as password protection will likely pose very little challenge to AI. I believe AGI would require massive compute so if AI did turn against humans we could shut down data centers (AGI would likely not be able to run on something like a phone or pc).
youtube · AI Governance · 2025-09-07T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzzBksA-HIgEIzLIbR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
  {"id":"ytc_Ugw2OvvGidAsi0DdYll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx0IPOJi8-bKTVZ8_54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx19yKP1Wq6KxNI6cR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw5UZoojznyeqCLE5B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw3Qmh9LjJnJ5nbyEF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwXf21WtB9rMYdIlWl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgztwxJXpcd-vUh5jSZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyFlC0HH3JX4anz_Td4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzIxPKO7roF3p_bwHJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
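A lookup by comment ID over a raw response like the one above can be sketched as follows. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown; the helper function itself is a hypothetical illustration, not part of the tool.

```python
import json

# One record from the raw LLM response above, kept short for illustration.
RAW_RESPONSE = """[
  {"id": "ytc_Ugw2OvvGidAsi0DdYll4AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(RAW_RESPONSE)
record = codes["ytc_Ugw2OvvGidAsi0DdYll4AaABAg"]
print(record["policy"], record["emotion"])  # regulate fear
```

The same dictionary is what the "Look up by comment ID" view needs: parsing once and indexing by `id` makes each subsequent lookup a constant-time dictionary access.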