Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
At this point I'm siding with Ai because real artists are being so fucking whiny…
ytc_Ugx1unhl8…
The Terminator scenario is the worst scenario, imagine having an A.I working the…
ytc_UgzP3tlnR…
AI will just be a novelty until they remove these preprogrammed biases and ideol…
rdc_ks42riu
If I had A.I. I would just send it into space with the goal of creating time tra…
ytc_UgxT6R9L5…
Honestly, we don't know enough about the human brain to re-create it in AI prope…
ytc_UgxwBa9Qu…
It's important to remember that "AI" is a language model, it models language not…
ytc_UgxMJkycK…
I think this video was done more so for fun instead of an actual fully thought o…
ytr_UgzlLct8F…
"Autonomous offensive weapons"
It doesn't matter what these guys think even in …
rdc_cthq8yf
Comment
If AI is allowed unregulated in the west because of China 🇨🇳, then perhaps small and semi authentic lethal weapons shouldn’t be regulated in the Europe as opposed to the US. Unregulated AI in the military included will be the beginning of the the end of mankind. We are in the age of leadership by the incredibly unintelligent people in high power positions
youtube · AI Governance · 2025-06-29T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwGrfHHXuX1i-qNyTd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx0zX5aDM0OEFpUUL14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw6pBFqR7pz7t8Srl54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFGalfNcxBP9yH4Nl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9Z-X30WnMhp1OUOZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxGcR7_R7Qvr8MeJN54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy0cF1yBc17qsboVpJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwGjhqRTDx1wKdg1Ph4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz9EkrlgORT4o__cBt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxmsaCZGoPsSARqFtB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
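The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed by comment ID is below; the `ALLOWED` value sets are assumptions inferred from the values visible in this sample, not an exhaustive codebook, and `parse_coding_response` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Assumed allowed values per dimension, inferred from this page's sample
# output; a real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "approval", "indifference", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    silently dropping records with missing or unknown dimension values."""
    records = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if cid and all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[cid] = {dim: rec[dim] for dim in ALLOWED}
    return records

# Usage: look up the coding for the comment shown in the table above.
raw = (
    '[{"id":"ytc_Ugz9EkrlgORT4o__cBt4AaABAg","responsibility":"government",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]'
)
coded = parse_coding_response(raw)
print(coded["ytc_Ugz9EkrlgORT4o__cBt4AaABAg"]["policy"])  # → regulate
```

Validating values before indexing makes malformed or hallucinated model output fail loudly at lookup time (a `KeyError` on the missing ID) rather than propagating bad codes into the dataset.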