Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I’ve always heard that we’re (the public) 50 years behind on technology that our government/Military is already using. If we’re 5-10 years away from AI taking over are we truly already there? Maybe it’s not AI that will back itself up but some mad scientist dictator type person. This scares me more than nuclear war. At least with that there’s a chance that some human with common sense could stop it.

youtube · AI Governance · 2023-07-07T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
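A coded row like the one above can be sanity-checked before it is stored. The sketch below validates a row against the category values that actually appear on this page; the full codebook is not shown here, so treat `SCHEMA` as an illustrative assumption, not the complete set of allowed values.

```python
# Allowed values per coding dimension. NOTE: these are only the categories
# observed in the raw responses on this page, not necessarily the full codebook.
SCHEMA = {
    "responsibility": {"government", "ai_itself", "none", "distributed",
                       "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear", "regulate", "ban"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "outrage"},
}

def validate(row: dict) -> list:
    """Return a list of problems with a coded row (empty if it conforms)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result from the table above passes the check.
row = {"responsibility": "government", "reasoning": "consequentialist",
       "policy": "unclear", "emotion": "fear"}
print(validate(row))  # [] — every dimension holds an observed category
```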
Raw LLM Response
```json
[
  {"id":"ytc_UgyZfFJJCPiXRjmMxgl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugya96xncARCDGs5nDd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyiaXFFlGp5iseW-pN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzeWz3vOtBBlHXqtYd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyR_phxuA-QQdGET2J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxEv_30Qtz-BNiQFpp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx59qUk0UpFx0n1Gyd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyuecJIyJvXW9QPEKZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz_sh-qNkH_yRXolWZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx-AERNMFRP_lJ9D1l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
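The batch response is a JSON array keyed by comment `id`, so recovering one comment's coding is a matter of parsing the array and indexing by ID. A minimal sketch, assuming the field layout shown above (the two-row `raw_response` is a shortened stand-in for the full batch):

```python
import json

# A raw batch response from the model: a JSON array of coded comments.
# Shortened here to two rows taken from the response above.
raw_response = """
[
  {"id": "ytc_UgyR_phxuA-QQdGET2J4AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyuecJIyJvXW9QPEKZ4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

result = coded["ytc_UgyR_phxuA-QQdGET2J4AaABAg"]
print(result["emotion"])  # fear
```

This is also the row rendered in the coding-result table above: the government/consequentialist/unclear/fear entry matches the comment's ID in the batch.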