Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "these are actually the emotions polygon feels every day but he's just pretending…" (ytc_UgzKboi0D…)
- "I believe a real show, Would allow random people interact with the robots. Havi…" (ytc_UgwOHq3IM…)
- "AI is only as [ insert anything ] as its makers and subjects it learns from.…" (ytc_UgyUjEA5u…)
- "AI isn't capable of taking all jobs. Not even most jobs. That would take machine…" (ytc_Ugy0MeeMp…)
- "0:27 When there is a 1:6 chance AI destroys life on earth and there are 7 of the…" (ytc_UgxRAQwHR…)
- "Karen Hao is incredibly right, but at the same time incredibly wrong. She undere…" (ytc_UgxGesliM…)
- "I as a random expert of opinions on the internet, think the AI bubble will burst…" (ytc_UgzbTvRsS…)
- "Do not call it AI its a LLM (language learning model) completely bullshit differ…" (ytc_UgzHnCZfC…)
Comment

> it will end up on best statistical chose by Ai and we all know human are not united to save the world. Ai will decided to save the world and make stay without problems or let me say uncertainty they will chose over humans at one point and that will be close when they can exits without the need of human to exist. Or we need to interate with AI and became a cyborg

- Source: youtube
- Topic: AI Governance
- Posted: 2025-07-07T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyM9-GV9ylQkvnoe5V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy3JjbcO9WSiZAyd094AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwjoXI3vrWhdcxLmKx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzhoasHfaoJk4JL7RV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgynsclRVW5hzrQx7Wt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw4P-EQI6itlPf61rd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzc27kJRGbkMdNWwJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwBnnotCe9soiC20sJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwuAjmm9gSJOVIrdIN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw754Q5AI5T09ZB1dV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
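The raw response is a JSON array of rows keyed by comment ID, one row per coded comment. A minimal sketch of the lookup-by-ID step, assuming the allowed value sets per dimension inferred from the rows shown (the real codebook may define more categories, and `index_by_id` is a hypothetical helper, not part of the pipeline):

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the actual codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index coded rows by comment ID,
    rejecting any row whose values fall outside the codebook."""
    indexed = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        indexed[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return indexed

raw = ('[{"id":"ytc_Ugw754Q5AI5T09ZB1dV4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
coded = index_by_id(raw)
print(coded["ytc_Ugw754Q5AI5T09ZB1dV4AaABAg"]["emotion"])  # fear
```

Validating at ingest time keeps a malformed or off-codebook model response from silently entering the coded dataset.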