Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "many people are like, find a flaw in AI and announce that it'll never beat us at…" (ytc_UgyMiZwBF…)
- "Microbiology is a lot more than just microscopy. Sure a machine and AI can run m…" (ytc_UgzXaylrs…)
- "She's a numpty. Her answer to the value of advertisement had as much value as th…" (ytc_Ugwbbv7Sd…)
- "Bernie, you are feeding the AI hype machine and promoting Musk's wacky ideas. Th…" (ytc_UgzJMW0eE…)
- ""The probability that we rush ahead" - they were speaking of this as a risk to e…" (ytc_UgxqV2Vek…)
- "Are you really suggesting YouTube finding yet another way to not pay content cre…" (ytc_UgzwyARS_…)
- "Once this stuff is made illegal, the happier i’ll be, like they are actively inf…" (ytc_UgwQOj0_e…)
- "Touching on something that was saud at the end... how many ppl now actually come…" (ytc_UgxF0qXLZ…)
Comment
Terminator-like robots are not as much a concern yet because self balancing as well as a well Integrated power source for the robot, is still being tackled. Very few companies like Boston Dynamics exist. And the robots are extremely costly and not even commercial yet.
However, if an AI controls the banking system, transportation (especially trains and flights), and one day decides to shut everything down, BOOM. The after effects will be enormous.
That's the more terrifying doomsday, because it's very difficult to fight something which is not physical.
Source: youtube · Topic: AI Governance · Posted: 2023-04-19T09:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
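The table above maps one comment onto four coding dimensions. A minimal validation sketch for such records, using only the values visible on this page (the full codebooks presumably contain more categories; `validate_coding` is a hypothetical helper, not part of the actual pipeline):

```python
# Values observed on this page for each coding dimension.
# Assumption: the real codebooks may define additional values.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "government", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage"},
}


def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty list = looks valid)."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems
```

A record matching the table above validates cleanly, while an out-of-codebook value is flagged.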
Raw LLM Response
```json
[
  {"id":"ytc_Ugxhu-LVQ-4tXDbVz5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyH8KUa42WONCXZQ9R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxa2NNKBfdGi4r51SB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxEYq4LQbsaEtS_-L54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx1L-00R5B6ceNzly14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzGVwB561UWAfTaYbl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzq2PjiNZAI7LrkVtV4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx4nM346Oyq--btG2J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxVcGuat0C5PogVJ2F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzR9MTAj1_zNK2ZDyd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
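The raw response is a JSON array in which each record carries the comment ID it codes. A minimal sketch of the "look up by comment ID" step, assuming the model output parses cleanly as JSON (real LLM output may need error handling; `index_by_comment_id` is a hypothetical helper):

```python
import json

# Two records copied from the raw response above, for illustration.
RAW_RESPONSE = """[
  {"id":"ytc_Ugxhu-LVQ-4tXDbVz5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxVcGuat0C5PogVJ2F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""


def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a batch response and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}


codings = index_by_comment_id(RAW_RESPONSE)
print(codings["ytc_UgxVcGuat0C5PogVJ2F4AaABAg"]["emotion"])  # fear
```

Indexing once up front makes each subsequent ID lookup a constant-time dictionary access rather than a scan of the array.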