Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- AI is not intelligent; humans are. AI is highly knowledgeable and capable; human… (`ytc_UgxMASTeN…`)
- How about use AI to grow our food??? Build robots to work the lamd. Hardest work… (`ytc_Ugzm6APv-…`)
- Sure they can. Well not the robots, but the owners of the robots. It just so hap… (`ytr_UgzoR_DKg…`)
- The Gatekeepers are already screaming "AI Psychosis" demanding "ZeroLaw" NGO Cen… (`ytr_UgyuHTmvI…`)
- We'll be lucky to have sentient AI before humans abuse non-sentient AI to create… (`ytc_Ugzma_rVC…`)
- I have the $20 openai sub, it's been ok for me, senior dev, really starting to d… (`ytc_UgzDkZvhf…`)
- Hi Dr. Cellini! Although you addressed that there is an increasing demand on rad… (`ytc_UgyRUEasQ…`)
- Hey @recomckivens8020! Thanks for commenting on the video. Guess the robot serve… (`ytr_Ugw7Suvk2…`)
Comment

> So basically the only way to stop this would be multiple E M P s or a solar flare large enough to knock out whole world.
> Question ? Are we better off being sent back to the dark ages or Dystopian world where we own nothing and are happy being kept like pets ? At this point why would AI even keep us around, unless we become like the Matrix.
> He is wrong tho there are movies where the AI is in control the Matrix for one or Captain Marvel where the alien planet she comes from is run by the Super Intelligence who isn't satisfied ruling its planet by rages war against other planets that won't submit

youtube · AI Governance · 2025-09-08T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
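The coded dimensions above map naturally onto a small record type. A minimal sketch in Python, assuming one record per coded comment; the field names mirror the table and the raw response below, but the `CodingResult` class itself and its example values' pairing with a specific comment ID are illustrative:

```python
from dataclasses import dataclass

# Hypothetical record type for one coded comment. Field names follow the
# table above; the example category values are those seen in the raw
# responses below.
@dataclass
class CodingResult:
    comment_id: str
    responsibility: str   # e.g. "ai_itself", "company", "none"
    reasoning: str        # e.g. "consequentialist", "deontological", "unclear"
    policy: str           # e.g. "none", "ban", "liability"
    emotion: str          # e.g. "fear", "approval", "outrage", "indifference"
    coded_at: str         # ISO 8601 timestamp

result = CodingResult(
    comment_id="ytc_UgxDC_3zij6BDfiHk654AaABAg",
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="fear",
    coded_at="2026-04-26T23:09:12.988011",
)
```

Keeping the categories as plain strings matches the raw model output; a stricter pipeline might swap them for `Enum` types to reject unexpected labels.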
Raw LLM Response
```json
[
  {"id":"ytc_UgwUQnS3oS3WuSuvwSZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWC4P9lsvqf2GLa6R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxDC_3zij6BDfiHk654AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1cBCGy7_HHe6VZnt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxShrdIE4Qp6FetjdN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz3kmrG_l9IRKqA9a14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyBzgr6QRbxdw5TCWp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw-3bVPJ53CaZkVsxp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwjn5n0Zpcz5A4PJH94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwZn5yIg6F6_0fn3J54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
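Since the raw model output is a JSON array of records, looking up a single coded comment by its ID amounts to parsing the array and indexing on the `id` field. A minimal sketch in Python; the two records are taken from the response above, and the variable names (`raw_response`, `by_id`) are illustrative:

```python
import json

# Two records copied from the raw LLM response above.
raw_response = '''
[
  {"id": "ytc_UgxDC_3zij6BDfiHk654AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz1cBCGy7_HHe6VZnt4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
'''

# Parse the array and build an index keyed on comment ID.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up one coded comment by its ID.
rec = by_id["ytc_UgxDC_3zij6BDfiHk654AaABAg"]
print(rec["responsibility"], rec["emotion"])  # -> ai_itself fear
```

Because IDs are unique per comment, a dict comprehension gives O(1) lookup without any extra dependencies.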