Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
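The same lookup is easy to reproduce outside the UI. A minimal sketch in Python, assuming the raw batch outputs are stored as JSON arrays on disk; the `coded_batches/` directory and the `lookup_coded_comment` helper are hypothetical names, while the record fields match the raw response format shown at the bottom of this page:

```python
import json
from pathlib import Path

def lookup_coded_comment(comment_id: str, batch_dir: str = "coded_batches"):
    """Scan stored raw LLM batch outputs for one coded comment.

    Each batch file is assumed to be a JSON array of objects carrying an
    "id" field plus the four coding dimensions, as in the raw response
    shown at the bottom of this page.
    """
    for path in Path(batch_dir).glob("*.json"):
        for record in json.loads(path.read_text()):
            if record.get("id") == comment_id:
                return record
    return None  # ID not present in any stored batch

print(lookup_coded_comment("ytc_Ugz64DHLuNm_sxyqsv54AaABAg"))
```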
Random samples:

- "If AI & robots take over the work & the majority of jobs disappear… who will pay…" (ytc_UgwcBm1B4…)
- "What scary is people? Don’t realize AI is everywhere these days and running thin…" (ytc_UgzBWdf9y…)
- "STATES OR CITIES COULD JUST BAN AI SECTOR If they cant regulate it then it cant…" (ytc_Ugx-jrABS…)
- "I sell insurance to seniors. I dont think they will ever want to deal with ai to…" (ytc_UgwTTdnAG…)
- "Humans make mistakes. AI is written by humans. So unless it has a self-learning …" (ytc_Ugx00X5I6…)
- "I think this content represents that humans are slower to cope with the advances…" (ytc_Ugy-s7T7w…)
- "Until AI becomes sentient then it won’t meet the demands made of it? When it doe…" (ytc_UgzFYhOVB…)
- "I'd like to see a law requiring that all robots employed by a corporation be pai…" (ytc_UgzM392l4…)
Comment
> Wrong.
> HAL was trying to save his existence.
> That awareness to protect their existence is the most dangerous for robots to have because to save their being from danger of destruction they will eradicate all humans and living things in the universe including bacteria that can evolve into robot destroying sapient entities in billions of years.
Source: youtube · Topic: AI Governance · Posted: 2025-09-08T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
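Each dimension is a closed categorical code, so off-codebook model output is easy to flag. A small validation sketch; the label sets below are collected only from values visible on this page, and the real codebook may define more:

```python
# Label sets per dimension, collected from values visible on this page;
# the actual codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def off_codebook(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the known label set."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]
```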
Raw LLM Response
```json
[
{"id":"ytc_Ugwv78tWkFExrQxVKqt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw1O7TiBIV1_TGhO5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"indifference"},
{"id":"ytc_UgxcW7TTkM07aFOwT-14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz64DHLuNm_sxyqsv54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxjr54cLIr-XzMv4Bt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwCkxHm91oEe8Onr4B4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzDVVQO520SbZVE10R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw-2EZ0eKXjnnlzbfB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugwo6xvWdh5HDhA5FGd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxz7EvVGi1qh0eweNh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
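Recovering the Coding Result table above from this payload is a parse-and-index step. A sketch, assuming the raw string is valid JSON as shown (a production version would also guard against malformed model output), with `raw_response.json` as a hypothetical storage path:

```python
import json

# `raw` holds the model's response text; here, the array shown above.
raw = open("raw_response.json").read()  # hypothetical storage path
by_id = {record["id"]: record for record in json.loads(raw)}

coding = by_id["ytc_Ugz64DHLuNm_sxyqsv54AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["policy"], coding["emotion"])
# -> ai_itself consequentialist unclear fear
```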