Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
- Thank you for your comment! If you're referring to Sophia's design as a unique s… (`ytr_UgxIf4QhE…`)
- Robots work 24/7 have no personal problems, don't take maternity leave, don't ta… (`ytc_Ugwl1IAn9…`)
- AI is the Anti Christ. When the government comsa knocking offering you this chip… (`ytc_Ugx9vc6_l…`)
- If someone can come of with a better AI model that is more effecient that dont n… (`ytc_Ugy1iS156…`)
- Do I ask AI for health advice? Yes. But I research the answers it gives online, … (`ytc_Ugy6pH7EW…`)
- I don't know why people think AI is better than normal art but that AI is inevit… (`ytc_UgyBlSiJt…`)
- I recommend you get a blocker of sorts. I got one for YouTube, it blocks shorts … (`ytr_Ugyxlb6OC…`)
- Doesn't space and finite resources kind of prevent a post scarcity reality? I do… (`rdc_kiuewjf`)
Comment
It's funny how a man who spent his life to develop a specific technology then suddenly alarms everyone that the very same technology can basically destroy everything. To me it just sounds like crazyness. Science sometimes can just create crazy, out of their head, annoying arrogant, greedy, materialistic, nihilists, feeling omnipotent people. That is your typical tech company CEO/employee who "wants to make the world a better place", talks and seems to reason very politely, then proceeds to invent the nuclear bomb, the next gen weapon or AI, then, MAYBE regrets to have done it.
Source: youtube · AI Governance · 2025-06-18T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugz9FbfchMBjPT3WB_94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugyc-wwKGpuvyRaLD5p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzTK7WVxVE6s_N19nl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw3bNlZZHkP0Z6ejqd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz8KyJ17DGmDo6lcxx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwfhsOEvK5NMtwOdEp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy7kf24I0m5XPDTXsV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"sadness"},
{"id":"ytc_UgyMXSGhHs2qccH4PhB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz7joVTsIQjEvaq5lx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyELf_5uSO1YXy7-914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}]
```
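A raw response like the one above is a JSON array of per-comment codings, so looking up a comment by its ID reduces to parsing the array and indexing on the `id` field. A minimal sketch (the shortened IDs and helper name `index_codings` are illustrative, not part of the tool):

```python
import json

# A sample raw LLM response in the same shape as the one shown above
# (IDs shortened here for readability; values are illustrative).
raw_response = """[
  {"id": "ytc_Ugz9Fb", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugz8Ky", "responsibility": "developer", "reasoning": "virtue",
   "policy": "unclear", "emotion": "outrage"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index each coding row by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
# Look up one comment's coding by its ID.
print(codings["ytc_Ugz8Ky"]["emotion"])  # -> outrage
```

The same index can back the "Look up by comment ID" box: a missing ID simply raises `KeyError`, which the UI can surface as "not coded yet".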