Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- These so-called "AI ethicists" are just attention whores. This guy clearly doesn… (ytc_UgwJPYbjU…)
- So called AI will replace managers before regular workers. Because managers don'… (ytc_Ugyl26rBd…)
- the game itself hints at problems as humans tend to avoid each other by then. to… (ytr_UgxSLnOWw…)
- We don’t know what consciousness is. So for all we know, maybe everything is con… (ytc_UgyRWqD9u…)
- "Geoffrey Hinton outlines the individual dangers of AI, each already deeply unse… (ytc_Ugxy3lXEA…)
- if we’re all training the AI’s (and we are) then we need to train them to be pol… (ytc_UgzMu1tug…)
- Yes, that's the reason for the creation of AI to elimination human employment an… (ytc_UgysHjFK1…)
- Scientist always ask if they could but never if they should. Same goes for engin… (ytc_Ugy-PDDy2…)
Comment
I find the first three quarters of this podcast is extremely interesting, and valid. We have something coming which we created and learning and thinking with much much faster than humans can do. Than the last quarter about religions and messing with developments and science, programming, AI, with super natural is something more to be discussed with neuroscientists and psychiatrist.
What we dont know, we simply dont know. No need to try to find an other agent behind that.
(Man invented God, now Dr Yampolskiy reinvent God) .
And yes, I think we will extinct, such as 99.9% of the species, ever lived on this planet. Maybe unregulated AI will lead us to that point by 2100. ("God knows".... :) but I dont care.. )
youtube · AI Governance · 2025-09-04T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwB3Dea13OYLdzh3ot4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1JtASOi34qrucEQt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzsSpXmb_7GCrvOXO54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy8oTB0-BpHmzJIwP94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx-X5QoJaXEvFaCyX94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxIVP4HEW9Xua7J59F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwILGjTeKbEBeGTxGN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyiHJxnt2yU9vlL0pt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyDH0z5XZmCcmioRbx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwgMgYwfeBw_C753uN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"}
]
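The raw response above is a JSON array of per-comment codes, one object per comment, each carrying an `id` plus the four dimensions from the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a payload might be parsed and tallied — the dimension names come from the rows above, but the function names and the validation rule are illustrative assumptions, not the tool's actual implementation:

```python
import json
from collections import Counter

# The four coding dimensions seen in the response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record is usable only if it has a comment id and all four dimensions.
        if "id" in rec and all(d in rec for d in DIMENSIONS):
            valid.append(rec)
    return valid

# Two records copied from the response above, for demonstration.
raw = '''[
  {"id":"ytc_UgwB3Dea13OYLdzh3ot4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx1JtASOi34qrucEQt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]'''

codes = parse_codes(raw)
emotions = Counter(rec["emotion"] for rec in codes)
print(emotions)  # Counter({'fear': 1, 'indifference': 1})
```

Filtering malformed records before counting matters here because LLM outputs occasionally drop a field, and a missing key would otherwise raise mid-tally.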