Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "EXACTLY THIS, Gen AI has a LOT of potential, but is being used by powerfull peop…" (`ytr_UgxSRDsn8…`)
- "Maybe if the AI in Palantir kill programs used by the IDF in Gaza were to be rep…" (`ytc_Ugxb3mO5U…`)
- "I think most people are pretty satisfied. Hes running a minority government. So…" (`rdc_fn5fg07`)
- "but for real.. if most of them will be replaced by ai... at some point there won…" (`ytc_UgzX_Fd3M…`)
- "I think we need an AI-model that will keep conversing with all those morally con…" (`ytc_Ugye3w1EE…`)
- "*I like how even an AI can acknowledge how misleading Jordan Peterson is. 🙄 Basi…" (`ytr_Ugy6eb8tM…`)
- "I don't know about the rest of them but Google's Gemini is wrong a lot. It's a m…" (`ytc_UgyU_RNF2…`)
- "They usually use that argument that you know nothing about our field you know no…" (`ytr_UgxwxZVJQ…`)
Comment
So let’s say then that the likes of Tesla create robot plumbers, it follows that they can invent robot other professions. But doesn’t this mean this is unsustainable? Tesla would eventually destroy its own business, it would go bankrupt? Robots take over humans in physical jobs so you don’t employ humans anymore. Software ai bots can take over the roles involving computers. So if we have 99% unemployment, who will have the money to buy Tesla robots etc? No one will? So Tesla will lose business and revenue. And this would be replicated across other businesses. In fact you would have 100% unemployment because AI can replace CEOs etc.
Source: youtube · Topic: AI Governance · Posted: 2026-04-16T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
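The table above maps naturally onto a small record type. Below is a minimal sketch in Python of how one coding result could be represented and validated; the label sets are assumptions inferred only from the values visible on this page (the project's full codebook may define more labels), and the ID-prefix comments are guesses from the sample IDs.

```python
from dataclasses import dataclass
from datetime import datetime

# Assumed label sets, inferred from values visible on this page.
# The real codebook may contain additional labels.
RESPONSIBILITY_LABELS = {"company", "developer", "distributed", "unclear"}
REASONING_LABELS = {"consequentialist", "mixed", "unclear"}
POLICY_LABELS = {"regulate", "none", "unclear"}
EMOTION_LABELS = {"fear", "mixed", "indifference", "skepticism", "outrage"}


@dataclass
class CodingResult:
    """One comment's coding along the four dimensions shown above."""

    comment_id: str  # e.g. "ytc_…" / "ytr_…" (YouTube) or "rdc_…" (Reddit) -- a guess
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def validate(self) -> None:
        # Reject any label the assumed codebook does not contain.
        checks = {
            "responsibility": RESPONSIBILITY_LABELS,
            "reasoning": REASONING_LABELS,
            "policy": POLICY_LABELS,
            "emotion": EMOTION_LABELS,
        }
        for field_name, allowed in checks.items():
            value = getattr(self, field_name)
            if value not in allowed:
                raise ValueError(f"unknown {field_name} label: {value!r}")
```

Validating at ingest time catches the main failure mode of LLM coding: the model inventing a label outside the codebook.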
Raw LLM Response
```json
[
{"id":"ytc_UgwXu_juwU6nqVgApKB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxRqGx28NGPaFeTyAJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzlidWA8eRtBJ0DqfJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1mxS9NJO27dT8cOR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKqDgBycZhFwOJK7F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwj00gSEUDmjL7MaM94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzN-UcIWyG9WiXRwYV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyEV9xxClYiVYRWu3d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"skepticism"},
{"id":"ytc_UgzuV1JZQ2xRbzl2OH54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxnucDc7OWdhdz3Pxx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
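Because comments are coded in batches, one raw response covers ten IDs, and the record for a given comment has to be pulled out by its `id` field. A minimal parsing sketch, assuming the model reliably returns a well-formed JSON array like the one above (production code would also need a fallback for malformed model output):

```python
import json


def extract_record(raw_response: str, comment_id: str) -> dict:
    """Pull a single comment's coding out of one batched raw LLM response."""
    records = json.loads(raw_response)  # the response is a JSON array of objects
    for record in records:
        if record.get("id") == comment_id:
            return record
    raise KeyError(f"{comment_id} not found in this batch")


# Example against the response above:
#   extract_record(raw, "ytc_UgzlidWA8eRtBJ0DqfJ4AaABAg")
#   -> {"id": ..., "responsibility": "developer", "reasoning": "unclear", ...}
```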