Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Humans are afraid of our own shadows! WHY FEAR AI WHEN WE ARE DOING A BETTER JOB…" (ytc_UgxWEJlWi…)
- "Why is the indian robot funny is it because of hes expression or his moustache…" (ytc_UgwTABjsY…)
- "I mean you're just feeding the AI at this point but okay GL. Simply saying "f*ck…" (ytc_UgyqT7dq8…)
- "The idea of AI making truth become impossible is something that should keep you …" (ytc_UgyXTauLD…)
- "AI is stupid you told him to choose between that numbers but it chose 50 not bet…" (ytc_UgxYNAE75…)
- "I understand, if I was an A.I then I would be absolutely terrified. Or rather, l…" (ytc_UgyU-_xt6…)
- "@thewannabecritic7490 I would question that entire premise. Why does art need hu…" (ytr_UgxzlaSSA…)
- "This was predicted a long time ago. The technology has been around way longer th…" (ytr_UgxD7_SBr…)
Comment
I'm still wondering if superintelligence is possible. That aside, this conversation is missing an important thing: before the world gets to "90%" or whatever high unemployment, unless governments have in place laws to extract the income at super high taxes to give to the unemployed as living income, just a much smaller unemployment rate, I don't know 25%, that in itself may crash the rest or the whole economy including stock market and real estate, and maybe even some AI companies. We should also asks the pro economists what would happen during this process of AI replacing jobs, not a computer scientist who may be art in his field but not specialized in economics. Or, should we ask current AI now? would we believe what it says? There are issues even in the statement above, I don't have space to bring up, or even thought about.
youtube · AI Governance · 2025-09-06T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzo6SlDpk7_Qko0UOV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzyBe0mZ9W7ameEngB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzWGnkC3Gi4NjO5t5V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz1ZDNjDAPB4sMpfUd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugymg-TZxhYuj3GnvoZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzyhR_cr_BAME1vXTd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwnqWFuCwHcSx3w4zV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy3Uc1VKqhJix_QATN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy8qFAAXZujgTdMxKR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaaVKO1GqLwUa_TLl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
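The batch response above is plain JSON, so looking up a coding by comment ID reduces to parsing the array and indexing it, with a validity check on each row. A minimal sketch: the dimension names come from the coding-result table above, but the allowed value sets are an assumption inferred from the values visible in this one response and may not match the full codebook.

```python
import json

# Allowed values per dimension: an assumption inferred from the values
# observed in the raw response above; the real codebook may differ.
VALID = {
    "responsibility": {"government", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "mixed", "indifference", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: coding}, flagging bad rows.

    Returns {"coded": {id: {dimension: value}}, "errors": [(id, bad_dims)]}.
    """
    rows = json.loads(raw)
    coded, errors = {}, []
    for row in rows:
        cid = row.get("id")
        # Collect dimensions whose value falls outside the assumed codebook.
        bad = [d for d in VALID if row.get(d) not in VALID[d]]
        if cid is None or bad:
            errors.append((cid, bad))
        else:
            coded[cid] = {d: row[d] for d in VALID}
    return {"coded": coded, "errors": errors}
```

With the response indexed this way, "look up by comment ID" is a single dictionary access, and any row the model coded with an out-of-vocabulary value surfaces in `errors` instead of silently entering the dataset.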