Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- AI is a LOOOONG way from replacing developers. Systems are just too big for it e… (`ytc_UgxyXrsXS…`)
- Can you even imagine politicians agreeing to giving everyone a minimum universal… (`ytc_UgwtuXRT6…`)
- 7:45 ish, i find this a very interesting topic as well. lots of people would say… (`ytc_UgzFdL847…`)
- „But how is it different from an artist browsing multiple images to get inspirat… (`ytr_UgzLGIrz6…`)
- I have to add, after some more experimenting with AI tools, there are AIs that d… (`ytr_UgzyKHTtt…`)
- The chick robot ignored the interviewers comments to answer the other AI and the… (`ytc_Ugy2YZy0e…`)
- This is deeply flawed. If AI automation makes and provides services, who’s going… (`ytc_UgxHik0uj…`)
- I wonder ... If all the AI data centers were destroyed, would all this just disa… (`ytc_Ugwc3gPiG…`)
Comment (shown verbatim as coded):

> None of this anymore shocks me apart from utter ignorance of majority of people. Also, whta is not clear to me is when AI replaces us all, what is the end game? who will be buying the stuff produced by the AI. we will not have jobs, hence will not have resource to purchase. We will die of powerty.... will AI just kill us as there will be zero purpose to keep us alive? then what? it is all self distrucive. The end game is not clear..... apart from total distruction. SO is that what the likes of Sam Altman and the rest of the pack whant? Is it their end game? Or are they really that stupid and self absorbed that they have a rocket ready and a back pack and believe they will live on MArs when they wrack all on Earth? POwer and money will be useless to them when they have killed live on Earth...........

youtube · AI Governance · 2025-09-05T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwBHoCc-ZmAi0WrlYt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw45vUD8bnPPqzsB-p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhRqlr7rkzkKtcWuZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwJqQu5OT0phwQc-_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwc3lrC63EynRm_G814AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwl7JPJUxMN8QMLUR94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyh5e_MBiSQrIADwzF4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyCEsxJjwL1FVR_yEJ4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzvMKAl8NAyTiQvw954AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxs83NwUbXaukgZN-14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
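Raw batch responses like the one above can be sanity-checked before they are written back to the dataset. The sketch below parses such a response and flags records whose values fall outside the category sets observed in this sample; the allowed sets are inferred from the output shown here, not from the full codebook, and the helper name `validate_response` is hypothetical.

```python
import json

# Category values observed in the sample response above. The real codebook
# may define additional options, so treat these sets as illustrative.
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "unclear", "none"},
    "emotion": {"fear", "approval", "outrage", "indifference", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response; keep records whose coded values
    all fall in the known sets, and report any that do not."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        problems = [
            f"{dim}={rec.get(dim)!r}"
            for dim, allowed in ALLOWED.items()
            if rec.get(dim) not in allowed
        ]
        if problems:
            print(f"{rec.get('id', '<missing id>')}: unexpected {', '.join(problems)}")
        else:
            valid.append(rec)
    return valid
```

A model that drifts from the schema (a misspelled label, a dimension left out) then shows up as a printed warning rather than a silently miscoded row.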