Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The fact that under capitalism the idea of automation being used to reduce work …" (`ytc_UgxClAQhT…`)
- "49:41 dude lost me here, four legs vs two legs is a very poor rebuttals to the h…" (`ytc_UgwXcsnuA…`)
- "People who are "born with skill(s)" always can improve. People who learn the ski…" (`ytc_UgyGYh2J6…`)
- "Maybe your business isn't profitable then if you can end up having to pay so muc…" (`rdc_czlbzr9`)
- "Anybody who uses AI to make art is a fraud, literally taking credit for work of …" (`ytc_UgxmgipcT…`)
- "The only good use of AI is for shits and giggles / I hope SORA adds a watermark…" (`ytc_UgyDVUsTQ…`)
- "@TheJeffKirkley well, if you spend millions of dollars on development an algorit…" (`ytr_Ugw0FgH85…`)
- "Owners of AI? Why would AI want to ruled by some humans? What stops AI becoming …" (`rdc_mxywb09`)
Comment
World Wars pushed the Binary Code and Computers, also Pushed the Nuclear Power, we destroyed a lot in the progress, "we simplified life" thats the story they sold us, sacrificing our ways and connections and humanity and creation. AI became a reachable Goal, the worry started long time ago, we saw "the matrix" "A Space Odyssey" "Ghost in The Shell" we thought they were fantasy, we hurt earth doing our progress then digital world became so important in our lives, more so in covid, and then the large developpement of AI and how it scaled lately, and the energy required to reach Super Intelligence (nuclear power ofc) it all goes into a very logical forward path together, i hope the human greed wont end up ruining us.
Source: youtube · AI Governance · 2025-11-02T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxNWrT61uzo2WmM-gl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyZMEfVoFkVkfTJTQ94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwI5-k1zTFYOGGsqgx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzMAAGqVypPD4TA0ch4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxpqZSsLDhc1WoIPQd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxbdWb7Ip6ZtpG_m7d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzaCXgbbgzl97J7nt94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugyrfekms2e8Hpk2I9t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyca8mYWfCP3f8mJ854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzBN6XS9bOh7wfEyQl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
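A raw response like the one above can be parsed and checked against the four coding dimensions before the rows are stored. The sketch below is a minimal Python validator: the allowed values per dimension are inferred only from the coded samples shown here (the real codebook may define more categories), and the names `SCHEMA` and `validate_batch` are hypothetical, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the samples above.
# Assumption: the actual codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "liability", "ban", "industry_self"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and check every row against SCHEMA."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows

# Hypothetical single-row batch for illustration.
raw = (
    '[{"id":"ytc_example","responsibility":"distributed",'
    '"reasoning":"mixed","policy":"unclear","emotion":"resignation"}]'
)
print(len(validate_batch(raw)))  # number of valid rows
```

Rejecting a whole batch on the first bad value is deliberate here: a malformed row usually means the model drifted from the prompt's label set, so the safer default is to re-run the batch rather than silently coerce labels.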