Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “The difference between Humans taking inspiration and AI taking “Inspiration” is …” — `ytc_Ugzdk9XL6…`
- “@GharacDurac. Unlikely. Early 80s assembly was easier like on an intel 8086. But…” — `ytr_UgyRNM7S-…`
- “@ are you calling ai "art" handmade? It's not handmade. And he is literally sel…” — `ytr_Ugxof9FKX…`
- “@RustyCog yet still, there are people who claim to be ai "artists". An artist i…” — `ytr_UgwfxZmHL…`
- “Notice: I don’t believe there are any sentient LLMs at the point in which I writ…” — `ytc_Ugx2hJtoL…`
- “Guy lost me at "Bitcoin is a scarce resource"?? hahah What the f.. do you use Bi…” — `ytc_UgyvFJiDQ…`
- “Note: This is system is arriving in Belt And Road Countries (first). Serbia has …” — `ytc_UgzVZjkhV…`
- “Yup. I just fucking snapped the other day when my vibe coding teammate fucking u…” — `rdc_oaft2lt`
Comment
We have the perfect setup for AI to come in and destroy the world. After covid, trust in our institutions and trust in scientific consensus is as low as it's ever been. Even if every scientist in the world said AI is a bad idea. We would probably continue anyways. It's not the average person's fault. Science has become political on both sides. Everyone assumes that everyone else has a motive besides being honest. Even with the side I agree with, I assume I am being brainwash to believe one thing or the other.

Source: youtube · Topic: AI Governance · 2025-09-04T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz3a47Q2o3jZ4ZKVeh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyTNov40IYNhUhULMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxjjplBQ1ilwAy4WBV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1g8rFTxmfdUaDZ_h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzV3MbFohPzOyY-wl54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzvZdzDfRoDnXVwYVl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzkZHRHVkT117W9u9Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzsyk89KjYwb4gPwjJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzwEWiWxfvKNyUGb9J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyj4gsfVUxJLiWDupB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
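A batch response like the one above can be parsed and checked before the codings are stored. Below is a minimal validation sketch in Python; the allowed label sets are inferred only from the values visible in this sample, not from the tool's actual code book, so treat them as placeholders:

```python
import json

# Allowed values per dimension -- assumed from the sample above,
# not necessarily the full code book used by the coding tool.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "mixed", "fear", "resignation",
                "approval", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose labels are known."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # every coding must reference a comment ID
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical one-row response, shaped like the batch above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
print(len(validate_codings(raw)))  # 1
```

Rows with an unknown label are dropped rather than repaired, which makes it easy to count how often the model strays from the code book.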