Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- It costs me and the environment far far less to make fun AI pictures than for a … (ytc_Ugw7WmTZO…)
- Unfortunately the worlds governments are in a desperate race to come out on top … (ytc_UgyXfX7I2…)
- Self defense does not equal amoral sociopathic behavior. All consciousness has a… (ytc_Ugw_zuvy3…)
- 1 ai prompt takes 10 times as much power as 1 Google search and ai databases are… (ytr_Ugwusu5Zr…)
- I'm afraid of AI. Not because it's "slop" or about the "soulless" aspect of it. … (ytc_UgwrDLzi2…)
- AI has helped me become an artist because I can ask it something like, "Which gl… (ytc_UgxkB1H5c…)
- I read the FAA is expanding their facial recognition usage after they found it t… (rdc_jv5z0hd)
- I have an AI generating 100% of my company's code. Doesn't work. Never gets revi… (rdc_oht3qx8)
Comment
The REAL worry with AI is the same as it is with fossil fuels. Not that it outright "destroys" us. But that we become dependent on it in some innocuous way that we do not realize, but that slowly creeps into society in such a way that we're unaware but leaves us in a position just like fossil fuels, where we suddenly find out they are a huge problem we did not realize due to CO2 emissions, but we cannot stop using without billions of people starving to death. Why will we be dependent on AI? We wont know until it's too late. How will turning it off fundamentally harm us? Again, we wont know until its too late.
youtube · AI Governance · 2025-10-15T14:0… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwndzV0b_pd872B6MN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwuUq5LPr_Cy97_pHJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx5KNSzBlJHOuDjvSR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx5BJ1c2qU5Enjt9sd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgycAg8VmvNgc8Y73X54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugzof8gms3CEewnikQx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzzZ9Q9QQSdcHk3Ycx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzl3OaI9Eh4nLbYz7J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzpFbTHOANObjowEVh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy0aa2SaObu4P3est54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
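The raw response is a JSON array with one object per coded comment, carrying the four dimensions shown in the Coding Result table. A minimal sketch of how such a payload might be parsed and checked for out-of-vocabulary values — note the allowed sets below are inferred only from the values visible in this sample, not from a full codebook:

```python
import json

# Value sets observed in this sample response; the real codebook
# likely contains more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={value!r}")
    return rows

# Example payload in the same shape as the response above (hypothetical id).
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
rows = parse_codings(raw)
```

A check like this is useful because LLM coders occasionally emit labels outside the requested vocabulary; failing loudly keeps bad labels out of the coded dataset.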