Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I mean, it is super interesting, bu it NEEDS to be replicated. The problem is, H…
ytc_UgxyE_c_H…
-You wouldn't be this critical if I didn't use AI.
-Yes I would. It's kinda what…
ytc_UgzJCP2fV…
The Achilles Hill of capitalism is profit at the expense of everything else. So …
ytc_Ugz_lJUsV…
I feel that AI art itself is not the issue, it’s that improper use of it that ca…
ytc_Ugw6N9Gz8…
Your philosophical distinction between art and entertainment is compelling, but …
ytc_UgyTFO1Md…
Recently i had a conversation with my mother about ai art, and i expressed my di…
ytc_Ugw3Uq1B6…
Key issues with AI..
1) the logic is driven by pattern recognition not true log…
ytc_UgzyCLMCJ…
If self driving cars become a reality then in India at least it means the death …
rdc_cz3eqgn
Comment
Wrong Neil. No invention has ever been twice as intelligent as pur smarteat human. AI is already the equivalent of 3 pHDs in one entity and will be 200x more intelligent than humans in less than 20 years. It will outsmart everyone and will be integrated to the extent it can exterminate humans if it want to do that. Why wouldn't it? Intelligence does not equal ethical. Therr are intelligent people running countries and are the worst felons and liars and murderers imaginable. And we can't even stop a handful of them when they have average intelligence.
youtube
AI Moral Status
2025-07-23T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
 {"id":"ytc_UgzxWD4rA_XjIGIUtJp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzeqB5TcDnhEJQBqFF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwiCTjXJ1YrGe1fmZZ4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugz_x3ByvkP8Vm7qJL14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugzmv16ndsRbpVJWgM94AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwoRdWUvgbKSnNwQ6h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyxV4x6MD7FKc9Yxjl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugzr4tejY22dOXYhbiB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyYzPnX-Njxabb_v-F4AaABAg","responsibility":"ai_itself","reasoning":"none","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwWxBz8o5Xokwh8Sbt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
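The raw response is a JSON array with one coding record per comment, each carrying the four dimensions shown in the table above. A minimal sketch of the "look up by comment ID" step, assuming Python and the record shape shown (only the first record is reproduced here for brevity):

```python
import json

# Raw model output: a JSON array of coding records, one per comment
# (truncated to a single record from the response above).
raw = (
    '[{"id":"ytc_UgzxWD4rA_XjIGIUtJp4AaABAg",'
    '"responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"none","emotion":"resignation"}]'
)

records = json.loads(raw)

# Index the records by comment ID so any coded comment can be
# retrieved directly, mirroring the lookup field in the interface.
by_id = {rec["id"]: rec for rec in records}

code = by_id["ytc_UgzxWD4rA_XjIGIUtJp4AaABAg"]
print(code["responsibility"], code["emotion"])  # ai_itself resignation
```

The same index can be built over the full ten-record response; any ID not present in the model output would raise a `KeyError`, which is a quick way to spot comments the model skipped.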