Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- "0:18 OK but what point is the guy talking about digital art tools trying to make…" (ytc_UgzTlslgo…)
- "I'm an AI engineer at Intel, I'll tell you ashneer is wrong. Slapping polozhenie…" (ytc_UgxtDQDIY…)
- "I mean have you seen all the stories of people who are like dating AI and they'r…" (rdc_nk445sa)
- "Back when Ai generators were only good at making vague 'creatures' and faces, I …" (ytr_UgyxYETak…)
- "Intelligent Man or Machine? 1) AI is such a very impressive improvement by the 2…" (ytc_Ugxh47jYa…)
- "Imitation is not stealing. Also, which one is it? Does how the art look matter …" (ytc_Ugzp2xm7_…)
- "Don't use AI. AI is a business strategy that makes people reliant on 1 bias pe…" (ytc_UgwkQnbgM…)
- "So you have evidence of this or just came up with it in the shower?…" (rdc_oh37sdh)
Comment
People think it's all science fiction, but imagine the smartest humans to ever live. Now imagine someone significantly more intelligent than them.
Would that person want to work beneath everyone else? Would they be willing to take commands from some 13 year old telling them to make 6 - 7 memes?
If they ever successfully create an AI super intelligence, how do they intend to control something smarter than themselves, and with massive amounts of processing power backing it up?
And without much AI regulation & companies being unable to control what their own AI does half the time, there's a good chance it just does what it wants to do.
Source: youtube — "AI Moral Status" — 2025-10-30T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz7To3N3bTqWHRXAWd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzg3My9h6MiHmdkDD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzS6P_qp6JJzzMBB394AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzLgdhp4_xZ5n82po54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMJlOHwQNVVDW5kz14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMu7jkPZ781oZvapV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugxo6c3EvZkZGen8eaN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz0MG1VkiFCZxQxg794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx3nSuDFDjpcBaDBdF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUrlFSrmKEOxF9n-N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
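The raw response above is a JSON array with one coding object per comment, each keyed by a comment ID. A minimal sketch of looking a coding up by ID (the two sample rows are copied from the response above; everything else is illustrative):

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment.
# These two rows are taken from the sample response above.
raw_response = """[
  {"id": "ytc_Ugz7To3N3bTqWHRXAWd4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzg3My9h6MiHmdkDD54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

# Index codings by comment ID so a single comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugzg3My9h6MiHmdkDD54AaABAg"]
print(coding["responsibility"])  # ai_itself
print(coding["policy"])          # liability
```

The dict comprehension assumes comment IDs are unique within one response; a duplicated ID would silently keep only the last row.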