Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
@mobileapp9056 if I were you, I'd be more afraid of who is in control of it all. A Terminator type of takeover isn't likely to happen with A.I. it's the fact that one day a machine will be doing your job. That means there is no use for you anymore. Why keep you around if all you're doing is wasting resources? That's the real dangers of A.I. It isn't the A.I. you need to fear, it's the people who control of everything. I get why Elon Musk doesn't elaborate why we knows that A.I. is dangerous. If he did, he would not be around anymore.
youtube · AI Moral Status · 2023-09-17T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgzPOGIbKhXXjKpGS_N4AaABAg.9ulq89HRhV39umEGWxH3NH","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugxvkc-VR3LfIV50cWB4AaABAg.9ulpO_Y8Vtt9ump4-Vdccn","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugx1YFz1y020ea4o8dl4AaABAg.9uliYG-HlGO9um40esdTtv","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgzMQChBE6FDfGtmnKl4AaABAg.9ulK-p_JXDZ9ullyHjPmsv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgzfB2r9kalqD2Olnex4AaABAg.9ulGpXeTU0Y9ulcBS7_AZI","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxHhWD8ThgoWLqM-Vx4AaABAg.9ukVv84HsHn9ulKtPTmnKB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugznv9b2ANtGRO8W6ON4AaABAg.9ujOs5KDnL-9uk_-E9qI1M","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugznv9b2ANtGRO8W6ON4AaABAg.9ujOs5KDnL-9ukdC4cfK4u","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytr_Ugznv9b2ANtGRO8W6ON4AaABAg.9ujOs5KDnL-9ukhZlez02E","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxZ79lvv_KYiNs2ksJ4AaABAg.9ufuCLPtOvt9umKqnxXtBc","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
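A coding-result table like the one above can be recovered from such a raw batch response with a small parser that validates each record against the codebook before accepting it. This is a minimal sketch, not the project's actual pipeline; the allowed values per dimension are assumed from the categories visible in this sample response, and the real codebook may contain more.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred only from the
# categories that appear in this sample response, not from the full codebook.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "indifference", "approval", "mixed", "unclear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping any record with a value outside the expected schema."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-record response for illustration ("ytr_abc" is made up).
raw = ('[{"id":"ytr_abc","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_llm_response(raw)
print(coded["ytr_abc"]["emotion"])  # fear
```

Validating against a fixed schema catches the common failure mode of LLM coders inventing labels outside the codebook; rejected records can then be queued for re-coding rather than silently polluting the dataset.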