Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Only a matter of time before these robot chicks start talking back and saying ho… (ytc_UgzI9no8q…)
- The reason why OpenAI and IBM are pushing for this is because they are afraid th… (ytr_Ugzlq40G6…)
- This article is honestly so funny. It’s an opinionated retelling of a [BBC artic… (rdc_l9wnqzc)
- You have to wonder... If AI can co-opt music this easily (because apparently peo… (ytc_UgzPV5OZB…)
- See the thing is we wanted AI to do rhe work for us so we can so our hobbies and… (ytc_UgwZHojym…)
- Yeah it sucks at refactoring functions in complex code bases. I’ve tried the bes… (rdc_n4e6wsm)
- Elon and friends should tell themselves “NO, I SHALL NOT BUILD TECHNOLOGY THAT D… (ytc_UgyQ5xVZ8…)
- People who don’t understand our easily influenced the ai is to cater to our word… (ytc_UgzlnGdI_…)
Comment
Did I miss something essential in the video? Apart from the whole meme thing, how exactly are LLMs supposed to destroy the world? This whole apocalyptic “we’re all gonna die” shtick is getting a little tiring. We do not have AI as in “it understands the world around it” and there’s not a lot of people (who don’t have a vested interest in the hype because it makes them money) who claim we’re gonna have anything resembling an AGI anytime soon.
youtube
AI Moral Status
2025-12-11T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxoPoTlXCFI1dUOTDJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx4MOq-owUpi0-ZEBJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyGlkjLMkCDfMkYFhd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqVdp213tr8vJFnU14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwzR5UQT7CqnVeh9IB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqrSQyJtwwfbVpXtB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxpf-vxcrgb1T0odzx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyJmj2gRAhFJ31m_VV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwI6TAkANw8QatwkYJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzOEn6CQQWl0pZ9-T14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
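The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each dimension in the coding table. A minimal sketch of parsing and validating such a response in Python, assuming the allowed category values are exactly those visible in this output (the actual codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# This is an assumption: the real codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "user", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose every
    dimension carries an in-codebook value."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical one-record response for illustration.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]'
print(parse_codes(raw))
```

Filtering rather than raising keeps a batch usable when the model occasionally emits an out-of-codebook label; dropped records can then be re-queued for recoding.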