Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
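The lookup-by-ID workflow can be sketched in Python. This is a minimal sketch, assuming the raw batch response is available as a JSON string in the array shape this page displays; the variable and helper names here are hypothetical.

```python
import json

# Hypothetical raw batch export; the structure mirrors the
# "Raw LLM Response" JSON array shown on this page.
raw_response = """
[
  {"id": "ytc_Ugwg8Jrx6FSyQrCvIGd4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxXjDmHUxFJAdFNF1x4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM batch response and index each coding by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(raw_response)
print(codings["ytc_UgxXjDmHUxFJAdFNF1x4AaABAg"]["emotion"])  # resignation
```

Indexing once and reusing the dict keeps repeated lookups O(1) instead of rescanning the array per query.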
Comment

> Ai is a problem sure but I feel like it's just making climate change worse which is the real overshadowed problem by purchasing so much ram for ai the resources we use to make ai and develop it are tremendous that's just my reasoning

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2025-12-15T15:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwg8Jrx6FSyQrCvIGd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxTT5et4N_s5sfN1kF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyJU2zFlSEo8RWYZqR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx6q1_FYhuYEcrwCft4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwjEEvCbLkKXQfHCNV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwA2vygTdxuwzz1jpV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgwfPvtNUujCHAXxVM94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxXjDmHUxFJAdFNF1x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgykAdOrAKv2MRTnkXl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxGPrUBcUQ_OLhN5Ih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
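Before trusting a raw response like the one above, it is worth validating each row against the coding schema. The allowed values below are only those observed in this batch; the full codebook may define additional categories, so treat the sets as an assumption.

```python
import json

# Allowed values per dimension, as observed in this batch only; the
# full codebook may include more categories (assumption).
SCHEMA = {
    "responsibility": {"company", "developer", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate(raw: str) -> list:
    """Return a list of problems found in a raw LLM coding response."""
    problems = []
    for i, row in enumerate(json.loads(raw)):
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            problems.append(f"row {i}: missing or malformed id")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                problems.append(f"row {i}: bad {dim}={row.get(dim)!r}")
    return problems

sample = '[{"id":"ytc_abc","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"fear"}]'
print(validate(sample))  # [] — no problems found
```

Running this over a batch before it is written to the database catches category drift (e.g. a model inventing a new emotion label) early rather than at analysis time.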