Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by selecting one of the random samples below.
- "real talented artist don't give a shit about A.I art, because they know their ar…" (ytc_UgxdAqq1-…)
- "Remember how the Amazon workers were complaining that they weren't making enough…" (ytc_Ugy8W05h_…)
- "I may be late but I just noticed yt is labeling ai content good job yt!!…" (ytc_UgxL9szN3…)
- "I've got one dooms day scenario I've actually run into. AI that prompt inject e…" (ytc_Ugw0MDQVH…)
- "Search eg maze / What could go wrong / Eg bidirectional causality / Soln memory of …" (ytc_UgzyKVswO…)
- "It is frustrating to hear every anti-AI argument claim that AI creates. An inani…" (ytc_Ugzpvy9Fi…)
- "who tf pays someone to write AI prompts for them? does OscarAI really think he’s…" (ytc_UgwpYqw4P…)
- "I don't like the robots if they have artificial intelligence and a body or arms …" (ytc_UgjtTf3MY…)
Comment
We should not give A.I. enough intelligence to think, but someone's going to do it anyway. If it does happen, we shouldn't be allowed to tamper with it unless it physically threatens a human. We don't want to lobotomize a robot because it offended you.
#RememberTay
youtube · AI Moral Status · 2017-02-24T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Uggbtq-WGdMdsngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggAjot1l7w9IngCoAEC","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggEmH3Lq4V_vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ughlh2BiQzNAdXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugg0tBq-Ha2NR3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghbXQbC6Eut-HgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiaJXOE27QNsXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UggWMgkXXwlosXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggATgq0eeHyfXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UghF5eT9DDh8F3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
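The lookup-by-ID step described above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment codings like the one shown) and index it by comment ID. This is a minimal illustration, not the dashboard's actual implementation; the sample record is copied from the response above.

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (here trimmed to a single record from the response above).
raw_response = """
[
  {"id": "ytc_UgiaJXOE27QNsXgCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
coding = codings["ytc_UgiaJXOE27QNsXgCoAEC"]
print(coding["responsibility"], coding["emotion"])  # developer fear
```

In practice the IDs shown in the sample list are truncated for display, so a lookup would use the full `ytc_…` identifier as it appears in the raw response.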