Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Someone who’s really good at creating with Ai 😂 I’m gonna end this guys career…" (ytc_UgwWaY6zA…)
- "This entire video is a lengthy exercise in anthropomorphism and pareidolia. Miss…" (ytc_UgxRqqo3n…)
- "Waymo is horrible need humans, robots cannot replaced humans. Horrible idea just…" (ytc_Ugy--e2c1…)
- "Every time you said “we don’t know why” I screamed “OH, WE KNOW WHY!!” Their a…" (ytc_UgwhVwf57…)
- "And there is a reason why they offer free storage. They can use it to monitor y…" (ytr_UgxWhpqQY…)
- "boost of productivity?!? not sure about that. it’s very hard to change human beh…" (ytc_UgzWlT2mM…)
- "Tesla's FSD does not have a better accident rate than human beings. Waymo, sure,…" (ytc_UgwH8mLj4…)
- "I sent a senior engineer some comments on a pull request, and he sent me back a …" (rdc_nm0q911)
Comment
I tend to draw during class when I’m bored and one of my friends in that class has started using AI apps for fun. Recently they’ve started asking me if they can take pictures of my art and I don’t really feel comfortable with it because I know they are going to upload it to that app. (Honestly I’m not really comfortable with people taking pictures my art in general,) They’ve even once asked me to draw them something so they can use it for that app. I asked them if they were going to pay me and they probably thought I was being crazy or something. I honestly don’t know how to tell them I just don’t agree with it because I know they don’t mean any harm.
Source: youtube · Viral AI Reaction · 2022-12-25T22:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxaxv3fk9Jpd6JsvgF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwOwbmzhYzdrdp_wgV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw5kTWszqPZ2jvou654AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxriWVjNlNlQK2obFB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxpl5TCuQCSGlyDwJR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw6mJ3tjy74s8WQVxB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzzR9YsZ3U35NW3YxV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzN8Fhjb4a5sc-APGp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxfNeqHpy0CYCpxgQl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"disapproval"},
{"id":"ytc_UgzZ-lIu5OHL771pYxt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
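The raw response is a JSON array with one object per coded comment, keyed by comment ID across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and looking up a comment's codes by ID — the allowed values below are inferred from the rows shown above, not a confirmed codebook:

```python
import json

# Assumed codebook, inferred from the values visible in the sample response.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none"},
    "emotion": {"resignation", "outrage", "approval", "indifference", "fear", "disapproval", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, validating each dimension."""
    codes = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        codes[cid] = {dim: row[dim] for dim in ALLOWED}
    return codes

# Hypothetical one-row response for illustration.
raw = '[{"id":"ytc_x","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}]'
codes = parse_codes(raw)
print(codes["ytc_x"]["emotion"])  # -> fear
```

Validating against a fixed value set catches the common failure mode where the model invents an off-codebook label, so bad rows fail loudly at ingest rather than silently skewing downstream counts.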