Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The issue isn't "we" or "people"... the issue is that adults are primarily informed by journalists.
Journalists do a lot in their writing... but framing issues through a sensible lens is not one of them.
As Neil eluded to in his example, who's written the article about how many more accidents Tesla drivers have been responsible for compared to self driving Teslas? Shitty framing equals a shitty understanding.
Platform: youtube
Posted: 2023-08-08T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgxAcV3-jeGRD8Ee6zx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnChjmSX_yHITtIzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfQrExD2D6_UQHyEp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugykx2wAY5dREF-SFFJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxVTuSUCocmKajIjtN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwx1y44FZI776ewm9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwXrz0slgBdwc2zw1l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFCnmOfLiYdmU-wYF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwaemTH8eWUccvEhjt4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyt4Mx2dB7uiJ5TBdV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
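A raw response like the one above can be parsed and sanity-checked before the per-comment codings are stored. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the values visible in this page's output, not from a published codebook, and the function name is illustrative.

```python
import json

# Allowed labels per coding dimension, inferred from the observed output
# (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"resignation", "indifference", "fear", "outrage", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unexpected labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows
```

A lookup by comment ID then reduces to scanning the validated rows, e.g. `next(r for r in rows if r["id"] == some_id)`.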