Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I would also add, seeing so many “I hate doctors” comments here, that the reason…
rdc_jkrsykt
But is everyone missing a main point...
If there's no general jobs.. and you do…
ytc_UgxYjrZiL…
Can anyone explain why AI wants to 'help' me with anything I do including make a…
ytc_UgxDWsHgO…
I love drawing and I can't use AI, because it is betrayal to my skillset. Is it …
ytc_UgxeA_BiC…
Concerning AI .... you're about 10 years too late with your outrage, perhaps eve…
ytc_Ugzm-GL3n…
In other words Sundar Pichai holds the exact same philosophy on A.I. ethics as C…
ytc_UgzOU6zf_…
In my opinion trying to make AI act or "feel human" means your placing human imp…
ytc_UgzLLtOY0…
Every person in the comments still trying to defend the shitty handling of gener…
ytc_UgzkKHjCx…
Comment

| Field | Value |
|---|---|
| Text | We’re doomed basically. Thanks for painting the real picture. AI companies always dodge these questions giving BS answers. |
| Source | youtube |
| Category | AI Governance |
| Posted at | 2025-09-06T23:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw39DMKWzAUsEK0WwV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwnzU2gJMzs2tVjjbd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwPFP2QPtthHIW9A9l4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyXTGCVpFaANSTrOLR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgylZPlcvYMeJ_uLbtF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxV7ktbr4k5Pee36IZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgymSbGDsG7pHWw34FJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxx3l1usEYhECc5ZIV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzzX4fKUeBMegcapM54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugz669ndLl1zZ--znJp4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
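The raw response above is a JSON array of per-comment coding records. A minimal sketch of how such a response could be parsed and sanity-checked before use; the allowed value sets below are inferred from the visible output and are assumptions, not the project's actual codebook:

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# These sets are assumptions, not the project's authoritative codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation", "unclear"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Usage with a single (hypothetical) record:
raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"unclear",' \
      '"policy":"unclear","emotion":"outrage"}]'
print(validate_coding(raw)[0]["emotion"])  # outrage
```

Rejecting out-of-vocabulary values at ingest time catches the common failure mode where the model invents a label outside the coding scheme.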