Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
We shouldn't regulate or ban AI
This is a Pandora s box.. cant be stop now ... I…
ytc_UgyQs6HKq…
Neil: "People didn't get really worried about AI until it started doing creative…
ytc_UgzMn2Yem…
So in the future when they truly become aware how do you think they will take it…
ytc_UgxumZw28…
Nobody is hiring you without experience and it's not everyone's tea to become an…
ytc_Ugyyqg8tW…
"AI is more accessible!"
Michelangelo can draw a better illustration with a burn…
ytc_UgxarIGQb…
I think most people will become farmers and fisherman if most jobs really dissap…
ytc_UgxTQdTp-…
Ah again the people who do not understand how to use AI, that think copilot, cha…
ytc_Ugy4Tdbs2…
Sam Altman the CEO of OpenAI admitted in a conference that in quote “We certainl…
ytr_UgwWlAwQP…
Comment
What exactly do WE value? Do we value anything anymore? We deserve to become the sub-species! We’ve been greedy, power hungry, predatorial, imperialistic, selfish and hateful. I doubt AI will do worse! And how is it none of these tech “geniuses” didn’t prepare or understand this all along?
youtube
AI Moral Status
2025-06-11T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw75zZcl-umD2FEs5x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwx9ul-Qw5E_RefHFp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwUNbc3akc6xda6Nvh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwWCqxQXfRZxXHhQEJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwucBGL2V0Oq0shWLp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzrdMcZ0K1UzsjvwiF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgypuULWW7t6lT6tXg54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMJC4yRguPFi9nim54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy-tggvNnuD8S9A0I54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwb7W_OtoJsiTvMhHZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
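A batch response like the one above can be parsed and sanity-checked before the records are stored. The sketch below is an assumption about how such validation might look: the allowed category values are inferred only from the records visible on this page, so the real codebook may define more.

```python
import json

# Allowed values per dimension, inferred from the raw response above.
# The actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "outrage", "resignation", "approval"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only records whose
    values fall inside the allowed category sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

Dropping (rather than repairing) out-of-vocabulary records keeps the validation simple; a production pipeline might instead log and re-prompt for them.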