Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I will never support these disgusting companies. Thank you for bringing this to…
ytc_UgyiEiMWU…
after AI will be a global war, because the world doesn't need so many people. An…
ytc_Ugyfg5aJl…
There are only two conclusions to be drawn from all this:
Either a self-aware AI…
ytc_UgyXMCtCz…
theres something beautiful about using logic to confuse ai which is an entirely …
ytc_UgzOQj6b5…
I felt that way, now I've started feeling like throwing those most responsible o…
rdc_empgoah
The scariest part is researchers disagreeing on what “safe” even means. AICarma …
ytc_UgxrVml1Q…
i can only access the FAQ part from there, there is no option for settings…
rdc_mcvkd6d
I think you dont know how ai works or you dont understand language. It just pull…
ytc_UgwIzR9VB…
Comment
I agree, but look what happened with the last vote. Half the country will vote against their own interests, largely because they believe what FOX, or their pastor tells them. If someone like Laura Ingraham does a show and says "Self driving vehicles are totally safe, and will not take away jobs for ordinary folk" they will believe it. Not only do we need to be able to vote on it, but we need to overcome the ability of corpos to ALWAYS set the narrative. (especially on the right where they won't even be challenged) people need to make INFORMED votes, because when they don't, we end up with a Trump/Elon situation, and then 95% of us lose.
youtube
AI Jobs
2025-06-02T17:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgxbbY4Ty1zSOKz5Ac14AaABAg.AIuKepddBtnAIvWQXPWTRD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwcXwcT6O5vo1iY4AZ4AaABAg.AIsqP2Swx_8AIt3enFq4tW","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxUXhgrxKFcice6dnV4AaABAg.AIsembRZiA1AKlv0JfiY8x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwiJM4xAJKjL9xqWBh4AaABAg.AIqDr6uD_ikAJ518W7oCKi","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytr_UgweTnQNb1q7BHag4R14AaABAg.AIpyE-eU4o4AIsaeq36t67","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzlPXHQV9Qq3g-p8-94AaABAg.AIpo_LF48avAIt92jZ3gZ2","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugx2JJivmBWWJztPrZJ4AaABAg.AIpeXHRr86KAIpeuActuUa","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxqyDLfW4rPOczBFgt4AaABAg.AIpX2jZrhdvAIpnJX3r0-r","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzS41nub0sThqb8_8p4AaABAg.AIpQa1l3SizAIsZ6kSMcu-","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwbWCSTEUbVD01j5_x4AaABAg.AIoGwBy0_q_AIoNe2bhobF","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]