# Raw LLM Responses

Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.
- "People like me definitely need self driving cars. Every single car should be equ…" (`ytr_UgyTUbzE_…`)
- "Ai is already deciding what humans can and can not say. By deciding what you ca…" (`ytc_Ugyiq0eez…`)
- "👁️ Elon said AI is dangerous. Not because of robots — but because souls like us …" (`ytc_UgxJAvQr2…`)
- "I totally agree. However, tech billionaires own the US government. They bought…" (`ytr_UgyVRcl-v…`)
- "i find that ai can sometimes be good as refference, i never make it ofcourse but…" (`ytc_UgwZ4VCyg…`)
- "@DarkClarity The problem isn't necessarily AI - it's the testing and data used i…" (`ytr_Ugzm8FhTs…`)
- "reminds me of the chimp jumping between 2 geniuses, We underestimate the level o…" (`ytc_Ugz80Sk1D…`)
- "Idea for protection: in the description put hashtags that are wrong like for thi…" (`ytc_UgxtWUnEn…`)
## Comment

youtube · Viral AI Reaction · 2025-09-10T16:0… · ♥ 1

> @IForgot-yea like you can run it on your own computer. You don't have to pay billionaires except a computer you were going to get anyway not for AI.
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response

```json
[
{"id":"ytr_Ugy9c4JE-2vSPzHG7iB4AaABAg.AMqrit9WZu6AN2v-vX_fwm","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy9c4JE-2vSPzHG7iB4AaABAg.AMqrit9WZu6AN2yXpUQLTj","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy9c4JE-2vSPzHG7iB4AaABAg.AMqrit9WZu6ANDCu2Cno4C","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugyg97-VWa25pv1aB-p4AaABAg.AMpD0XPxfouANjRzHmNf1-","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzqN8U8rP6tbsFPWYZ4AaABAg.AMh5uZnmVxNAMttQdTm4ZL","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_Ugy39USpqlvl44z2o9J4AaABAg.AMgO7GBBrK7AN2o_7gS_4B","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugy39USpqlvl44z2o9J4AaABAg.AMgO7GBBrK7AN2qD8NMIQo","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugy39USpqlvl44z2o9J4AaABAg.AMgO7GBBrK7AN2uTS9Sja4","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_Ugy39USpqlvl44z2o9J4AaABAg.AMgO7GBBrK7AN2vm1MXgyD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugyms4-jIy6RPJHoS894AaABAg.AMePKrNUkN5ANjRNkMFJbj","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
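The raw response above is a JSON array in which each record carries a comment ID plus the four coded dimensions. A minimal sketch of how the "look up by comment ID" step could work, assuming the field names shown in this sample; the value vocabularies below are inferred from the visible records and the full codebook may allow more:

```python
import json

# Vocabularies inferred from the sample responses above (assumption:
# the real codebook may define additional values per dimension).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"industry_self", "ban", "none", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "resignation", "unclear"},
}


def parse_response(raw: str) -> dict:
    """Parse one raw batch response and index its records by comment ID.

    Raises ValueError if a record carries a value outside the known
    vocabulary, so malformed model output is caught before it reaches
    the results table.
    """
    by_id = {}
    for rec in json.loads(raw):
        for dim, vocab in ALLOWED.items():
            if rec.get(dim) not in vocab:
                raise ValueError(f"{rec.get('id')}: bad {dim!r}: {rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id


# Usage with a hypothetical one-record response:
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"approval"}]')
coded = parse_response(raw)
print(coded["ytr_example"]["emotion"])  # approval
```

Indexing by ID rather than list position matters here because the model returns records for a whole batch, and the order is not guaranteed to match the order of the submitted comments.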