Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
"the first thing we need to understand about the whole consent argument is that …
ytc_Ugxk6zuuN…
Hello! It seems like you're using emojis in your question, but I'm here to help.…
ytr_UgwBuef20…
AI will definitely replace programmers sooner or later, if it hasn't now doesn't…
ytc_UgyQZ_it6…
About as terrifying as a toddler holding a knife. I'm familiar with this AI and …
ytr_UgzMeRFJx…
AI is the ultimate Green Agenda. Green as in they will not need 8bn of us. Whic…
ytc_Ugxl9mlG1…
My sincere thanks to CASH INVESTIGATION, incredible research work re…
ytc_Ugz7kgGKO…
As someone who is a digital artist; AI art doesn't bother me. The main reason th…
ytc_Ugxjt_IPg…
AI is unreliable. The answer is what the AI mentors and gurus want it to give to…
ytr_UgyRmSV73…
Comment
Humans eliminate themselves by using convenience ( writing emails and stuff, getting answers ) - becoming reliant on AI. We are moving into a society where humans can't contribute because your skills won't apply anywhere so we all will turn to AI and when everyone has power, no one does.
youtube · AI Governance · 2026-01-15T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxbaKv2FZHU3r47Yl54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx03xeSvcA3tvxDe8J4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBbDqCY00NXAxqudB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwfvN3wHsusMdQeGdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxdHR0_SE7w3YZEJSx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxi-3QgEzZoSvEW31Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-2M0mI6w0u7VClVJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwpneseKbAEZrNpk994AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxyMi9mke_UR6Bjk7h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzXSe3W28WT2nQEVcZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"}]
```
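A raw response like the one above can be parsed and checked against the coding scheme before its records are stored. The sketch below is a minimal example, assuming the four dimensions from the Coding Result table and only the category values that appear in the samples on this page; the actual codebook may define additional categories, and `validate_records` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed values per coding dimension, inferred from the samples shown above;
# the real codebook may include more categories (assumption).
SCHEMA = {
    "responsibility": {"none", "user", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object carrying a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record response in the same shape as the dump above.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
print(validate_records(raw))
```

Filtering rather than raising keeps a single malformed record from discarding an otherwise usable batch; rejected IDs could be logged and re-queued for recoding.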