Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
When developing an AI, failing to show the necessary care regarding legal issues is never excusable—especially when children are involved. It may even have been done intentionally. Whoever is responsible should be identified and urgently face a major sanction that will at least bring some relief to the victims and ensure justice. The monsters who abused this must also definitely be punished
youtube · AI Governance · 2026-01-19T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugwym0_MFiI-IjubLip4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOkkVU5ZX3OMAs9FN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgaG2Hy87PtiNi4wV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxGxC8QeP9wEIG1iZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz_yKSW6F6H6uUnuNJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwD5UW-7HFcLrUGLqh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgwL3MDK4Mif4yvDk2N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx-k9gMzss0areEthN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgywSXahWkB9kLftMEZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx_NX4Qk5jM9p1kgfh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]
```
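Raw responses like the one above are batch codings: a JSON array with one object per comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and a single comment's coding looked up is shown below. This is an illustration, not the dashboard's actual implementation; the function name `lookup_coding` is hypothetical, and the sample payload is truncated to two entries copied from the response above.

```python
import json

# Sample raw LLM response: a JSON array of coding objects, one per comment.
# (Truncated to two entries; field names mirror the response shown above.)
raw_response = """[
 {"id": "ytc_UgwL3MDK4Mif4yvDk2N4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
 {"id": "ytc_Ugwym0_MFiI-IjubLip4AaABAg", "responsibility": "government",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def lookup_coding(raw, comment_id):
    """Parse a raw batch response and return the coding dict for one comment ID."""
    try:
        codings = json.loads(raw)
    except json.JSONDecodeError:
        return None  # malformed model output; flag for manual review instead
    # Return the first object whose "id" matches, or None if absent.
    return next((c for c in codings if c.get("id") == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgwL3MDK4Mif4yvDk2N4AaABAg")
print(coding["responsibility"], coding["emotion"])  # → developer outrage
```

Returning `None` rather than raising on malformed JSON lets a batch pipeline skip bad responses and queue them for re-coding.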