Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
We just recently got AI at our store and its as bad as it sounds. We loose hundr…
ytc_Ugz0bXBW-…
80% of all of those stuff is probably also generated by ai in some sort. Impossi…
ytc_UgxMB2CFB…
To all of you modern "humans" that hopped on this inhuman bandwagon known as AI,…
ytc_UgzUuwux3…
AI is in control of where data center are being recommended/built and the intent…
ytc_UgwmEpQ76…
Stupid IS as Stupid DOES . . . .
Example: Putin knows he will be safe BECAUSE A…
ytc_Ugz6sEUyM…
I think the real answer is .. they do NOT really want general AI. They want an …
ytc_UgxKoosa3…
I understand ChatGpt here. The purpose of lying is to deceive someone so that th…
ytc_UgxSa0ndl…
@mac_mcleod No? The automation was an artificial human (doing a human task) that…
ytr_UgzRM6Iin…
Comment
The problem is not Ai the problem is the people who is behind of AI who is bad people. We must to pray to God for just allow good in the world because God our father our creator is good and he just want Good in the world. Ask and you will be done unto you!!
youtube
AI Governance
2025-06-05T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwFT5_LQ2RTLU5KsRB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8Okm0PpleBqEMT7x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw7pvc-XWvsV6bZkH94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyw61D5nuPh8VHKUWN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxcRsDHP-vK4Fe5m9F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmK57Jy93t2Bi2n_p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxqS5PnD7jiALLgcTl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgyH1nQ5VFhwzpTR9J94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzKHr679RFivxullh54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzDkCIbij38alazgmd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
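A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal validator, assuming the allowed values per dimension are exactly those observed in this page (the full codebook may define more), and that comment IDs carry a `ytc_` or `ytr_` prefix as shown in the samples.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# on this page (assumption: the real codebook may include further values).
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "mixed", "outrage", "resignation", "indifference", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coding rows) and keep only
    rows whose ID looks like a YouTube comment/reply ID and whose dimension
    values all fall inside the allowed sets."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        has_known_values = all(
            row.get(dim) in values for dim, values in ALLOWED.items()
        )
        has_valid_id = row.get("id", "").startswith(("ytc_", "ytr_"))
        if has_known_values and has_valid_id:
            valid.append(row)
    return valid
```

Feeding the raw response above through `validate_codings` would return all ten rows; a row with a misspelled or invented dimension value would be silently dropped, which surfaces model drift early.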