Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgyzHXt4_…: THIS! But there is a study: if people read Art is made by AI. They hate it. Even…
- rdc_naszdad: let’s just be real here, they’re suing just to get some money and attention, no …
- ytc_UgxI-TakM…: The inevitable bias of all AI LLM is reminiscent of the ancient parable "Blind m…
- ytr_UgysmdyIy…: That may happen, but it's not exactly "realizing" that its answers were wrong. R…
- ytc_Ugwr_p1IG…: https://youtu.be/G-0Ot6NHMYA?t=216 the position of the daugther is incorrect and…
- ytr_Ugw-JB4_f…: His insights on AI are absolutely fascinating. A couple of years ago, he boldly …
- ytc_UgxPMmnHJ…: I also hate that they now have AI on Google Docs just so it can train through ev…
- ytr_UgyalRMsP…: We appreciate your concern about AI advancements. Rest assured, here at AITube, …
Comment

> I don’t understand this.
> If everyone is saying AI is so dangerous and it seems to be a common belief then why don’t you simply stop and as a result won’t bring it into fruition? I know the actual reason why is because someone else will or for money or for power or whatever. Though it does seem a bit weird how this technology is so destructive and dystopian and end of humanity, yet we continue to build anyways

Source: youtube · AI Moral Status · 2025-12-17T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzv-V7cAEV_Ci0E5il4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwTxwmIUR8N33DOaSF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2ctYIT80LWOv-B3p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5BO4NGh33Q4Wq6CB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVvOF9x8j_WVbfH5R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzLUH_ZMs3iSfIDMl14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyGqqKJwIMJIFoSkDt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwlzf1CPv_TojrI3rp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzNjjMIh-tL_UOzTTF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx5n2J_6wtUvH1IaQV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"}
]
```
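The raw response above is a plain JSON array of per-comment records, so "look up by comment ID" reduces to parsing the batch and indexing by the `id` field. A minimal sketch in Python, assuming that record shape; the `index_by_id` helper and the inline sample are illustrative, not the tool's actual code:

```python
import json

# A trimmed sample in the same shape as the raw LLM response shown above:
# a JSON array of coded records, one per comment, keyed by "id".
raw_response = """
[
  {"id": "ytc_Ugwlzf1CPv_TojrI3rp4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx5n2J_6wtUvH1IaQV4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "liability", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)

# Look up one coded comment by its ID, as the panel above does.
record = codes["ytc_Ugwlzf1CPv_TojrI3rp4AaABAg"]
print(record["policy"], record["emotion"])  # regulate fear
```

In practice the parsed records would also be validated against the codebook (e.g. checking that `policy` is one of the allowed values) before being stored, since LLM output is not guaranteed to stay on-schema.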