Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Just a reminder that the current flaws in detecting AI, are already being "impro…
ytc_Ugzie2TYu…
It's not even real. No company or science lab has ever come close to producing a…
ytc_UgxftIG0H…
I find AI terrifying... I wonder if we haven't opened a Pandora's bo…
ytc_UgwzyQxmf…
... Asking PEOPLE for content is 100% understandable. But when you go around a…
ytc_UgzBsnuTg…
When I say it can only operate by stealing copyrighted content, those aren't my …
ytr_Ugyai-_D1…
Think of it this way; we are going to kill ourselves with or without AI. At leas…
ytc_Ugw73jQ4i…
Thank god we dont need tatlor swift anymore anyone can maje a lable bow and just…
ytc_Ugw0vcRad…
I wish IT people would stop working on AI. It's seriously killing our way of li…
ytc_Ugxj2cXUj…
Comment
Why isn't the news for the Boeing 737 Max mention anything about the use of an AI system? I only knew it is because of software faults. Boeing should be more transparent with this AI system usage. They should at least use and test flight more than ten times before selling it commercially. Why the design faults only happen on two third-world countries? Everyone has the rights to know about the AI system used in this plane that they are on board. They should announce that this plane uses an AI system. The thought of AI's lethal weapon really strike a core to humankind's fear @8:01 and the commentator highlighting @8:25 was scary.
youtube
AI Governance
2023-03-19T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzLVFjDsJGJD471yYx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxSAtT93udw6ci31_l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxWhzK0jyYuQmh5LBZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwRgjZzrNT8Mr7XtKd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxh8YHVVmyk3PXDs5x4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyu3PyJSQBjOjguNYh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwZEKFaF2rZvSIRUZF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxkbbBer6DnqJUfo_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwni-8yjbvjREV2luJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy2dkMvZhjlMtODkGx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
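The raw response above is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of parsing and validating such a response — the per-dimension vocabularies below are inferred from the samples shown here, and the full codebook may contain more labels:

```python
import json

# Allowed labels per dimension (assumption: inferred from the table and
# raw response above; the actual codebook may be larger).
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "approval", "fear", "outrage", "indifference", "mixed"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response and index codes by comment ID,
    rejecting rows with missing fields or out-of-vocabulary labels."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        for dim, vocab in ALLOWED.items():
            if row.get(dim) not in vocab:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
codes = parse_coding(raw)
print(codes["ytc_example"]["policy"])  # regulate
```

Indexing by ID this way also supports the "look up by comment ID" view above: a coded record is retrieved with a plain dictionary lookup.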