Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any to inspect):

- "This didn’t age well. It turns out all the laid off software engineers are neede…" (ytc_UgyqP0lLb…)
- "As of... now, EU is the best regulator in the entire world. Balancing freedom, f…" (ytc_UgwD5mNWJ…)
- "The scary thing is are they going to stop at art? What about other jobs or hobbi…" (ytc_UgyPWFUze…)
- "When AI was seen as a threat, this channel said everyone would be replaced. Now …" (ytc_UgxP3iGqf…)
- "What if you make a video where you draw art that looks like a poor quality ai im…" (ytc_UgyO6T3R4…)
- "dont care, ai is cheaper and a lot faster even tho is a little erratical, with g…" (ytc_Ugwnm91X9…)
- "the type of stuff that future med students aren't able to tell because they also…" (ytc_Ugyv_AIk3…)
- "You realize AI only knows what humans program it to know. AI cannot think for it…" (ytc_UgwGugrLw…)
Comment

> ... a good introduction to AI. Yet, the comments below are almost all super positive about this video. How does the viewer know that such comments are not AI generated? My single biggest fear of AI is that it has no internal ethics. The only ethics would come from the humans who program these machines. It's up to human developers to program AI with ethical choice. Someone once said that human morality has not kept up with technology. And that is the risk we take ... when AI is trained and used for unethical tasks.

| Source | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2025-08-25T09:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgypxIU27SLX5JOp1Kd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz6XR9kqwXC6zPdKeh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzMbCRV_WAa6gWrUoR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz-nUE9gwpA18QSzOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzl_LzTwRHtUmEAc9x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzSu78Q9yQxk62dIvV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgySPbvPSIiHRscrrNt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyjC_O-kEYow8wM3I14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxdS5l2p5bdGbPS7o54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyGYraFrGHpVZytzad4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
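A batch response like the one above can be validated before it is indexed by comment ID. Below is a minimal sketch in Python; the allowed value sets are an assumption inferred from the samples on this page (the full codebook is not shown here), and the function name `parse_coding_response` is hypothetical, not part of the tool.

```python
import json

# Allowed codes per dimension, inferred from the sample output above.
# Assumption: this page does not show the full codebook, so these sets
# may be incomplete.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "mixed", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index valid records by comment ID.

    Records with a missing ID or an out-of-vocabulary code are skipped,
    so a malformed model output cannot silently enter the dataset.
    """
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue
        if all(record.get(dim) in values for dim, values in ALLOWED.items()):
            coded[cid] = {dim: record[dim] for dim in ALLOWED}
    return coded

raw = (
    '[{"id":"ytc_demo1","responsibility":"developer",'
    '"reasoning":"virtue","policy":"none","emotion":"fear"},'
    '{"id":"ytc_demo2","responsibility":"developer",'
    '"reasoning":"virtue","policy":"none","emotion":"delighted"}]'
)
coded = parse_coding_response(raw)
print(sorted(coded))          # the second record has an unknown emotion code
print(coded["ytc_demo1"]["reasoning"])
```

Keying the validated records by comment ID is what makes the "look up by comment ID" view above a constant-time dictionary lookup rather than a scan over raw responses.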