Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “On the subject of the social problems that would arise if AI replaced human inte…” (ytc_UgwnClS7V…)
- “Anyone who says AI art makes art more accessible needs to apologize to Frida Kah…” (ytc_UgwgwF9U-…)
- “If I call Apple customer care and if an AI asks me about my problem. I'm selling…” (ytc_UgyYSCXl4…)
- “When will AI helps make flying cars become reality ? Which is a dream of many o…” (ytc_UgyZItnIz…)
- “An Ai car kept driving with a woman being dragged to death under it because it d…” (ytc_UgyMTzEYR…)
- “There’s more harm caused from AI then sharing your boring drama but if you want …” (ytr_UgzBPCILf…)
- “Sadly I don't have the money even to buy a used car. I'd love to get an electric…” (rdc_ogrzssc)
- “Exactly, when you put it on the internet, people can share it, and so you defini…” (ytc_UgzKIe4W0…)
Comment
How are these AI's showing human tendencies? How would it know that there's a situation where one of the workers is having an affair? It sounds like they connected the AI to other computers--workers' emails, bare minimum. Why would they do that? And why would they program AI's to have human tendencies? How well has humanity handled our own tendencies? And you want AI to be LIKE us? AI can only do what you program it to be ABLE to do. Give AI human tendencies and then connect it to the internet instead of only a power cord? Gee, I wonder what might happen. AI isn't evil, it just shouldn't be made to be like us.
Source: youtube · AI Governance · 2025-05-27T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyjCBEqmH-eSXnloQF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwlLMZyU3hmIQgL1St4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx_6KSJ4nK1WZKg2MJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgwCxeGX6qBNMlN2tOR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwU1IrrBlP3XGuGUNN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyfdSxT2vMW6LnILud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz0-q2EY5GFmD7PAZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxK5LbSGsIoli_F6vp4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgyhyfRIpOiFh12kKvV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwOFswGLXwWLMhBYqx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
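Because the raw response is a JSON array of per-comment records, the "look up by comment ID" view above reduces to parsing the array and indexing it by `id`. A minimal sketch in Python (the IDs and coded values here are illustrative stand-ins, not records from the actual dataset):

```python
import json

# A batch coding response in the same shape as the raw LLM output above.
# These two records are made up for illustration only.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# Parse the JSON array into a list of dicts.
records = json.loads(raw_response)

# Build a dict keyed by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

# Fetch the full set of coded dimensions for one comment.
coding = by_id["ytc_example1"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # outrage
```

In practice, model output is not guaranteed to be valid JSON, so a production version of this lookup would wrap `json.loads` in error handling and validate that each record carries the expected dimension keys before indexing.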