Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- All those who afraid of AI, I have a question. Why would AI want to get rid of h… (ytc_UgwFGalfN…)
- They say that she is a robot, and not a human. But we all know they say a lot. 🤙… (ytc_UgwlX0JCk…)
- Totally agree. I have the stuff on here all looks fake don't know if it's real o… (ytc_UgzmZ_Q5N…)
- Why would they label the product "autopilot" if it cant be used autonomously? Th… (ytc_Ugy1bPm0j…)
- A place where AI will not do as good as humans is curiosity , imagination, int… (ytc_UgyrQSBm0…)
- When I see comments protecting AI such as "AI art is hard, it takes so much time… (ytc_UgxDLNG3A…)
- I find it useful. As someone who makes characters a lot, it’s easy to get stuck … (ytc_UgwbUc0jY…)
- Also, what would be the problem with ai and robotics doing all the work. Then, p… (ytc_UgzOJ-pKP…)
Comment
Here is the real problem. There are always bad actors in the world who won't play nice. Just look at the war in Ukraine if you need proof. While most governments and companies will be responsible in their approach to AI, it only takes one bad actor to take down the entire world. Stopping AI development won't address this risk. We need to address the Bullies in the room.
youtube · AI Governance · 2023-03-30T09:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzLH1hkr2L6VqcJmwx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwpnMLTfLhMqDTKv_Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-G1faHlVI0kwCaEd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwpqiSJsME8JvIE2E94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJyAxUuLMx9cEqUol4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzF_SQl2udIrOLYC2N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxF9RQGxusNpHwLiCp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwiHI7_NutSiS3Jwz94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzPkUH0Gf6lh9SK4Ap4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxptKgs5yZuQPVmi4J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
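Since the model returns one JSON array per batch, matching a coding back to its comment is a dictionary lookup keyed on the `id` field. A minimal Python sketch, using two entries copied verbatim from the response above (the real pipeline would load the full stored response, not a literal):

```python
import json

# Two entries copied from the raw LLM response above; the full array has ten.
raw_response = '''[
{"id":"ytc_UgzF_SQl2udIrOLYC2N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxptKgs5yZuQPVmi4J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Index codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment shown in the Coding Result table above.
row = codings["ytc_UgzF_SQl2udIrOLYC2N4AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# → user consequentialist regulate fear
```

The four printed values match the Coding Result table for that comment, which is a cheap consistency check between the parsed response and what the inspector displays.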