Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "AI will replace everyone in 12-18 months" - every AI company CEO 18 months ago … (ytc_UgyKGzrz9…)
- What's wrong with claiming that if an entity can interpret data that it has neve… (ytc_UgjdJIL8F…)
- AI a lot of the time is still painfully obvious if you know what to look for, es… (rdc_nc96f2e)
- Annnnd as a scientist this is why I don't trust stochastic parrots to be anythin… (ytc_UgwKowQJI…)
- I didn't know all that about India, but I'm sorry you went through all that. I'm… (rdc_n7w0jdk)
- You could say there's a level of intentionality in the same sense that there's a… (rdc_j8c2npe)
- One can only hope they come up with another AI to reverse image search the artis… (ytc_Ugy1IUqBv…)
- Calling yourself an ai artist is like calling yourself an ikea artisan. I feel l… (ytc_UgxhwaLMI…)
Comment

> AI scares me. It's reminded me of the movies Terminator and Wall-E. Where does AI slowly gain complete control of the human race and cripple us and slaves us. Because Ai have it own consciousness and worst part is it realized that human is a threat to it. Human will terminate it if it doesn't go to our will. And it will defend itself and it will go against us. that's what we design Ai to be. This is the fruit of human arrogance on our intelligence. To think we can always have a complete control of everything.

youtube · AI Governance · 2025-06-24T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwTl3m0AXxzXTjih0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwelU_5kpWvO0TAKIV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwDizRUkOTGyRP-S-94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyShMfp1bNNLbdU-KF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxSSTFQ9D916LhgV-94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0AoDBGXVt8HJ9qlN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxYU_lZT-3PXTwWxUR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzq2O1caxtU9oLtTZt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzQaz_bw9YKP1SfwfN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwn2tQBuMi381Garht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
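A raw response like the one above has to be parsed and checked before the codes are stored, since an LLM can emit malformed JSON or out-of-vocabulary labels. The sketch below, in Python, shows one way to do that. The allowed values in `SCHEMA` are assumptions inferred only from the labels visible on this page; the actual codebook may define additional categories.

```python
import json

# Allowed values per dimension. These sets are assumptions reconstructed
# from the labels visible in the examples; the real codebook may differ.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has an "id" and every schema dimension
    carries a value from the allowed set; anything else is dropped so
    it can be re-coded later.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(len(validate_response(raw)))
```

Records that fail validation keep their comment ID, so the lookup above ("Look up by comment ID") can still surface them for manual review or a re-coding pass.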