Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I have to disagree with you Charlie a wet dream is created from your own subcons…
ytc_Ugw218e2n…
You say that like it's fact, but AI can and will get to that point…
ytc_UgwYkGbgB…
My job is safe. I’m cybersecurity and pivoting the AI safety which is a bonus.…
ytc_Ugwcz171O…
Why is AI starting out with all of the negative human traits if it's going to be…
ytc_UgxXHEFAm…
And I looked and behold a 4th beast diverse from the 1st three beasts. This be…
ytc_UgwF_q3jb…
God, I wish I could have seen the look on the human passengers faces as they wer…
ytc_UgwX-Qugt…
Hi Sam, thank you for this amazing video. To be honest I started watching this v…
ytc_UgzPqaaXa…
Breaking news. Every corp ai use your chats as training data. Another question i…
ytc_UgyNa-59e…
Comment
In 1942 this was written: (1) A robot may not injure a human being or, through inaction, allow a human being to come to harm; (2) A robot must obey orders given it by human beings except where such orders would conflict with the First Law; and (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube · AI Governance · 2025-06-25T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxNUNxBYPhyFpCQr-t4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxkZpi23Q3JFrtr3Jl4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwrTg97txcJgAfQLg14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugx3HfLpKtm6M28uXch4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugzaa9Bi6GI0yGYasAB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzpZIfIL_8vkPyitdN4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzCfJuWzElsG0ecd2B4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwCo7xLRppFcC82UPF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy8Tv2YRguHN-kMdrt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw9jAAdVwbYLaGgzUh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]
```
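Because the model returns one JSON array covering a whole batch, looking up a single coded comment means parsing the array and keying each record by its `id` field. A minimal sketch of that lookup (the `index_codings` helper name is illustrative, and the raw string below is truncated to two of the records shown above):

```python
import json

# Two records copied from the raw LLM response above (truncated for brevity).
raw_response = """[
  {"id": "ytc_UgwrTg97txcJgAfQLg14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugx3HfLpKtm6M28uXch4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "ban", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
# Each dimension of a coded comment is then one dictionary access away.
print(codings["ytc_UgwrTg97txcJgAfQLg14AaABAg"]["policy"])  # regulate
```

In practice a real response may be malformed (truncated output, stray text around the array), so production code would wrap `json.loads` in error handling before indexing.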