Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples — click to inspect

- "People need to stop accepting these walking robots. They only trying to get peop…" (ytc_Ugy9UGOmR…)
- "We appreciate your perspective. If you're interested in exploring advanced AI mo…" (ytr_UgyimyQkr…)
- "I fully believe it’s not about age. It’s about accumulating money. You used to g…" (rdc_o9rqdxg)
- "At this point we cannot say with a certainty what AGI is or consists of so it is…" (ytc_UgwaSEScS…)
- "What a bunch of horseshit. This current, 4th in recent decades, AI hype cycle is…" (ytc_UgzhQesH3…)
- "Omg this is awesome!! Learning shit we actually can use in the real life!!! Yay …" (ytc_UgzhrCfyt…)
- "I love that in the fictional scenarios they made the AI kills executives. Maybe …" (ytc_UgywuSsk8…)
- "Manufacturing jobs turned into customer service jobs due to outsourcing and robo…" (ytc_Ugy-q6EWA…)
Comment

> Coloussus: The Forben Project, Demon Seed, Battlestar Galactica: Caprica, Blood and Chrome. Isaac Asimov's First Law of Robotics states: "A robot may not injure a human being or, through inaction, allow a human being to come to harm." Let's hope someone applies this to AI.

Source: youtube · Topic: AI Governance · Posted: 2025-12-29T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyrMkvBqhNlKrYJt2p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyiXeVhaXiXhKVEyn54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzeM1kxR_m_ePuRgbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzG_Dmavffk1zgYRJR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx1m_XzS5UW8DcWeb14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgymX1PqG3vFhtl9b3x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzQkrkzoL0zwJIob694AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx-dtRB-5Pj-9rj93V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx9g5d4KZ1-h0IOFN14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyloTJ_ly2LXBcvLHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
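A raw batch response like the one above can be parsed and indexed by comment ID for fast lookup. A minimal sketch in Python, assuming the response body is valid JSON (the variable names here are illustrative, not part of the tool; the two rows are copied from the batch above):

```python
import json

# A subset of the raw LLM response shown above.
raw = """[
  {"id":"ytc_Ugx-dtRB-5Pj-9rj93V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx9g5d4KZ1-h0IOFN14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"}
]"""

# Index the coded rows by comment ID so any comment can be looked up in O(1).
codes = {row["id"]: row for row in json.loads(raw)}

row = codes["ytc_Ugx-dtRB-5Pj-9rj93V4AaABAg"]
print(row["responsibility"], row["policy"])  # developer regulate
```

The same index makes it easy to cross-check a coded comment against the table above: the first row here matches the displayed coding result (developer / deontological / regulate / fear).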