Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "ChatGPT has added programming to remind us it's not alive. Try talking to a LLM …" (ytc_UgzkbxNCN…)
- "AI sucks ass, can't do the work of humans. It's a tool, a very expensive tool th…" (ytc_UgxxoifgO…)
- "Essays aren’t hard. Just go to class due to bare minimum and pay attention. You’…" (ytc_UgySC0yTA…)
- "I think it is easier to believe that artists were just born with a special talen…" (ytc_UgzEFo0nd…)
- "Horrible people... the government is never your friend and to see people see thi…" (ytc_UgyNb2eac…)
- "I don’t think that programmers would be replaced anytime soon, considering that …" (ytc_UgxvqxiBk…)
- "why dont these ai sites just make a filter for real names so you cant put in som…" (ytc_UgxjBZFuB…)
- "When they say that AI will take over the world they mean it like how cellphones …" (ytc_UgwVOhHGi…)
Comment
you are worth 29 million dollars for asking questions ??? no wonder you are so comfortable dear Stephen.there are people through the world who dont have enough food or shelter. maybe you could donate some money to those intead of constantly asking if AI is dangerous . Simple answer - when narcissistic atheists are in charge - YES !!!!
youtube · AI Governance · 2025-09-05T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugx0ZsdHPDXJxIShiWp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvjbMxd81f2M3MUSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxCGOJjkng6dDbDht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWWrDfErp2Nd9WpQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwBGjTJIBYttPQ_KfV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlD30wXrc8WzIbVmZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw5zftANSmRpq7xXyd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxy5fOl0byMBtz004t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJfySQEIn2bf8nK1l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8r0Xx0xO8tVvq9B94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]