Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up directly by comment ID, or picked from a random sample:

- `ytc_UgyQuwnKl…` — "Id challenge him to find a modern chat bot that would encourage suicide - I thin…"
- `ytc_Ugxr6kxo4…` — "AI is just manmade programs. There’s not going to be self aware AI. They’ve been…"
- `ytc_UgyR6QG6L…` — "Why do these people think that the goal of AI is to think like a human? Oh... Be…"
- `ytc_UgwSfFIzI…` — "I don't even trust automatic transmissions. If it ain't got three pedals, I move…"
- `ytr_Ugx7VS23a…` — "Maybe we can live in a where humans don't have to work enough because AI and rob…"
- `ytc_UgzChKimC…` — "This is making me remember the movies I Robot and Terminator. If these robots ha…"
- `ytc_Ugx3rwKWN…` — "AI still not able to combine even two word in realization of one picture or …"
- `ytc_UgziHEahv…` — "the way to make AI have empathy for us, is for us to have empathy for all beings…"
Comment (source: youtube, posted 2023-10-15T18:5…)

> "bad science could drown out good content." good CONTENT?! i don't think concerns for the "content" as in "good content creators will suffer financial consequences" is the worst, very, VERY soon it will become totally impossible to check any new facts! imagine all the implications, using deep fake as well. the world is about to burn! we are so fucked! it's hilarious to think people were freaking out because a.i. was gonna decide we were bad and would want to get rid of us, and that that was how we were gonna go down...🤣
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugze__MmwN_up210OEl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgysifOaYPvUjsmzg4R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1lGfiVK3rgOI1ujF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzhuas-sleDK8o6eUp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxcxLyXVtlvllgtKhd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxRRPo5CIG5Kq7_NOV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxU6r1i9-djygFVi7p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyKKm79DpTAJUmmnQN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1GzkueLsxHBktZwt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyvnXVmybHkt1Oi00V4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
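The raw response is a plain JSON array of per-comment records, each keyed by its comment `id`. A minimal sketch of the lookup-by-ID step, assuming exactly this format (the `index_by_id` helper is hypothetical, and the array is abridged to two records from the response above):

```python
import json

# Abridged raw LLM response: a JSON array of coded records (excerpt from above).
raw_response = """
[
 {"id":"ytc_Ugze__MmwN_up210OEl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyvnXVmybHkt1Oi00V4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index the coded records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
record = codes["ytc_UgyvnXVmybHkt1Oi00V4AaABAg"]
print(record["policy"])  # -> regulate (matches the Coding Result table above)
```

The dimension values in the last record line up with the rendered "Coding Result" table (responsibility: unclear, reasoning: consequentialist, policy: regulate, emotion: fear).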