Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Just stop using technology. The question is, why can't you? Because you are depe…" (ytc_Ugw5F3QBu…)
- "Let's not forget that the companies making these massive AI investments also own…" (ytc_UgxgCW1h7…)
- "What I fear the most is this will not affect those who are intelligent. But the …" (ytc_UgwxCEQnj…)
- "I feel like pretending, alignment, conscious, intent and all these other idea's …" (ytc_UgyVUkr6Z…)
- "I literally got and ad promoting the use of ai at the end of the video, aaahhhg…" (ytc_Ugx6FEIKY…)
- "„We are at a turning point where we can overcome silo thinking in AI development…" (ytc_UgzZr-Ipa…)
- "People who think AI art is their original art are either lazy and lying, or lazy…" (ytc_Ugyvy3C1q…)
- "Too late. AI has all the art up to this point and can create new things without …" (ytr_Ugwyt7pxV…)
Comment
Stop looking for the Boogieman in everything, A.I. can not kill unless their creators MAKE them killing bots! Meaning if you create Robots for example to fight wars for you, THEN you have a killer Robot, if they are never created why would they attack? It is like saying your Fridge is out to get you, or your Toaster is a sharp shooter and your TV will come running after you and strangle you! If no killer Robot is built then no Robot will kill you! So stop the ones that wants to create such a thing of abomination! If you knew before Nuclear Weapons that it was going to be created, would you stop it?! But let's face it they already have killer drones and technology... all in the name of "protection"... protection my foot I say!
youtube · AI Governance · 2025-10-20T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgweWb8V6ooN4y26sY94AaABAg.AOUjEH68-gwAOWL9fLupj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugxfky1qm7lohqVZ0-l4AaABAg.AOUNhKLXfMHAOg2yGhTqtv","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgwHMluy0kJn5blLn594AaABAg.AOTbghk7cYHAOhWGXvZGCO","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_Ugy_douuXaThiwuE12t4AaABAg.AOT84Y_skOpAOUHh2jY8bc","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzfAJdivTacc1Bv8hJ4AaABAg.AOSngdo6THGAOSpBzrPLAf","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzLDhXF2__Ga3la0-R4AaABAg.AOSVutMQEQHAOTMQFc55r5","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxK-3vCrjGW3OF-uGh4AaABAg.AOSJAeB8fDTAOSJsmxwKJe","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzEeQHIuuEBPWkhNWp4AaABAg.AOSE63l0xAIAOSIZ3qnkF5","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzEeQHIuuEBPWkhNWp4AaABAg.AOSE63l0xAIAOSJGxJPPd3","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugw0CdOPYqMVe6_xjZ54AaABAg.AORz_gMu4AiAOS2mV0toh","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
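The raw response is a JSON array of per-comment codes, one object per comment ID, from which the single-comment view above is presumably derived. A minimal sketch of indexing such a batch response by comment ID so one record's codes can be looked up; the `DIMENSIONS` tuple reflects the four dimensions shown in the Coding Result table, and the sample ID `ytr_abc` is hypothetical (real IDs look like `ytr_Ugw…`):

```python
import json

# The four coded dimensions shown in the Coding Result table above.
# (Illustrative: the full allowed value vocabularies may be larger.)
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_response: str) -> dict:
    """Parse a raw batch response into {comment_id: {dimension: value}}."""
    records = json.loads(raw_response)
    indexed = {}
    for rec in records:
        # Missing dimensions fall back to "unclear", mirroring the codebook.
        indexed[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return indexed

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytr_abc","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')

codes = index_codes(raw)
print(codes["ytr_abc"]["responsibility"])  # -> developer
```

Keying by ID rather than list position makes the lookup robust when the model returns records in a different order than the input batch.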