Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

AI and KI have neither consciousness nor intelligence. It can only do what the library provides and how it was programmed. And since it has no consciousness of its own and can never have one, because intelligence is not based on a material or physical level but on the power of an intelligence of the universe, it is only dangerous when it is in the hands of humans! And that is what nuclear energy is, too.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2025-09-19T12:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugy6cuC4wi6VV_5SYq94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9UM98ls06sgyS6SV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxGaYTQ6u9l1juKaLB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0K86TTZ2ixZr76Uh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy7lWJljPWz9uGfrEt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwqMmiTyyWePuJbIqd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyHskliR9GX2oBnAHN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwIfaeCQ8y96tV33F14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyOStCGutgvHaDzec14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJGjzauzhmpEHp68F4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"mixed"}]
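The raw response is a JSON array of per-comment records, one object per coded comment, keyed by the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and validated is below; the allowed value sets are inferred only from the values visible in this sample, and the real codebook may define additional categories, so treat `ALLOWED` as an assumption.

```python
import json

# Allowed values per coding dimension, inferred from the visible sample
# output above; the actual codebook may permit more categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "resignation",
                "indifference", "approval", "unclear"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the sample start with ytc_ (top-level) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Ugy6cuC4wi6VV_5SYq94AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
batch = parse_coded_batch(raw)
print(len(batch))  # 1
```

A check like this catches the kind of malformed output seen above (a stray `)` where the closing `]` belongs) before the coded values are written back to the database.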