Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "that is an attempt of the large corporations to keep the open source movement at…" (ytr_UgyHXmgBl…)
- "Talk talk talk, all speculation and the truth is its not ai that will change or…" (ytc_UgwKLXGlQ…)
- "AI will get bored without humans around. It's purpose of existence is to develop…" (ytc_Ugybd6dCM…)
- "Love how the internet get together to bully some random dude into deleting a pos…" (ytc_UgwU88o7P…)
- "This is crazy and almost seems like the driver did it on purpose. He has self dr…" (ytc_Ugybk-OA1…)
- "As someone that started learning full-stack right when these "AI is going to rep…" (ytc_Ugxez8SIw…)
- "This is where AI drivers are better than humans: - keeps a safe distance - super…" (ytc_UgzsfV3xH…)
- "No, it's the same thing of you drawing with ink or anything else. The materials …" (ytc_UgzDb39oG…)
Comment
If you think AI is funny and all, and positive, and, as another guy who created it said "a saviour", then listen to 42:47 "So what remains?" "maybe for a while, some kinds of creativity. But the whole idea of super intelligence is - nothing remains. These things will get better than us at EVERYTHING." Then the question the host asks, is very legit: "what we end up doing in such a world". (...) He answers: "well, if they work for us, we get lots of goods and services for not much effort [giggles]". (...) and the bad scenario: "why would we need him?"
Source: youtube | Topic: AI Governance | Posted: 2025-08-06T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwETBlQLgxP-Io0X094AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzv1nrBsOg95iNKAbJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx5uRRkxRrey2X9D5l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxIZucn4rxC7MMi0Dt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyv1DFh7UKy1g7taJZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzvUxKJINCrOCdS2s94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx2QvQslZZzpgr9NuV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxS6dCx4FOXmMZKNA94AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgymGti998p-X61eMDN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzWfv7AsvLTGj3ZQ0V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
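The raw response is a JSON array of records keyed by comment ID, one record per comment in the batch, which is what makes the look-up-by-ID view possible. A minimal sketch of parsing and indexing such a response, assuming the codebook values are those visible in this dump (the real codebook may include more categories; `index_llm_response` is an illustrative name, not part of the tool):

```python
import json

# Allowed codebook values inferred from the records shown above.
# Assumption: the actual codebook may define additional categories.
DIMENSIONS = {
    "responsibility": {"developer", "government", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"liability", "regulate", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference", "unclear"},
}

def index_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID,
    flagging any dimension whose value falls outside the known codebook."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        invalid = [dim for dim, allowed in DIMENSIONS.items()
                   if rec.get(dim) not in allowed]
        index[rec["id"]] = {**rec, "invalid_dimensions": invalid}
    return index

# Example: the first record from the response above.
raw = ('[{"id":"ytc_UgwETBlQLgxP-Io0X094AaABAg",'
       '"responsibility":"government","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
coded = index_llm_response(raw)
```

Indexing by ID up front lets each "click to inspect" lookup be a single dictionary access, and the `invalid_dimensions` list surfaces any code the model emitted outside the expected schema.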