Raw LLM Responses
Inspect the exact model output for any coded comment: look a comment up by its ID, or open one of the random samples below.
- "New technology has issues and takes time to implement well. Even new PC games …" (ytc_Ugxd1ZqLF…)
- "With all the AI generated misinformation and social media posts, especially with…" (ytc_UgxzdXAvB…)
- "Yawn, who cares? Tired of seeing all this OpenAI drama and don't give a shit ab…" (rdc_l69etzu)
- "I honestly felt the rage of that person who wrote “no more art for anyone, f you…" (ytc_UgzR9CcGN…)
- "The real controversy is that why it makes humans have weird fingers? Is that how…" (ytc_UgzWLhjg0…)
- "i would say crypto is over, because there isnt enough GPUs and DRAMs to go aroun…" (ytc_Ugy61LtYd…)
- "I think you grossly overgeneralize, assuming that if somebody use AI for one thi…" (ytc_UgzosPI4q…)
- "15:30 Interesting point: "We can adapt". Not even looking at disabled artists, o…" (ytc_UgxE3vg05…)
Comment

> Humans as a species dominated Earth in only about 10,000 years or so give or take that's how long we've been on Earth. It would take AI half of that time or less to dominate Earth so roughly 3,000 years or less. With all the tech we have now it won't matter if AI uses it or not once AI becomes a powerful entity it will start thinking by itself and could theoretically enslave humans.

Source: youtube | Topic: AI Governance | 2025-03-23T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
{"id":"ytc_Ugy_1MGaubh9sRtEwlN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgymDFCAjCD7FUjcrOZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwv90F2jqQETNLjA954AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz2gE07eKGGV4aeo3F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxFAGucjKqMNUQfN1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZwr_JPMmDX9Ab_CZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEf5u95qGZuAsAdFR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzY4RGoUM51xw8JIOZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwEJLIyJO8wrM4llFF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz2X3RDeGOkDB4W2GV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
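The raw response above is a JSON array in which each record carries a comment `id` plus the four coding dimensions from the result table (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch of how such a batch could be parsed and sanity-checked before loading it into the inspector, assuming only that each record must contain those five keys (the function name `parse_coding_batch` is hypothetical, and the full codebook of allowed values is not shown in this section):

```python
import json
from collections import Counter

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    rejecting any record that lacks an id or one of the dimensions."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")
    return records

# Two illustrative records with made-up ids, mirroring the format above.
raw = '''[
  {"id": "ytc_example1", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]'''

records = parse_coding_batch(raw)
emotion_counts = Counter(r["emotion"] for r in records)
```

Validating up front means a malformed model response fails loudly at ingest time rather than surfacing later as a blank cell in the coding-result table.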