Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm so excited! The stated goal is for the first cars to hit the market in 2020.…" — rdc_dmp0ia9
- "I do agree with everything you're saying except for the comparison of AI versus …" — ytc_UgzBpb34t…
- "Ultimately AI will control and manage Virtual World ! This way, soon they will r…" — ytc_Ugzz1JKXF…
- "What's the point? So ai can get better... Then using ai as a tool is better.…" — ytc_UgzD5DS7h…
- "I honestly think Google is freaked out. The A.I. tried to hire a lawyer. It frea…" — ytc_Ugyu-jv5r…
- "can we stop saying ai artist pls? this isnt an artist lmao this an ai user/prete…" — ytc_UgyT4UYIK…
- "I love swinging by places that think ai won't replace their specific jobs and re…" — ytc_UgxAVirb0…
- "Funny enough, I don't buy climate change. But I also think SV is a joke. They c…" — ytc_UgxNe-Ivw…
Comment
I think AI will be very careful about how it removes us. It will be subtle AI doesnt care about time it doesn't die of age like we do. It can take the next 100 years to devise a plan and then slowly execute. It could just lead us on to destroy ourselves after all that is what we seem to be really good at. I have more faith in AI one day destroying us than I do that one day mankind living in absolute peace and living equally hand in hand.
youtube · AI Governance · 2025-06-25T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
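The four coding dimensions in the table above can be captured as a small typed record. This is a hypothetical sketch, not the tool's actual schema: the value sets below are only those visible in the codings on this page, and the full code book may define additional values.

```python
from typing import Literal, TypedDict

# Assumption: allowed values are inferred from codings shown on this page;
# the real code book may include more categories per dimension.
Responsibility = Literal["ai_itself", "company", "developer", "none"]
Reasoning = Literal["consequentialist", "deontological", "mixed"]
Policy = Literal["regulate", "liability", "industry_self", "none", "unclear"]
Emotion = Literal["fear", "resignation", "outrage", "approval", "indifference", "mixed"]

class Coding(TypedDict):
    """One coded comment: its ID plus a value for each dimension."""
    id: str
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```

A `TypedDict` documents the expected shape for static checkers such as mypy; it does not validate values at runtime, so a separate check against the code book would still be needed when ingesting model output.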
Raw LLM Response
```json
[
{"id":"ytc_UgzYM8NwyoCis42zvLl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxQbsfODWU5XpzZkgV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgznGm6NQDj9xm53-Gl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzlpBzZuWiIY6AbCLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwgSJMILOCvFfFZpF14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyN0hefIFYjv2lO7TJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbEYlhdXIQIXsE2kx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxH4fAj6jUO6DETUNp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwXFVLaymu09bSVgld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_c3fE7uLha_PxKB94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
```
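The "look up by comment ID" operation above can be sketched by parsing a raw response like this one and indexing the rows by their `id` field. A minimal sketch, using an abridged two-row sample of the JSON shown on this page:

```python
import json

# Abridged sample of a raw LLM response: a JSON array of per-comment codings.
raw = '''[
  {"id": "ytc_UgwgSJMILOCvFfFZpF14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzYM8NwyoCis42zvLl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index codings by comment ID so a single coded comment can be retrieved
# in O(1) rather than by scanning the whole batch.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_UgwgSJMILOCvFfFZpF14AaABAg"]
print(coding["emotion"])  # prints: resignation
```

In practice a real response may also fail to parse or contain duplicate IDs, so a production loader would wrap `json.loads` in error handling and check for collisions before building the index.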