Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `rdc_je7gth4` — "They fear an unregulated expert. Something to help the common people with all so…"
- `ytc_Ugxw1s9ln…` — "2:12 '...won't that confuse people about what the truth is?...' What about the 2…"
- `ytc_UgykzKeRy…` — "AI artists do not exist. Writing a few words to generate a picture isnt art and …"
- `ytc_Ugx7UlQ8z…` — "I love AI attitude, but to be honest I'm hoping everybody quits all their coding…"
- `ytc_UgyyKbRrY…` — "Do you not understand the amount of behavioral issues and mental health issues t…"
- `ytc_Ugy7BUVgd…` — "These positions sound amazing, HOWEVER… this is thinking AI stays as is and does…"
- `ytc_Ugxoi0_c4…` — "Plus, AI art looks so generic with the same artstyle that's so easy to spot. It …"
- `ytc_Ugxr60BGU…` — "Let me get this straight. Cunt creates a tool that could possibly destroy humani…"
Comment
I knew about a transition that we would go through by 2050 since I was about 10 years old. I believe AI is a huge part of it, but I think there's a lot more to it as well that we aren't seeing. By 2050 whatever happens will decide the fate of our civilization honestly. We'll either succeed or destroy ourselves completely. I don't know why I know this either, but after these events happen, it will be 1200 years before society will be ready to pick up where it left off again. Something about the planet needing to revitalize and people will be able to resurface and rebuild. But off of a new foundation where we do things differently. Almost like we've learned from our mistakes.
youtube · AI Governance · 2025-09-19T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzNKCGgsUCY51eRsbt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx15SOaJvIjLUWOJOB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzfi10p7mYjvJAgegl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgznP96dr_N2iwAruAh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzVAvoueVFp2jqBwR94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyrMN2gN5vDdwvNRy94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwiFAW5gaUDeKZzTRJ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-afaZbqd9Bn25eUR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz2eRS0ViH2mGIl_IZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzPxkEv5kRsex_434l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"approval"}
]
```
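A batch response in this shape can be parsed and validated before the codes are stored. The sketch below is a minimal example, not the tool's actual implementation; the allowed category values are inferred only from the responses visible above, so the real codebook may contain additional categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"distributed", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: codes}, rejecting
    records with a missing id, missing dimensions, or out-of-vocabulary
    values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response above, used as a self-contained example.
raw = ('[{"id":"ytc_Ugz2eRS0ViH2mGIl_IZ4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"mixed"}]')
codes = parse_batch(raw)
print(codes["ytc_Ugz2eRS0ViH2mGIl_IZ4AaABAg"]["emotion"])  # mixed
```

Validating against a fixed vocabulary at ingest time catches the most common batch-coding failure, a model inventing a label outside the codebook, before it silently lands in the results table.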