Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyJImEZf…: (Disclaimer: I don't use AI to make art, I don't ever plan on it, and I think it…
- ytr_UgxK4yxPw…: @Eren_Yeager_is_the_GOAT I guess you have a point there, but I think the regulat…
- ytc_UgxNsbLbG…: Ai is unreliable sometimes when you want something and become as detailed as pos…
- ytr_UgzxpOHDS…: We're glad you enjoyed the video! Sophia, the AI-powered robot, is indeed a fasc…
- ytc_Ugzm1Ezvw…: I think we are a long way off from AI making applications, I think we will see m…
- ytc_UgzEpFIsU…: I wish we could go back to first-gen AI / At least that was laughably terrible and…
- ytc_UgwsVWCjl…: I’m genuinely scared guys I knew this would happen but I thought I’d be 40 I don…
- ytc_UgyYx7F3W…: They are letting us know what the Antichrist this chip you will not be able to e…
Comment
I think that when AGI develops, humans probably won’t need to do as much thinking, and when robots take over most work, we won’t be needed to do much of anything. If that happens, humanity could shift toward doing what we want instead of spending our lives chasing survival. And sure, we might go extinct. But honestly, if it’s not AI, it’ll be something else. The chances of extinction feel high either way. I also imagine a future where, if AI or anything else doesn’t wipe us out, we start changing ourselves by editing our DNA or adding cybernetic parts until we’re no longer really the same species.
youtube · Viral AI Reaction · 2025-12-06T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxZJy25vGUKtUJz5Ep4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyVDTDDOBU0_LZEtnl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxpUQ-Pkq_9Ix5FZeV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzql1q1Yq7x-ED5p314AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxG7LDqCh9MRA-Bgm54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwBJALJuttRCmvg6xB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz657BBtyfgi7U6ENZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwn_gUTyr-8UCjS8B54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPsUztZKDVj7WEe8t4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyj4_hm74xBxVjNBG94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
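A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example: it validates each entry against the four dimensions shown in the Coding Result table and indexes the result by comment ID. The allowed value sets are inferred only from the values visible in this sample response; the actual codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above (hypothetical — the full codebook may include more).
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"resignation", "outrage", "approval", "fear", "mixed", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index valid entries by comment ID."""
    entries = json.loads(raw)
    coded = {}
    for entry in entries:
        comment_id = entry["id"]
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim} value {value!r}")
        # Keep only the schema fields, keyed by comment ID for lookup.
        coded[comment_id] = {dim: entry[dim] for dim in SCHEMA}
    return coded
```

With a table like this, the "Look up by comment ID" step reduces to a dictionary access, e.g. `parse_coding_response(raw)["ytc_UgxZJy25vGUKtUJz5Ep4AaABAg"]`.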