# Raw LLM Responses

Inspect the exact model output for any coded comment, or look a comment up by its ID.
## Random samples

- "Google is doing great job to keep AI productive for humen. Remember, there was a…" (ytc_Ugy1qDlV3…)
- "Writting code fast has never been the issue, that's what AI does and that's what…" (ytc_Ugy6UKgs-…)
- "This needs to be destroyed, a Robot devoid of human emotion, It doesn't have a s…" (ytc_UgxB5LsoY…)
- "Are we really calling these people artists? They put a prompt and the AI does al…" (ytc_UgxcGPSfL…)
- "In this faar it is. I Calle it "woke i". Real AI dosen't Care about politics o…" (ytc_UgwbNh8ZU…)
- "I really wanted to become an artist when I grow up but I'll probably have to do …" (ytc_Ugy7K4NUl…)
- "AI art could be beautiful and objectively amazing and it will still have no valu…" (ytc_UgzjZAAxD…)
- "The interesting part is that Tesla *will need* to be able to run intelligent eno…" (ytc_UgwOdOhaX…)
## Comment

> Isn't AI a logical evolution of intelligence (and possibly consciousness)? Like a caterpillar shedding it's skin to become a butterfly? We humans seem to be bad at a lot of things, including taking care of our own planet. Maybe this is actually what we need. Not saying it is, but saying it could be.

youtube · AI Governance · 2025-12-04T14:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
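The four dimensions above can be checked mechanically. Below is a minimal sketch of such a validator, assuming the value sets inferred from the raw responses shown in this section (the real codebook may define more values; `SCHEMA` and `validate` are hypothetical names, not part of the tool):

```python
# Allowed values per coding dimension, inferred from the raw LLM
# responses in this section. This is an assumption, not the official
# codebook.
SCHEMA = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "fear", "outrage", "indifference", "unclear"},
}

def validate(coded: dict) -> list[str]:
    """Return the dimensions whose value is missing or not in the schema."""
    return [
        dim for dim, allowed in SCHEMA.items()
        if coded.get(dim) not in allowed
    ]

# The coded comment from the table above passes cleanly.
example = {"responsibility": "none", "reasoning": "consequentialist",
           "policy": "none", "emotion": "approval"}
print(validate(example))  # → []
```

A non-empty return value flags a row for manual review rather than rejecting it outright, since the model occasionally emits values outside the expected set.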
## Raw LLM Response

```json
[
  {"id":"ytc_UgxhNhyo_zmGfaSgNnx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzi70UQKeAnDkqg0_54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxtMi-kto18UCPHCu14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgwA9ZI9baa1Pb2_0mR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwbeks40lI9SX4pBHx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyyzHtlCz5y5cti_Gd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3LcxGMCLzxUMeZf14AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwucjlkdtpFnLCuYQp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwCJbkb2sliJVZ2O014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgylV5Kg6xWWLDV2R_x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
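The model returns one JSON array per batch, so looking up a single coded comment means parsing the array and indexing it by `id`. A minimal sketch, truncated to two entries for brevity (in the tool the string would come from the stored raw-response field):

```python
import json

# Two entries from the raw LLM response above; the full batch has ten.
raw_response = '''
[
  {"id": "ytc_UgxhNhyo_zmGfaSgNnx4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzi70UQKeAnDkqg0_54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
'''

# Index the batch by comment ID so any coded comment can be fetched
# directly, which is what "Look up by comment ID" amounts to.
by_id = {row["id"]: row for row in json.loads(raw_response)}

row = by_id["ytc_UgxhNhyo_zmGfaSgNnx4AaABAg"]
print(row["emotion"])  # → approval
```

Building the dict once per batch keeps each subsequent lookup O(1), which matters when inspecting many comments against the same stored response.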