Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugyy6coJE…`: his first statement is already wrong ("give it more compute it'll just become sm…
- `ytc_Ugx4ififT…`: It's not the AI fault the a I said no don't do it. Where are the parents. The ki…
- `ytr_UgzEOQT-D…`: Trust me after humans read their 100th fiction piece, or seen their 100th painti…
- `ytc_Ugw_DFscY…`: It looks like we are going to need our own Butlerian Jihad. To anyone reading t…
- `ytc_UgxNWLTEA…`: Kaku so are you not on planet earth right now cause Ai is here and it doesn’t ne…
- `ytc_UgxMmc3x6…`: you know that chatgpt is essentially an echo chamber, and will keep frameworks t…
- `ytc_UgyZfzQJH…`: Neil, you underestimate AI. You truly do. It will replace a billion jobs. Atleas…
- `ytc_UgwmeipG_…`: COMPASSION, EMPATHY, SYMPATHY, RELATING TO SOMEONE. That's what humans do that A…
Comment
> But I think it's also important not to villainize all those who create AIs by default. Not all of them do this with bad intentions. Around a year ago I made my own AI image generator, but I did it just as a passion project, I haven't even shared any images that came out of it. It actually feels so cool to make something that can generate something out of just some text. I'm not saying that AI image generation is a good thing and it should be more widespread or anything like that, I became sick of it too, but I don't think that all people who create AI generators have intentions on plagerizing anyone's work. To a lot of people working on, improving their models it is just what brings them joy in itself.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | Viral AI Reaction |
| Posted | 2024-11-03T16:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
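A coded record like the one above can be sanity-checked against the coding schema before it is stored. The allowed values below are inferred only from the labels that appear in this dump; the project's full codebook may define additional categories, so treat this as a sketch under that assumption.

```python
# Allowed values per dimension, inferred solely from the labels seen in
# this dump -- the real codebook may contain more categories (assumption).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "frustration", "outrage", "indifference", "resignation", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if coding.get(dim) not in allowed]

# The record shown in the Coding Result table above passes cleanly.
print(validate({"responsibility": "developer", "reasoning": "virtue",
                "policy": "none", "emotion": "approval"}))  # []
```

Any non-empty return value flags a record where the model invented a label outside the expected set, which is worth catching before aggregation.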
Raw LLM Response
```json
[
{"id":"ytc_Ugwf_2vlPsgcQxCX0aR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyk1k0xNAuuKYeBx254AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
{"id":"ytc_UgwLXsQPvkjZf7VNcFF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzBYmgPw1dKB-jfJjd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfaYYf3M7U0mwEHvJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzm96A27ELDpf0niXJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx5iUsbwgZYjwPsqwF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
{"id":"ytc_UgxTX8al1mDJ9FfZ3xN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzKRJyWHNFjxuX6NyN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyu0-om2NfATKpbl1R4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
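The raw response is a JSON array with one object per coded comment, which makes the "look up by comment ID" operation a simple parse-and-index. A minimal sketch, using two records taken verbatim from the response above (`index_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Raw batch response from the coding model: a JSON array where each
# element codes one comment on four dimensions. Two records are copied
# verbatim from the response shown above.
raw_response = '''[
{"id":"ytc_Ugwf_2vlPsgcQxCX0aR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyu0-om2NfATKpbl1R4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]'''

def index_codings(raw: str) -> dict[str, dict]:
    """Parse the model output and key each coding by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_Ugwf_2vlPsgcQxCX0aR4AaABAg"]["emotion"])  # approval
```

Keying on `id` also makes duplicate or missing codings easy to detect: if `len(codings)` differs from the number of comments sent in the batch, some IDs were dropped or repeated by the model.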