Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID.
Comment
Pretty sure Elon Musk is not a “good” guy.. he wants AI developers to pause for exactly the reason Gates says the “good” guys shouldn’t. Pretty sure Musk would assume control of AGI if he could.. right after signing that letter, he spent 100s of millions on AI development equipment — look it up
Source: youtube · 2023-05-22T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxS9bTmjKyrHX0cvih4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzNLX_QShQkk2aXIbB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwwBlrDDLIGD9RdZe54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzcdJ74zjPsod_pr0x4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzgi3kiKMfO3aQaukV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyN6iROIv1k2ShXbvx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzfuhHF5go9siA5RZF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZ81B88mO9uIwBkm54AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwCvQ3pzYxpEcLUmRR4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-9rhSZbK0uM6X5714AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
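The lookup-by-ID view above can be reproduced offline: parse the raw LLM response, index each coding by its comment ID, and keep only rows whose dimension values belong to the coding scheme. This is a minimal sketch, not the tool's actual implementation; the field names come from the JSON shown above, the allowed categories are only those observed in this one response (the full codebook may be larger), and the comment IDs below are hypothetical placeholders.

```python
import json

# A stand-in raw LLM response, shaped like the JSON array above.
# IDs here are hypothetical, not real comment IDs from the dataset.
raw = '''[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Categories observed in this response; an assumption, not the full codebook.
SCHEMA = {
    "responsibility": {"developer", "government", "distributed", "ai_itself", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def index_codings(payload: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    dropping any row with a value outside the known categories."""
    by_id = {}
    for row in json.loads(payload):
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            by_id[row["id"]] = row
    return by_id

codings = index_codings(raw)
print(codings["ytc_example1"]["emotion"])  # fear
```

Indexing by ID makes the "look up by comment ID" step a constant-time dictionary access, and the schema check catches malformed or hallucinated category labels before they reach the coding-result table.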