Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
This is total BS. Nobody has doubted that AI can do basic tasks like programming…
ytc_UgwO8h5bO…
AI is becoming extraordinarily worrying because with its advancements, it gets s…
ytc_Ugz1FocUk…
Wouldn't this also be an excellent argument for Transhumanism? Isnt there a deep…
ytc_UgyWZaz5G…
I used the free version of GitHub copilot and the tab autocomplete feature is tr…
ytc_UgwwdmLLi…
This video is the same hype and cope the AI companies are pushing with a sliver …
ytc_UgxbnJSvB…
@theodorelindsay4160 how do you need facts to point out that doomer speculation…
ytr_UgwRM4hCk…
How does AI have a will to do anything? It doesn’t. The human will is missing …
ytc_UgxsErl44…
I sometimes forget that this sub has 3.5M members and most of them have done zer…
rdc_m9jhiub
Comment
I think ChatGPT did great in this and his immediately being able to save 28 children from malaria analogy is dumb because he’s only giving money. He’s not pressing a button. The money could all be spent in nefariously could fail to make it to a child could be spent and used in effectively so he doesn’t have a direct connection to saving “exactly” 28 children he might only save one he might save 0 he might save 1000. Simply giving money is not saving anyone.
youtube
2026-02-17T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugw2J3k6a1iConRhq4t4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0TIDNvYCd3YPZDBt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxW7FVzKtv-oizYwYx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPcEfGPyQlozytM1N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyrabap4efdOmWx7l54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyu_0wfGnJ2fcQgPQl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwYjaOuWMiqKXEeAG54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxb2uf3fClL3BGU2Ch4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxs_O7ZWl02H0M9nsJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXlv3L45oODIGht954AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
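The raw response above is a JSON array with one coding record per comment. A minimal sketch of how such a response could be parsed and indexed for "look up by comment ID" access (field names taken from the response itself; the two sample records are copied from it):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (two records reproduced here from the dump above for illustration).
raw = """[
{"id":"ytc_Ugw2J3k6a1iConRhq4t4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0TIDNvYCd3YPZDBt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

records = json.loads(raw)

# Index the records by comment ID so a single coded comment
# can be inspected directly, as the dashboard does.
by_id = {rec["id"]: rec for rec in records}

code = by_id["ytc_Ugw2J3k6a1iConRhq4t4AaABAg"]
print(code["responsibility"], code["emotion"])  # prints: ai_itself indifference
```

Keying on `id` assumes the model returns each comment's ID unmodified; in practice a pipeline would also want to check for IDs that are missing from, or duplicated in, the response before trusting the lookup.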