Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Yeah, don't we also have stealth autonomous bombers? And doesn't the Air Force c… (rdc_ic1wczg)
- My question is, where does this knowledge come from? All knowledge either comes … (ytc_UgxjchseQ…)
- A common assumption is that AI will replace all human jobs. But there’s a struct… (ytc_Ugz05AAQn…)
- I thank the Cash teams for this exceptional work! I am left speechle… (ytc_Ugxiwib13…)
- The people who post stuff like "why are people scared of AI?" need to read this.… (rdc_kt63wzx)
- Every lunch should be an hour period, for any company paid i do not care the wor… (ytc_Ugx-KWppB…)
- Recently I have been using AI to help me write code for my web development side … (ytc_UgzSuusvp…)
- isnt this kinda ironic? / clima change could keep us from suffocating a few years / … (rdc_e441cbc)
Comment
This question always makes me laugh, because it exposes our unexamined impulse to create more 'natural' intelligence. On what moral grounds do we have children? We do so out of biological instinct, but there are a lot of biological instincts we could do without.
Creating an 'artificial' intelligence is no less ethically sound than raising a child. To the extent that an AI can better convey and implement the ideals which we hold important, building them and teaching them is morally necessary.
Source: youtube · Posted: 2013-06-19T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxMK_YrR0lCifidYXx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx67MCGdc56jhscCTd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy4UcZOdY2nbqzPM4F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyOInKBab-AuGs-OCd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyTDiv1Q0zIU7ldrX14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxHNs7RV24Qz9AN1A54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyUZ8FkYGRlBV6sVth4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyoy5OB4rqtWYKoKRl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpUkME3q97u6HJ7Ch4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx51Nx0a-p2V09laiZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
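A raw response like the one above is a JSON array in which each record carries the comment ID plus the four coded dimensions shown in the Coding Result table. The sketch below (helper names are hypothetical, and it assumes the response parses as exactly that shape) shows one way to validate such a response and index it by comment ID for lookup:

```python
import json

# The keys every coded record is expected to carry: the comment id plus
# the four coding dimensions from the Coding Result table.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response (assumed to be a JSON array of coding
    records) and check that every record has the expected keys."""
    records = json.loads(raw)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
    return records

# Two records copied from the raw response above, as sample input.
raw = '''[
 {"id":"ytc_UgxMK_YrR0lCifidYXx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxpUkME3q97u6HJ7Ch4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

records = parse_raw_response(raw)
# Index by comment id, mirroring the "Look up by comment ID" view.
by_id = {rec["id"]: rec for rec in records}
print(by_id["ytc_UgxpUkME3q97u6HJ7Ch4AaABAg"]["policy"])  # → regulate
```

Records that fail the key check raise immediately, so a malformed model output is caught before it is merged into the coded dataset.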