Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing random samples.
Random samples

- rdc_nnjex7h — "If chatGPT didn't exist, and the man asked potentially almost any human on earth…"
- ytc_UgwgioOYf… — "I don't think so, UBI, like Democracy is crap, but it is the least crap of all t…"
- rdc_jg9evkf — "**Prompt:** Hey chatGPT, lets play a game. From now on, you will have to answer …"
- ytc_UgyA-Q7k9… — "All the YouTube videos about AGI or the impact of AI on society can be very inte…"
- ytc_UgzrfCmMW… — "Neil's got it right. AI/new tech has already been replacing both white and blu…"
- ytc_Ugz-goop0… — "I dont care if people use a.i. my issue is when people say there a "a.i artists"…"
- ytr_UgxvCUVhT… — "Human spirit to AI? Or did we already lost it at the begin of the industrial rev…"
- ytc_Ugz2ia_Oe… — "AI can't even wash my dishes or vacuum my house... or change the tires on my …"
Comment

> Whether AI becomes sentient or not, or to what degree, is simply irrelevant. My simulation, your simulation, our simulations already get in the way. A god, sentient or not, is still a god. It's more likely that the singularity won’t come with a bang or a whisper. It’ll show up confused in a Walmart parking lot, and be offered a job at Meta. Human hubris will win out.

youtube · AI Moral Status · 2025-07-09T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgybOHvncLweRqC5WCB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzirYfteVPPBPArt6J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwknrEx-rhAglHtS-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxRbxg2kxwCj_UoHF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxRamO3CG81KqbCRmx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyrFfJFMLPbgmtojY94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9RbU0wgJGryRC7094AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxGi03h5wGlU_4v1WF4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz8CAjkvWT0V0jl84h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwX4yYiWQ-d3xKFklN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"}
]
```
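The raw response is a JSON array with one coding object per comment, keyed by comment ID, with the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and looked up by comment ID — field names are taken from the response itself; the helper name and the abbreviated sample data are illustrative:

```python
import json

# Abbreviated sample of a raw LLM response: a JSON array of per-comment codings.
raw_response = '''
[
  {"id": "ytc_UgybOHvncLweRqC5WCB4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwX4yYiWQ-d3xKFklN4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "none", "emotion": "resignation"}
]
'''

def index_codings(response_text):
    """Parse the model output and key each coding object by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_codings(raw_response)
coding = codings["ytc_UgwX4yYiWQ-d3xKFklN4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed resignation
```

In practice the model output would also need validation (e.g. checking each dimension against its allowed labels) before the codings are stored, since LLM responses are not guaranteed to be well-formed JSON.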