Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment preview | ID |
|---|---|
| I gave ChatGPT this prompt and now I feel like we're so doomed. Have fun.... … | ytc_UgyO9BGy1… |
| Not to offence but ai art it's like looking at bald head it's clean awesome, but… | ytc_UgwSg0aw-… |
| The problem with a question like this is you need a giant context to process all… | rdc_m2famdq |
| What about if I hire someone as a work for hire to make me an app. Wouldn't I o… | ytc_UgzCcbxap… |
| AI art is for lazy untalented people who won't take the time to make something b… | ytc_UgzUckJrX… |
| If nobody had money to buy things, then there is no point to making things with … | ytc_Ugxs0Tz5q… |
| Humans too are next token predictors, except the exact scope of the tokens/conce… | ytc_UgxwLPDTC… |
| The only thing A.I can't take away is creativity. As long as creativity persists… | ytc_UgwEmVtZA… |
Comment
There's no reason to think there's a limit to how smart AI will become assuming we have the energy, materials, and computational science down. There's also no reason to think AI will be any threat to humanity in the coming century. Yes, many tech representatives are worried and many are not, but average people like everyone in this comments section are only afraid of AI because of sci-fi films and not because of actual computer science and machine learning.
youtube · AI Moral Status · 2021-10-10T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
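Each coding assigns one value per dimension. A minimal sketch of a validator for such a coding follows; the allowed values are inferred only from the codings visible on this page, not from the full codebook, and the function name is illustrative:

```python
# Dimension values observed on this page; the real coding scheme may
# define more categories (these sets are assumptions, not the codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation"},
}

def validate_coding(coding: dict) -> list:
    """Return the names of dimensions whose value is missing or unrecognized."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding result shown above passes cleanly.
coding = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "approval"}
print(validate_coding(coding))  # → []
```

A non-empty return value flags a coding the model produced outside the expected scheme, which is worth surfacing in this inspection view.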
Raw LLM Response
```json
[
{"id":"ytc_Ugw6VoP3vsK29_glx5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvzvTgKmEyG6b6In94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw5xqGIF1HZ83PyxWd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwVuytIPsYmpMGZX1h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzRe3a-zQnwcwDkfmN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxilKuZT_4YYpmXklB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx6zhjlxve7QUC61Vt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxu7mTB2PoT9Jmy99J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwO6SJtPpLpkktJX9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzf_t0x57uxlfY0yNp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
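The raw model response is a JSON array of per-comment codings. A minimal sketch of how such an array might be parsed and keyed by comment ID, supporting the "look up by comment ID" view at the top of the page (the function name is illustrative; the two entries are copied from the response above):

```python
import json

# Two entries copied verbatim from the raw LLM response shown above.
raw_response = """[
{"id":"ytc_Ugw6VoP3vsK29_glx5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvzvTgKmEyG6b6In94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

def index_codings(raw: str) -> dict:
    """Parse the model output and key each coding dict by its comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_UgyvzvTgKmEyG6b6In94AaABAg"]["emotion"])  # → approval
```

In practice a real response may fail `json.loads` (models sometimes emit malformed JSON), so a production version of this lookup would wrap the parse in error handling and record the failure rather than crash.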