Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I vote we get rid of AI all together. The world worked without it just fine, why…" (ytc_UgwIYOZJJ…)
- "I think Gen X and Millennials are going to be affected the most by AI, baby boom…" (ytc_Ugx0pLJBP…)
- "I wish all the ai artists just did the funny gordon ramsay videos were you just …" (ytc_UgxCQO9YM…)
- "Just tried to have a chat bot write up a statement of work. I spent more time fi…" (rdc_n5ifns9)
- "Why can’t AI have rights too? If an Alien lands on earth, will they have rights?…" (ytc_Ugzmu9hD7…)
- "Almost completely agree with all that you said. But for the jobs, I prefer to st…" (ytr_Ugx6odhIU…)
- "If they gave Sophia the robot human rights then why did they take her apart and …" (ytc_Ugwk-zXZL…)
- "for me, I accept AI pictures as art if the promt will always gives the same pict…" (ytc_UgxLUCpvR…)
Comment
> Another outcome I can imagine is humanity becoming something akin to crows living in large cities -- crows can make a living in such environments, they may even prefer them, but crows, clever though they are, have _no idea_ how anything around them works, how it was built, _that_ it was built, nor do they have any idea what the hell humans are doing going about like they do; they don't even know that they don't know anything about the human civilization economy, science and engineering, etc.
>
> We may end up in an absolutely baffling world that makes extremely little sense to us, that we can nevertheless get by in. And we may only occasionally be "managed" by the AI civilization when and where we are a nuisance.
>
> I guess there are worse possible outcomes.
Source: youtube · Topic: AI Governance · Posted: 2023-06-28T07:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx3xw4e8ocKyU_ZQVB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvFU5BKq0WWt53Omp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyKKv9bE8upTa5Sbyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyVKsTelwk9yglYzrV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxPb-0iY4pi7iqejet4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzHCTk_4zxgU8wyWEp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzzypf_v-asdoe7_Nh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxIX-TJyadxzvqSmI94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxfe8sR2whVY9uI4Jd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwETgxNS3mPOdvrjtV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
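The raw response is a JSON array of per-comment codings, one object per comment ID, each carrying the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step — parsing such a response and indexing it by ID — might look like the following. The function name `index_by_id` and the validation logic are illustrative assumptions, not the tool's actual implementation; the sample data reuses two entries from the response above.

```python
import json

# Raw model output in the format shown above (two entries reused as sample data).
raw_response = """
[
  {"id":"ytc_Ugx3xw4e8ocKyU_ZQVB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyKKv9bE8upTa5Sbyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
"""

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    rejecting rows that are missing any expected dimension.
    (Hypothetical helper, for illustration only.)"""
    index = {}
    for row in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in row]
        if missing:
            raise ValueError(f"{row.get('id')}: missing dimensions {missing}")
        index[row["id"]] = row
    return index

codings = index_by_id(raw_response)
print(codings["ytc_UgyKKv9bE8upTa5Sbyl4AaABAg"]["emotion"])  # resignation
```

Indexing by ID rather than list position makes the lookup robust when the model returns rows in a different order than the comments were sent, and the per-row check surfaces malformed model output early instead of failing later in the pipeline.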