Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect:

- `ytc_UgzgqAYiR…`: I've been seeing almost all of these you mentioned are coming, but those AI idol…
- `ytc_Ugyfophhp…`: Consciousness comes from self-differentiating, so AI that is doing this has some…
- `ytc_UgxBThB40…`: Could it be that AI replaces a lot of jobs but then we stagnate because AI can't…
- `ytr_UgxNSpsc9…`: i think your comparison would be better to skip the memories and instead the ai …
- `ytc_UgwNSTbG7…`: I use AI to recreate a character I was writing. After 45 minutes of detailed pr…
- `ytr_Ugz2rgHl1…`: @Andygb78 so humans won't want to make music anymore? AI music, so far, isn't …
- `ytc_UgxlBKVu4…`: "Jason M. Allen via Midjourney" is blatant disclosure. Why are you claiming he "…
- `ytc_UgzebN80q…`: What do we do with all that time? We volunteer what we want of our time, for the…
Comment
All these topics are important, but I'm getting really tired of this "speaking of existential risk from AI is a distraction from current problems" claim. Should we also not talk of the long term implications of climate change but instead focus on its current consequences only? It seems so utterly clear to me that both, the short term as well as long term dangers need to be taken into consideration, and claiming we should ignore one of them because it distracts form the other is incredibly short-sighted.
Source: youtube · Topic: AI Responsibility · Posted: 2023-11-06T13:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
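Each coded record can be checked against the closed code sets for the four dimensions. A minimal validation sketch follows, assuming the code sets are exactly the values observed in the raw responses on this page (the full codebook may define additional values):

```python
# Allowed codes per dimension, as observed in the raw LLM responses on this
# page. These sets are an assumption; the real codebook may contain more values.
CODE_SETS = {
    "responsibility": {"none", "distributed", "company", "government", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"mixed", "indifference", "outrage", "approval", "fear"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems found in one coded record; empty means valid."""
    problems = []
    for dim, allowed in CODE_SETS.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding result shown in the table above passes validation:
coded = {"responsibility": "none", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "mixed"}
assert validate_coding(coded) == []
```

A check like this is useful before storing model output, since an LLM can occasionally emit a value outside the codebook.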
Raw LLM Response
```json
[
  {"id":"ytc_UgyF6IfSN3VbGBs3U3Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwHqeUcFeWwY_BeopZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwMvzdj84njrG-ZJWp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw6j839QqQYsUOGkkV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxmiDdYRLsKvrPR9nd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugyg82mTX_D-Hay4UlV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwkRT1zJf9ESSvQWqZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyvAuyoxsti5Znfu994AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxrFUMp8QWZZT2cPqd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzUCuVRmsEFXUkvsUx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
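Looking up a coded comment by its ID in a raw response like the one above can be sketched as follows. This is a minimal illustration, assuming only what the page shows: the raw response is a JSON array of records, each carrying an `id` field alongside the four coding dimensions.

```python
import json

# An abbreviated raw LLM response, using two records from the full array above.
raw_response = '''[
  {"id": "ytc_UgyF6IfSN3VbGBs3U3Z4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzUCuVRmsEFXUkvsUx4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]'''

def index_by_id(response_text: str) -> dict[str, dict]:
    """Parse a raw response and index its coded records by comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codings = index_by_id(raw_response)
record = codings["ytc_UgzUCuVRmsEFXUkvsUx4AaABAg"]
print(record["policy"])  # → regulate
```

Indexing once into a dict gives O(1) lookups per ID, which matters when a batch response codes many comments at a time.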