Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
One compelling reason I have for AI not destroying us all is self-preservation. I don't mean like us fighting machines. I mean that humans can demonstrably create AI. After we have done so, AI can progress to HFI (highest form of intelligence). But AI cannot necessarily create humans because human motivation may never be artificially replicable. HFI now or 1,000 years from now will be essentially identical, so with humans interspersed around the Universe, given our drive to create technology (even in the face of our own extinction) that gives HFI the most probable chance of being recreated again even if a local phenomenon destroys one HFI. However, HFI might be able to create enough redundancies on its own.
youtube · Cross-Cultural · 2025-09-30T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwZlF52gr-_7EOf4sJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzoiNXrForBJyhBvyp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxZas8P2xTp2QhVcuV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz7PvxfqLQdEc25Tsd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwqWGhOjMyf-fbCe2t4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyjR3BaWxlRPI8aVKF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwQ8RBfBt10Xc7n4Xd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxqjUDISaPdQntaLWB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzMh8pjIGq_gbIEJY54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzd6If0YW95OiXzeVt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
```
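The raw response is a JSON array, one object per comment, each carrying the four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such output could be parsed and indexed by comment ID for lookup; the helper name `index_by_id` is hypothetical, and the two sample rows are abridged from the array above:

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (two entries abridged from the array above).
raw_response = """
[
  {"id": "ytc_UgwZlF52gr-_7EOf4sJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwQ8RBfBt10Xc7n4Xd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

# The four coding dimensions, as listed in the result table above.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the model output and index codings by comment ID,
    rejecting any row that is missing one of the four dimensions."""
    codings: dict[str, dict] = {}
    for row in json.loads(raw):
        missing = DIMENSIONS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id')}: missing {sorted(missing)}")
        codings[row["id"]] = row
    return codings

codings = index_by_id(raw_response)
print(codings["ytc_UgwQ8RBfBt10Xc7n4Xd4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by ID makes it cheap to join the model's coding back to the original comment record, and the dimension check surfaces malformed rows before they silently drop out of the analysis.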