Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The majority of us will be of no use, there will be no need for us anymore, we w…" (ytc_UgwGP69yK…)
- "It’s a very very good deepfake but u can still tell it’s fake especially around …" (ytc_UgwhZbYOC…)
- "I can tell my the red in your eyes... What's got you head tripping on the proppe…" (ytc_UgxEmijg1…)
- "Studies have shown that plants feel pain. Do we grant rights to plants now? Scie…" (ytc_UgwDAjRlL…)
- "Wait..doesnt Turing predate Hinton? And wasnt it Gibson that made AI out to be t…" (ytc_Ugzpffw-0…)
- "I’m all for capitalism. But this is industry ruining neighborhoods. There are …" (ytc_Ugzf-AA29…)
- "The machine supplier explanation is utter bullshit. \"Motion sensors\" don't need …" (rdc_ks7qnnc)
- "Interesting until you mentioned dignity. “You work or you sit on your ass” is ba…" (ytc_UgwjRuP-y…)
Comment
Interesting podcast, but the author still needs to sell his book. If he said that we are 40 or 50 years away from even having a chance to do it, he would sell fewer books. Also, the history of LLMs did not start in 2017 as he “mentioned.” From my point of view, this is misinformation — the first articles about neural processing of language were published before 2000. Language processing is even older - 1950. Yes, transformers were crucial, but, this is not how long we are working on this. Whole development did not start 2016, it takes 20 years to get there (and we still omit 40 years of theoretical work before).
Source: youtube · AI Moral Status · 2025-10-31T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz7UMuzISfDB2d4XLR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzPzpx6ketGiJ9xIex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy5cyJPKe9K1Lh5gD54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyDqVJmqxoVbRWMdMZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxAh6axa2AuhhQoOKV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx1xOQ1h23lJSEgmIp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw-oDUYiVr8AWRLXu94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwAlDe4Sg1BZrH2x314AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwSAEAsiRw5R_WOT-x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyUhxYaeFCcZgD5sZl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
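The raw model response is a JSON array with one object per comment, keyed by the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how a lookup-by-comment-ID could work on such a response, assuming the payload parses as a plain JSON array (the field names come from the response above; the helper name `index_by_id` is illustrative):

```python
import json

# Two rows copied from the raw response above, used as sample input.
raw_response = '''[
  {"id":"ytc_Ugz7UMuzISfDB2d4XLR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5cyJPKe9K1Lh5gD54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and index each coded row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_id(raw_response)
print(codes["ytc_Ugy5cyJPKe9K1Lh5gD54AaABAg"]["policy"])   # -> regulate
print(codes["ytc_Ugz7UMuzISfDB2d4XLR4AaABAg"]["emotion"])  # -> indifference
```

Indexing by ID keeps the lookup O(1) per comment, which matters when one response batch codes many comments at once.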