Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "@ShadowEval not as it stands. Not with ai stealing, destroying, and ripping our …" (ytr_Ugy7_y72V…)
- "AI sounds about as sentient as most of the people I interact with and they have …" (ytc_UgyYY3FWt…)
- "All of you crying in the comments about Scott's AI employment take need to grow …" (ytc_Ugy_t5pev…)
- "I believe Ai will kill many of us (billion speaking) by all possible means mostl…" (ytc_UgyYcak1j…)
- "Remember that one movie starring Will Smith. I, Robot. This video somehow remind…" (ytc_UggrO82HB…)
- "That future isn't far away when army power will be decided by number of robot so…" (ytc_UgwRICwg0…)
- "Ai cant do anything much until the laws change to Ai can own something. I don't …" (ytc_UgxVNLPsW…)
- "I am SO referencing this specific case in my degree's final project for Computer…" (ytc_UgzKUTDvS…)
Comment
Neil is a great science communicator but his field is astrophysics. He should have told you AI doesn't exist, there is no fidelity. Also AI doesn't exist, machine learning does, which is programming not intelligence. The current form of LLM is a derivative of machine learning, but isn't AI because AI doesn't exist, there is no fidelity. AGI will never emerge from LLM even if they turn 99% of the surface of the planet into data centers. We are as far from AGI today as we were 200 years ago.
youtube · AI Moral Status · 2025-10-07T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyDtupO9bmltIr2M7N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwR8n1RS7C1QEWDnYx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyNiygHMsonXzIEeuZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxRxdFpGrBn0NrCX6J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwnlU8EL3XRvzkVP7N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxyF_zMoS82yaMyy694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx7IWMhFVEWmx_oxDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzYf5000temkNKkiWB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxz0Uj7CK4Vtqf3rih4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwHeNwep2Zfve0OQ1V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
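Responses like the one above can be parsed and looked up by comment ID. Below is a minimal sketch: the `SCHEMA` of allowed category values is inferred from the rows shown here (the real codebook may define more categories), and `index_by_id` is a hypothetical helper, not part of any tool named in this page.

```python
import json

# Allowed values per coding dimension, inferred from the raw response above.
# The actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"mixed", "unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"fear", "indifference", "outrage", "approval", "mixed"},
}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded rows) into an
    id -> row mapping, rejecting rows with out-of-schema values."""
    rows = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
        rows[row["id"]] = row
    return rows

# Example: look up the row behind the Coding Result table above.
raw = '''[
  {"id":"ytc_Ugxz0Uj7CK4Vtqf3rih4AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"none","emotion":"mixed"}
]'''
coded = index_by_id(raw)
print(coded["ytc_Ugxz0Uj7CK4Vtqf3rih4AaABAg"]["emotion"])  # mixed
```

Validating against a fixed schema at parse time catches malformed or hallucinated category labels before they silently enter the coded dataset.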