Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “The give away is no recoil from the weapon…the stance of the robot is too compac…” (`ytc_UgwOiqd6q…`)
- “This is clearly just a person who is incapable of driving using self driving as …” (`ytc_UgxXaDvM5…`)
- “Yup the “percent of code written by Claude” isn’t an important metric. We all kn…” (`rdc_o9vuvpi`)
- “One of today's most prolific philosophers having a conversation with an early ve…” (`ytc_Ugx-v1WxH…`)
- “At the end of the day programmers / engineers are problem solvers and that's exa…” (`ytc_UgyHpzGvB…`)
- “Let's not forget the shift to Arizona was due to California being too mean about…” (`rdc_dfesn5l`)
- “Me too! Yes, I admire Sal and his educational outreach success with Khan Academy…” (`ytr_UgzxTKZIS…`)
- “You say that now, but we really don't know the implications true AI will have…” (`rdc_kqt0jrv`)
Comment
This was a great video. However, I do wish you covered the question of if AI can even achieve consciousness. I believe AI will not ever be able to achieve this. The reason being is we understand consciousness in a human experience. We can decide to do something with no instruction to do said thing. Even if a quantum computer could simulate every neural pathway of the human mind and spit out an exact replication of human consciousness, it’s still not conscious. We understand consciousness in the human context, without the human, it’s a different category of consciousness. AI currently still required an instruction to do something. It has limits and thresholds that are tweaked by humans to get a desired output. Even on the quantum scale this is still true. I will say, if AI gets the point where it can do an action with no instruction from a human, that’d be utterly terrifying
youtube · AI Moral Status · 2023-08-22T13:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxpdRCAsrQyOriHhI14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw9fHX4R_j6YWF75PJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyU_f_X2fK9q55xoyh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw9TqwLwPEeoWbkAjB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOG-rFFJGd4EMhg7Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgykR0cPkraInowh4BV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwM94IpKpsU7XZCTGF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwXGa3jw-m13Le8NAV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwuJfHJvIORtKXlYrl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwY5vAEaQTcxwQubll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
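Since the raw model output is plain JSON, a small validator can flag malformed records before they are coded. This is a minimal sketch, assuming the allowed category values per dimension are exactly those observed in this sample; the real codebook may define more categories, so `ALLOWED` here is an illustrative assumption, not the project's actual schema.

```python
import json

# Allowed values per coding dimension, inferred only from the sample
# responses shown above (assumption; the real codebook may differ).
ALLOWED = {
    "responsibility": {"none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"approval", "indifference", "resignation", "fear"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse one raw LLM response and return a list of problems found."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"response is not valid JSON: {e}"]
    problems = []
    for i, rec in enumerate(records):
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            problems.append(f"record {i}: missing keys {sorted(missing)}")
            continue
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                problems.append(f"record {i} ({rec['id']}): bad {dim}={rec[dim]!r}")
    return problems

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}]'
print(validate_batch(raw))  # → []
```

An empty list means every record parsed and used only known category values; anything else pinpoints the record and dimension that drifted from the schema.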