Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "See, I want an update on this! This is something I fully support with AI support…" (`ytc_UgxsTdr3A…`)
- "I don't think llms will reach ago. We will need something else for that. But llm…" (`ytc_Ugzqh_xXH…`)
- "For someone that has spent 15 years studying this he misses some fundamental poi…" (`ytc_UgwpYv2-x…`)
- "HELP MY MOM WAS LOOKING THROUGH MY APPS AND SHE SAW AI CHAT SO SHE OPEND IT AND …" (`ytc_UgzH8q8Tp…`)
- "UC Davis has popularized the idea of 3 Revolutions in urban transportation: auto…" (`ytc_UgxKQI3PP…`)
- "Data scientists use AI…for efficiency and effectiveness. This guy should stick t…" (`ytc_UgziA4NnJ…`)
- "Contact your representatives at each level of government and explain politely, w…" (`ytc_UgweeGwkP…`)
- "MashaAllah sister. May Allah reward you for your courage standing for truth.. Al…" (`ytc_UgxUdP4oe…`)
Comment
I made a comment on another channel, stating that AI can only react. Since AI cannot act autonomously, it cannot be conscious. Someone replied, pointing out that we also only react in a similar way. This made me realize that we say AI isn’t conscious because we have a general understanding of how it works. Conversely, we claim to be conscious because we don’t fully understand how our minds work. In other words, if we were to discover that our actions result from a straightforward process in the mind, it would weaken our claim to consciousness. If we follow this line of thought to its ultimate conclusion, it suggests that our actions might be traced back to the beginning of time.
Platform: youtube · Video: AI Moral Status · Posted: 2024-11-04T01:4… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgweqemhZGtlBqfigA94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"curiosity"},
  {"id":"ytc_UgyMVmuZOK6VuAiITah4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz6YoKeBH_7aNUIwut4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwbFJtk4sfTOkTRjmx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyEoBVu3PkEKtIW7El4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyUk7skP6I63UI3zEt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwtHHz0wFmtqFHjL4B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwQkKXidVqMejjJASd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxQJlBz39zX677irjB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyR7M5jv9dffo2Snmh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
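A response in this shape can be parsed into per-comment codings before it is displayed. The sketch below assumes only what the sample above shows: each row carries an `id` plus the four dimensions `responsibility`, `reasoning`, `policy`, and `emotion`. The comment IDs in the sketch are invented placeholders, and the validation logic is illustrative, not the tool's actual pipeline.

```python
import json

# Illustrative raw response; the IDs here are made-up placeholders,
# not real YouTube comment IDs from the page above.
raw = '''
[
  {"id": "ytc_example1", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_example2", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"}
]
'''

# The four coding dimensions observed in the sample response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_codings(text):
    """Parse a raw LLM response into {comment_id: {dimension: value}}.

    Raises ValueError if any row is missing its id or a dimension,
    so malformed model output fails loudly instead of silently.
    """
    out = {}
    for row in json.loads(text):
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row: {row!r}")
        out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out


codings = parse_codings(raw)
print(codings["ytc_example1"]["emotion"])  # -> resignation
```

Keying the result by comment ID is what makes the "Look up by comment ID" view above cheap: rendering one comment's coding table is a single dictionary lookup.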