Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> As long as it’s based on a statistic model (all existing AI’s), it won’t be conscious.
> Just read one technical book where AI math is explained and stop phylosophing around it, you’re just making people misunderstand and fear it more.

youtube · AI Moral Status · 2025-03-13T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
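The coding result above assigns one value per dimension. As a minimal sketch of how such a record could be checked, the snippet below validates a coded comment against the category values that actually appear on this page; the real codebook may contain additional categories, and the `ALLOWED` table here is an assumption inferred from the visible examples.

```python
# Illustrative validation of one coded record. The allowed values below are
# only those observed on this page; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "approval", "sadness", "mixed"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the known codes."""
    return [(dim, record.get(dim)) for dim in ALLOWED
            if record.get(dim) not in ALLOWED[dim]]

# The record shown in the Coding Result table above.
coded = {"responsibility": "none", "reasoning": "deontological",
         "policy": "none", "emotion": "indifference"}
print(validate(coded))  # [] — every dimension uses a known code
```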
Raw LLM Response
```json
[
  {"id": "ytc_Ugx-g0JhG6kO3GTnRTF4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxzVsyMx2ubmfwRmPZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyOwIDeJ5o7etc5Czh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "sadness"},
  {"id": "ytc_Ugzap8Gn8h24MuCDHRh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw9hZ11V7i1pSRz8ex4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzUqbBJACD7902_hi14AaABAg", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzGaRixydj-yPm3W2t4AaABAg", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwLZpRcgfvKUtEaEp54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxczyGIN8rOQsIepaB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxtIqZSDrF-TlrCbK94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
```
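The raw response is a JSON array in which each record carries a comment ID plus the four coded dimensions. A minimal sketch of how such a batch could be parsed and keyed by comment ID for lookup (the `raw_response` string below is a two-record excerpt drawn from the array above; `index_by_comment_id` is an illustrative helper, not part of any tool shown here):

```python
import json

# Two-record excerpt in the same shape as the raw LLM response above.
raw_response = '''
[
  {"id": "ytc_UgxczyGIN8rOQsIepaB4AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwLZpRcgfvKUtEaEp54AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse a JSON batch of coded comments and key each record by its ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxczyGIN8rOQsIepaB4AaABAg"]["reasoning"])  # deontological
```

Keying on the ID is what makes a lookup-by-comment-ID view cheap: one parse of the batch, then constant-time retrieval per comment.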