Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or browse the random samples below.

Random samples:
- ytr_UgzAXihLD…: "@TechnoMageCreator truth ppl will be kinda of dismantled, like throw who u r out …"
- ytc_UgyelKgEb…: "I really want to see what happens when AI learns from AI, especially at this ear…"
- ytc_UgyUAuO4y…: "I don't believe it for a moment. First of all, not all of the cars on the road w…"
- ytc_UgyKrcUr-…: "Call me old fashioned, but when I learned computing, Computers controlled what t…"
- ytc_Ugx_nv2HU…: "I dont get people are upset when AI cant recognize their face. I'd be thrilled t…"
- ytc_Ugw7s9Xbc…: "There is one failure to AI. As of the current, it cannot initiate a conversation…"
- rdc_cthny1g: "You beat me to it! But this a troubling question. Biological organisms are genet…"
- ytr_UgycallzU…: "amazinggrapes3045 not a fair comparison. There are various types of arts etc. If…"
Comment

> AI still cannot fathom the distances of pain and pleasure. Unless and until AI is capable of sharing genetic code. Which in turn translates to survival instinct. AI is far less than a fraction of human capabilities and may require 4-5K years to achieve the conscious and emotional levels of today's humans. But by that time, humans will be planes and levels far beyond the need or desire to employ AI.

youtube · AI Moral Status · 2025-01-05T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
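Each coded record fills the same four dimensions shown in the table. A minimal validation sketch, assuming only the value sets observed in this section's sample output (the full codebook may define additional values; `OBSERVED_VALUES` and `validate` are hypothetical names, not part of the tool):

```python
# Check a coded record against the dimension values observed in this
# sample; the real codebook may allow more values than listed here.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "none", "industry_self", "regulate", "liability"},
    "emotion": {"mixed", "disapproval", "indifference", "resignation",
                "fear", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the observed set."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if record.get(dim) not in allowed]

# The record from the Coding Result table above passes cleanly.
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(validate(record))  # → []
```

A record carrying an unseen value (say, `"policy": "ban"`) would come back flagged, which is one way to catch the model drifting off-schema in a batch run.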
Raw LLM Response

```json
[
  {"id":"ytc_UgzPpvrsdu_XNP-HQDt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzh6dgzStQiJoYiP_h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_Ugzm9P9ZxxDAikKijjx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyS6WZDZRPyrzLPdf94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNd6jh-ZgfIZvAI7t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwKw-kE_HpzpAaJjAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwxx6GT0N5HIUAMITN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzPqBhMdT9cu3sOZ-h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwNQRPEnvsWGdKJlv94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyRLtsxDeSGjv9vhVh4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
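The raw response is a JSON array of per-comment records, so supporting "look up by comment ID" reduces to indexing that array. A minimal sketch, assuming the response text has already been captured as a string (the two records here are excerpted verbatim from the response above; `by_id` is a hypothetical name, not the tool's API):

```python
import json

# A two-record excerpt of the raw LLM response shown above.
raw_response = """[
{"id":"ytc_UgyS6WZDZRPyrzLPdf94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwKw-kE_HpzpAaJjAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Index the records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = by_id["ytc_UgyS6WZDZRPyrzLPdf94AaABAg"]
print(rec["emotion"])  # → resignation
```

In practice a model can return malformed JSON or duplicate IDs, so a production version would wrap `json.loads` in error handling and check for collisions before overwriting entries.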