Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Elon is advocating for an Open source AI that benefits all humanity. Sam is advo…
ytc_UgxSK8MS3…
0:43 So "AI art" began all the way back in 2018?
I think back then, it would ha…
ytc_UgwUjenE9…
As a 15 year IT cybersecurity engineer, watching Elon Musk talk about AI “termin…
ytc_UgzUF4BKH…
I think it's a good measure to assume that any "hallucinations" are the AI probi…
ytc_UgxxbVen4…
The issue is probably having a big enough and varied enough dataset to train the…
rdc_f1e9j0f
Do AI actually think for itself without being told what to think or analyze?
for…
ytc_UgyZnX5b2…
It’s not necessarily about the morals of AI, but rather the morals of those who …
ytr_UgxtY-kQP…
It's not possible for this to actually happen. Being conscience and acting consc…
ytc_Ugxf_fAog…
Comment
Who says it must become conscious? All that is necessary is that it becomes effective at some specialized tasks that are difficult for humans. Regardless of the question of consciousness, the ability to automatically perform the tasks would still be economically valuable, leading to widespread adaptation of "AI" and related technology (which is, at its core, a set of heuristics with many adjustable parameters that can be trained to optimize performance on some specifically-defined tasks with respect to some sets of data).
If AI is generally accepted as non-sentient, so much the better: then ethics don't apply to it.
youtube
AI Moral Status
2025-08-05T08:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwKlUtbqhcjBouG5O54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfHK1FDg-UIKgE4fp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgybVF8jIVrCyByirZB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzxsN6PigQnhLEBoRR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzVPWOktWSX0WrcmWB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyd6RnYMGMpwSb2R2R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0M_JOCCnnM2IY5ch4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwiVCvHNFPbr2unAZB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwAd2Y6CFmg-gjaie54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx3P7jpxm0DsB6FqU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
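The raw response above is a JSON array in which each record carries a comment `id` plus the four coding dimensions, so "look up by comment ID" amounts to indexing the parsed array by `id`. A minimal sketch, assuming only the JSON shape shown above (the `raw_response` string is an abbreviated sample, and the lookup helper is illustrative, not part of the tool):

```python
import json

# Abbreviated sample of a raw LLM response: a JSON array of coded comments.
raw_response = """
[
  {"id": "ytc_UgwKlUtbqhcjBouG5O54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgybVF8jIVrCyByirZB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
"""

# Index the coded records by comment ID for constant-time lookup.
coded = {record["id"]: record for record in json.loads(raw_response)}

# Retrieve one comment's coding result by its ID.
result = coded["ytc_UgybVF8jIVrCyByirZB4AaABAg"]
print(result["reasoning"])  # mixed
print(result["emotion"])    # mixed
```

The dict comprehension mirrors what the inspector's ID lookup must do internally: one pass over the response builds the index, after which any coded comment resolves directly from its ID.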