Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by picking one of the random samples below.

Random samples
- "Tbh the people who explicitly state they use AI I don’t have as much of an issue…" (ytc_UgybZ_LYF…)
- "I like AI art in the way that it’s fun to see how it messes up and interesting t…" (ytc_UgwckZbUF…)
- "As usual, all the normies don’t understand AI at all, and the government getting…" (ytc_Ugwozp9vO…)
- "Nothing major is going to happen with AI. 95% AI implantations have failed. Whol…" (ytc_Ugwi6IdZT…)
- "AI can barely do basic math half the time. What I know is that it's going so bad…" (ytc_UgzceLH4z…)
- "Self driving cars contribute to the economy like bus drivers taxi drivers uber d…" (ytc_UgyRE9y8t…)
- "Where was the guild supporting other industries when technology was taking peopl…" (ytc_UgzS_9Wki…)
- "I think most senior Engineers agree with Prime on this. AI is a cool tool. Just …" (ytc_Ugz1nxLoD…)
Comment

> IMO consciousness requires memory and change which LLMs do not have after they are already trained and static. During training and fine-tuning, or intervals between, that's another question. Train a base LLM and then fine-tune it on conversations that it then recalls in a new conversation as a slightly new LLM in the new moment. Is that consciousness?

youtube · AI Moral Status · 2025-10-30T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxxuL0rIDRv6S4onAp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxV2YgRxgdc1F1hK-R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxxMcFp938sqEB2x6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx51tCuxt7S0BiUp614AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwYohzxjxoYmuBkcrV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugy1pg6e_fFmqKOJTHF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxFZpLLvJEtoqFWd654AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwRwTJYJFvGhe5WBGd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhB5pcpXVKzCtGOUx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-BFV-_V6K0ci-9zt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"}]
```
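A minimal sketch of how the lookup-by-comment-ID view could be backed: parse the model's raw JSON array (each element carries an `id` plus the four coded dimensions) and index it by `id`. The function name `lookup` and the two-record `raw_response` excerpt are illustrative assumptions, not the tool's actual code; the IDs and dimension values are taken from the response above.

```python
import json

# Excerpt of a raw batch response from the coding model: one JSON object
# per comment, keyed by the comment ID (illustrative two-record sample).
raw_response = """[
  {"id": "ytc_UgxxuL0rIDRv6S4onAp4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxxMcFp938sqEB2x6t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

def lookup(raw: str, comment_id: str) -> dict:
    """Parse the model's JSON array and return the record for one comment ID."""
    records = {rec["id"]: rec for rec in json.loads(raw)}
    return records[comment_id]

result = lookup(raw_response, "ytc_UgxxuL0rIDRv6S4onAp4AaABAg")
print(result["emotion"])  # indifference
```

Indexing by `id` once makes repeated lookups O(1), which matters when the same batch response is inspected for many comments.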