## Raw LLM Responses

Inspect the exact model output for any coded comment.
### Random samples

- `ytc_UgwsEjkvP…`: Someone actually gave him a ribbon for AI art?? Or did he give himself a medal?…
- `ytr_Ugz7NLN0L…`: Could you talk a bit about what moved you in that direction. What did you believ…
- `ytc_UgxereY4s…`: I honestly would enjoy seeing an incompetent CEO dealing with an AI. It would be…
- `ytc_UgwtVavwj…`: You know the funny part people that say is ai art but I have one question. What …
- `ytc_UgxqRHOiU…`: i dont know if you try to generate somethiung but its not like i put something i…
- `ytr_UgxQNbA4_…`: Look at the AI use cases they commentate on, separate your opinions on AI based …
- `ytc_UgywvMWMo…`: They definitely know I've figured them out..my Infiniti is a data mining pig..ne…
- `ytr_UgydUMu16…`: That's an interesting comparison! Sophia's pursuit of wisdom is indeed inspiring…
### Comment

> Is there a way to lead with compassion and curiosity when trying to train these models? Like early childhood development for robots? Because while it would be all well and good in theory to have an intelligence alligned with our own, if it has that ability it would also have its own way of experiencing the world which would inform its perspective--and is there not value in... if we find we are going down that route, being open to that possibility

youtube · AI Moral Status · 2026-03-07T03:3…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgzM_r-IAAM_ZO8r3E14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxXnolJKKB9Ov6zuqJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzzRoQGwAm3lYyvW-t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwaRVkMByV2LC2Q-P54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzoXjsFWfPiNIShCPB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyNPypgDp0y7R4ByCV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxHiUBn5EHlYiyf0al4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzo4kLJL-RulOwq8GF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwqL3IHkT9t1r-iXpl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyyYFnwqUibgCJfG894AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"}
]
```
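A raw batch response like the one above is only useful downstream if every record parses and every dimension takes an allowed value. The sketch below shows one way to validate such a batch in Python; the allowed-value sets are inferred from the sample records shown here (the actual codebook may define additional categories), and `parse_batch` is a hypothetical helper, not part of the tool itself.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above -- hypothetical; the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation",
                "indifference", "mixed", "unclear"},
}


def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records.

    A record is kept if its id looks like a YouTube comment/reply id
    (ytc_/ytr_ prefix, as in the samples above) and every dimension
    holds an allowed value.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid


if __name__ == "__main__":
    raw = ('[{"id":"ytc_example","responsibility":"developer",'
           '"reasoning":"virtue","policy":"regulate","emotion":"approval"}]')
    print(len(parse_batch(raw)))  # prints 1
```

Records that fail validation are dropped rather than repaired here; in practice you might instead log them and re-prompt the model for just those comment IDs.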