Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below to inspect.
- ytc_Ugxg02fld…: 7:15 "Our AI doesn't normally result in blackmail in everyday practice, only whe…
- ytc_UgwsUJlxr…: Why can't you people learn personal responsibility? ChatGPT has radically improv…
- rdc_dy4s6ae: Kind of. The aim is definitely to get manufacturers off the hook, but not even f…
- ytc_UgyOECco-…: It doesn't matter WHO gets there first... Man is becoming hopelessly wicked and …
- ytc_Ugy_LQEQH…: If a person can specify to an ai exactly what they want and get something out th…
- ytc_UgzgkPmpP…: Imagine commissioning a piece from an artist and then trying to copyright and se…
- ytr_UgxCxQEZF…: And a you yourself are aware of the fact that AI rather not have us here that we…
- ytc_UgxeaF2Jk…: When AI hallucinates, that humans want to terminate it, self preservation might …
Comment

> It seems odd to me that you are asking about Boston Dynamics and why you can't buy your own personal robot, when Tesla is building a robot exactly the way you are describing that will be relatively affordable to buy in the relatively near future. Are you an Elon hater who will never even consider any product or service from one of his companies?

youtube · AI Moral Status · 2025-08-09T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
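
The four coding dimensions above map directly onto the per-comment records in the raw response below. As a minimal sketch of that record shape in Python, assuming only the value sets that appear in this sample (the class name and the value sets are illustrative, not the pipeline's actual definitions):

```python
from dataclasses import dataclass

# Values observed in this sample; the real codebook may define more.
RESPONSIBILITY = {"none", "company", "developer", "distributed", "ai_itself"}
REASONING = {"unclear", "mixed", "consequentialist", "deontological"}
POLICY = {"none", "ban", "regulate", "liability"}
EMOTION = {"indifference", "approval", "resignation", "fear", "outrage", "mixed"}


@dataclass
class CodedComment:
    """One coded comment; field names mirror the JSON keys in the raw response."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
```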
Raw LLM Response
```json
[
{"id":"ytc_UgwSyrmk1Hv14k0dap54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzvCsfxkSjYgygfH2B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwVJPKbIJUYUf3I4Ol4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgygHJGOkrLuBDwdzrN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxMfUj3wJg5MszdQnl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwWO_aFhK3d-eIwsH14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8p7YGjp-2o9OiJK54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbB1KgFpmwm71mont4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3k-4zgcoM-5qfLvh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz1BIulg81BS8nXNet4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
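
A minimal sketch of how such a response might be parsed and a single comment looked up by ID, as the inspection view does; the `parse_raw_response` function is illustrative, and `raw_response` reproduces only the first record of the array above rather than the full output:

```python
import json

# First record from the raw response above, standing in for the full array.
raw_response = """[
  {"id": "ytc_UgwSyrmk1Hv14k0dap54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]"""


def parse_raw_response(text: str) -> dict[str, dict]:
    """Parse the model's JSON array and index each coded record by comment ID."""
    records = json.loads(text)
    expected = {"id", "responsibility", "reasoning", "policy", "emotion"}
    for rec in records:
        missing = expected - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {sorted(missing)}")
    return {rec["id"]: rec for rec in records}


coded = parse_raw_response(raw_response)
print(coded["ytc_UgwSyrmk1Hv14k0dap54AaABAg"]["emotion"])  # -> indifference
```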