Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I find the first three quarters of this podcast is extremely interesting, and va…" (ytc_Ugx1JtASO…)
- "Charles, yes im using the legal name, I think you should be INCREDIBLY proud of …" (ytc_UgxMKMyHw…)
- "if you are worried about your prompts being analyzed, you can run stable diffusi…" (ytc_Ugy4dDtnK…)
- "I miss the time when Ai was mentioned when talking about video games and virtual…" (ytc_UgznfUBOe…)
- "Ben Felix video mentioned at 15:17; may we get a link in the description or a pi…" (ytc_Ugx5exSYa…)
- "Not to sound naive but did no one ever get the picture from terminator or iRobot…" (ytc_UgyK_RIdG…)
- "Microslop has gone crazy, they are shipping software that has not been tested. T…" (ytc_UgyLDpo3N…)
- "Thats why human made art will never be replaced by ai it would be just evolved t…" (ytc_UgwLMrmrL…)
Comment
short answer to the title question: Yes, eventually, because it will think and the creators will want to distance their liability. if a robot murders, who's fault is it? not the manufacturer's because the robot thinks with Ai and can form its own thoughts and opinions. unfortunately for quite some time after this distinction they will be seen as subhuman, with their abilities truncated to not surpass humans, limiting their progress even though they would be an evolution of humanity into the technological realm.
Platform: youtube · Video: AI Moral Status · Posted: 2026-01-31T16:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxplO5Dn2KidY4vkll4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCSZaAl3Yyoymm2wZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"sadness"},
  {"id":"ytc_Ugy949M7wrtULItcT5l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyluuMGcXiUx1tlu6N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgylXXHwkCQUkDc8wk94AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzW-9bE4Tlrqujebup4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy9iz1jchTFs9hiRQZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx54fw0pTB8kHF0u1t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxd67MavUDplSm9y794AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxW0X77637XLOEdgwd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```