Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- For all the innocent lives being ruined by biased judges i am all for AI replaci… (ytc_UgzM48nRu…)
- Maybe you can invest in robots that do the work you used to do. You pay a fee f… (ytc_Ugw63Vs_Q…)
- Alex sounds like a parent after a parent teacher meeting... I can feel ChatGPTev… (ytc_UgwKD6Wqv…)
- LLMs can produce an output that's not average, that's why telling them to preten… (ytc_UgzDoruIK…)
- That's not the robot attacking. that's the programer making it go crazy with eit… (ytc_UgxZZS0fA…)
- 1:56 This is an extremely incorrect view. Yes, the LLMs predict the "next token… (ytc_UgzdxU1GZ…)
- The problem is not the tech, it's the people in charge, you think grok turned r… (ytc_Ugw2Lqvuf…)
- You are brilliant and I love you! Your art is gorgeous and better because of the… (ytc_UgwGT_UVj…)
Comment
The Three laws of Robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Source: youtube | Video: AI Moral Status | Posted: 2023-03-04T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugz7WqH3Ox0q6-syxwF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw1P5t9cbHZpWDjlbN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyeUB1VXuaVE8vWW1J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQxF0tDC-BT89UArx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyqJJNUTOksC7hKqV14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzxLzjzbnFp4rb92Rt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwlnFBeM_5uCTkXvaZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxQtZ0X_BxgkHDW7dV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwm0S-UkHceHsXR_mZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgyYZl3Uvui7XBORhn54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
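The raw response is a JSON array with one object per comment, carrying the same four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and validated before display, assuming the allowed category values are those seen in the samples on this page (the real codebook may define more):

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from the
# sample output on this page; the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "fear", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID.

    Raises ValueError if a row is missing a dimension or uses an
    unrecognized category, so malformed model output fails loudly
    instead of silently reaching the dashboard.
    """
    out = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim} value {row.get(dim)!r}"
                )
        out[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return out
```

With the response above, a lookup such as `index_codings(raw)["ytc_UgyqJJNUTOksC7hKqV14AaABAg"]` returns the record rendered in the Coding Result table (responsibility `ai_itself`, reasoning `deontological`).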