Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Umm what if the person that got the parsell from the robot steals the robot 😅😅😅…" (`ytc_UgzHLrkYU…`)
- "How little hardware/systems would one side need with AI need to beat the othersi…" (`ytc_UgzQMH0Y8…`)
- "Thats a new thing that started in the 1980's. It was to maintain breeding, back …" (`rdc_cdlzp7a`)
- "AI doesn't think, humans do. AI is very good at retrieving facts but is incapab…" (`rdc_n7tda50`)
- "Those who own the machieans will be richer then gods. Those who don't will die. …" (`ytc_UgwDZ-Phi…`)
- "AI-Designed Materials / AI now designs matter itself. / Instead of: / “Let’s test ma…" (`ytc_Ugyiw7sad…`)
- "this is what people miss in the AI-will-replace-jobs debate. the tools raise the…" (`rdc_oag5let`)
- "And because this channel just couldn't resist, we have an anti-Israel, anti-Jewi…" (`ytc_UgxuH2K-I…`)
Comment
for creating a robot there should be restriction such as they should never cross our command.one day they
Can be more advance than us but there will be singularity,many jobs will replace.
There can be human extinction because of robots,they want to rule but if we can't control them it can be a problem because it's AI it can think on its own
youtube · AI Moral Status · 2020-09-16T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzmq9GCKTbJGE5qf5Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyF0PQvIdCf8OmQsld4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwxv8vRYLEqBfzibBV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxy_r9tKM71kMlCfjl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzgLu63oqjaaYoTP6t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzo55Z9NljII5HswId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzzrf-DwsqccnC3USN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyLMznhgI9pjSkSuqx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXp6lWcLmthO2ggLF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugweb0APd_1eR9z8MkB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
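The raw response above is a JSON array with one record per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the lookup-by-comment-ID step, assuming the response parses cleanly as JSON (the helper name `index_by_comment_id` is illustrative, not part of the pipeline):

```python
import json

def index_by_comment_id(raw_response: str) -> dict:
    """Parse the model's JSON array and key each coding record by comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# Abbreviated example using one record from the raw response above.
raw = '''[
  {"id": "ytc_Ugwxv8vRYLEqBfzibBV4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]'''

coded = index_by_comment_id(raw)
rec = coded["ytc_Ugwxv8vRYLEqBfzibBV4AaABAg"]
# rec["policy"] is "regulate", rec["emotion"] is "fear",
# matching the Coding Result table shown above.
```

The dict keyed by ID makes the "Look up by comment ID" view a constant-time fetch rather than a scan over every batch response.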