Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Personally I say no that's just because robots with consciousness or not are they able to die forever or not. A Humans limited time in existence and the fact we share that with everyone made us come to the conclusion that this time is precious and everyone deserves to live it at its best hence human rights. But if conscious robot can live forever copying its brain and able to adjust its memory won't really be bothered by the rights of humans and how they are structured. It will probably demand its own rights that would probably be meaningless or dont apply to us human.
Source: youtube · Video: AI Moral Status · Posted: 2021-09-15T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugyf_bO2asv8kWUMooJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxoGizrSJGd69Ww7SR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwK5pIhNNLTEtTuZvx4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxQiDR9UUzI6k8qz714AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyplfPqxUg_by6pEHZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyS8oRGGUUlr7EZind4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxT7om6Gyox59JIqtF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz23-ZDpdBM6tF5-BV4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyYrkufeBe94yC6GM14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxleEvsw4SSYqmr_YF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"})
```
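Note that the raw response above closes with `)` rather than `]`, which standard JSON parsing rejects. If the coding pipeline parses batch responses with `json.loads` and falls back to "unclear" when parsing fails, that would account for every dimension in the coding result table reading "unclear". A minimal sketch of such a defensive parser, assuming that fallback behaviour (the function name and return shape are illustrative, not the pipeline's actual code):

```python
import json

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch_response(raw: str) -> dict:
    """Map comment ID -> coded dimensions; empty dict if the JSON is malformed.

    A truncated or mis-terminated batch (e.g. ending in ')' instead of ']')
    raises JSONDecodeError, so every comment in that batch stays uncoded
    and would be displayed as "unclear" downstream.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
        if isinstance(row, dict) and "id" in row
    }

# Well-formed batch: parses into a lookup table.
ok = parse_batch_response(
    '[{"id":"ytc_example","responsibility":"developer",'
    '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]'
)

# Batch with the ')' terminator seen above: parse fails, nothing is coded.
bad = parse_batch_response('[{"id":"ytc_example","policy":"regulate"})')
```

With this structure, a single malformed character invalidates the whole batch rather than individual rows, which is consistent with one comment showing "unclear" on all four dimensions at once.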