Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- AI is not the magic bullet many think it is. But it can help.. monitor & moderat… (ytc_UgyAg691D…)
- I think the biggest disingenious thing here is that self driving cars would be l… (ytc_UgzxaJowp…)
- Actually I prefer this model, AI led and assisted model than hiring and firing c… (ytc_Ugy85Y8cw…)
- Black Jitsu if it's a self learning robot I'm sure It can emulate to perfection … (ytr_UgjWtK98d…)
- "sexual orientation is strongly correlated with certain characteristics of a soc… (ytc_UgwIFEfyA…)
- A recent AI robot tried to copy it's information to an outside server, then lied… (ytc_Ugw6Od6z9…)
- It’s because they’re jealous, untalented hacks, and this “tool” lets them shortc… (ytc_Ugxeyqr8O…)
- There is at least one good reason only a few compare results to doctors' diagnos… (rdc_f1echk6)
Comment
might be that we don't have a 'conscience' and we are just robots, because conscience feels like YOU make the decisions. However, this also seems impossible, because if we did, what makes us have that decision? If it's nothing, then that means opinions are made out of nothing. But if it's entirely based on memory and health and the surroundings and programming, then that makes us a robot. This would mean that 'free will' is an illusion, everything you do is from your surroundings, and that we technically, by accident, already created a conscience: computers, which we don't respect like conscience but which are still a form of it. What if we are the same as the computer, because how on earth can a decision be made out of 'opinion' and nothing without breaking any laws so far?
youtube · AI Moral Status · 2017-02-25T09:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgjR2zO_1LwfgXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggOs3HwjLeo6HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggzjEvQA-SVuHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ughj52dn57v5_XgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UghQ9UQVYlM32ngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjIXkiz05yonXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UghVxTy-agwO-HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghVIe6nF4TwM3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugh_UzizPwht13gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugjn9CpVjJQB5XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
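A raw response like the one above is a JSON array of per-comment codes, one object per comment with the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). As a minimal sketch of how such output might be parsed and tallied downstream (Python; the `tally` helper and the two sample records are illustrative, not part of the tool):

```python
import json
from collections import Counter

# Two sample records in the same shape as the raw LLM response above.
raw = '''[
{"id":"ytc_UgjR2zO_1LwfgXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghVxTy-agwO-HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally(codes):
    """Count how often each value appears per coding dimension.

    Missing dimensions fall back to "unclear", mirroring the
    "unclear" values seen in the coding-result table.
    """
    counts = {dim: Counter() for dim in DIMENSIONS}
    for code in codes:
        for dim in DIMENSIONS:
            counts[dim][code.get(dim, "unclear")] += 1
    return counts

codes = json.loads(raw)
counts = tally(codes)
```

With the two sample records, `counts["emotion"]` holds one `"mixed"` and one `"fear"` code; a malformed response (such as a stray closing parenthesis instead of `]`) would raise `json.JSONDecodeError` at the `json.loads` step.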