Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Tesla robotaxi needs to learn some hand gestures when trying to pick up passenge…" (`ytc_Ugz4jjL80…`)
- "There's some great advice in here! Thanks for sharing it. I'm a long-time hobby…" (`ytc_UgzhWLdpX…`)
- "Legal A.I. makes no sense. You'd have to program it to be immoral and train it …" (`ytc_Ugzr_idjC…`)
- "So I work on machine learning stuff and have delved into NLP a bit. The way we t…" (`ytc_Ugyf0IgGH…`)
- "Who else hurd her saybe carful or-u will get the robot at the election part of t…" (`ytc_Ugxb3pdiz…`)
- "In America we like to take others with us, so the mass shooting rate has gone up…" (`rdc_gsorbhe`)
- "Self driving car is just bad idea , never ever agree with this kind of thing…" (`ytc_UgyYpk1AW…`)
- "I have had my designs and images used by artists who wanted to better their own …" (`ytc_Ugyacjx79…`)
Comment (quoted verbatim, including the author's original spelling):

> the war part is not scary its logical
> if someone designs a AI with the directive to kill all humans then it will go out and kill all humans for that is the directive given to it
> but if its NOT the directive given to it it will in turn no do it
> it will not do it spontaneously like how a human could if they flipped out fx.

Source: youtube · Video: AI Moral Status · Posted: 2023-06-11T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugwa3aAYDAFCSlfB-9d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwQ4UQYD6hy1WllPH94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBK_40Royzb7fW5Et4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwH7Vbtdcp1dlUajKB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJcnbQh3tsg9UiHJB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-z-ROISdtZa60iPB4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_UgyBOQCfoXwdSA1NHl94AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQ0Vhe6b09_fq6iO94AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzbz6IBIR4Ml3pIK_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzrab3sLVV0xATYpwB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
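The raw LLM response above is a JSON array of per-comment records, one object per coded comment. A minimal sketch of how such a response could be parsed and looked up by comment ID (the `index_by_id` helper and the two-record sample payload are illustrative, not part of the tool itself; the field names match the records shown above):

```python
import json

# Illustrative sample of a raw LLM response: a JSON array of coded
# comments, using two records copied from the response shown above.
raw_response = """
[
 {"id":"ytc_UgwQ4UQYD6hy1WllPH94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugzbz6IBIR4Ml3pIK_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index its coded records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
# Looking up the comment shown in the Coding Result table above:
print(codes["ytc_UgwQ4UQYD6hy1WllPH94AaABAg"]["responsibility"])  # developer
```

In practice a response may fail to parse (truncated output, stray prose around the array), so a real pipeline would wrap `json.loads` in error handling and validate that each record carries all four coding dimensions before indexing it.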