Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I've always wondered what a self driving car would do if it couldn't see a stop …" (rdc_d1knpab)
- "When will we fucking learn don't give ai the ability to control our actions or j…" (ytc_UgzCBCYuQ…)
- "980djsk What also Ironic that the people who praise A.I "Art" doesn't throw a fi…" (ytr_UgwcHMjeC…)
- "I get that! The interaction between humans and AI can sometimes feel unsettling.…" (ytr_UgyI3YGgu…)
- "nah, the cart pusher job will be automated because instead of hiring someone to …" (ytc_Ugz_USuH1…)
- "14-APR-2024 : I calculated that in near future, after robot replaces 80% worldwi…" (ytc_UgzzYJUMV…)
- "Gemini is something different here, at least in some use cases. It's the evoluti…" (ytc_Ugw_o4ipv…)
- "This presentation does not correctly factor in future versions of Ai and incorre…" (ytc_Ugx1QQ46g…)
Comment
You shouldn't have to coerce a machine into working. That's the whole point of machines.
I found this video to be reaching a bit honestly. There are many premises that are unfounded, such as spontaneous consciousness emerging. Any sufficiently advanced AI capable of its own agency will very likely be the one in position to decide humanity's fate, rather than vice versa.
Quality animation and narration as always however, and fortunately this topic's silliness is the exception rather than the norm.
Platform: youtube
Video: AI Moral Status
Posted: 2017-02-23T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UghMMYs7vZYX_HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghA63n1SdXZhHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgivYXIAt3XfongCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgiT6JNbcZ6dEXgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugg-KXhB9mN-JHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggGfE0wA4aliHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgjrHFVNfked83gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggD-R2HrNwbjXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgjdJpJ3U7MKnXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiA7vdNWiPVbngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
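The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such output might be validated and indexed by comment ID before it reaches the results table — the dimension names come from the response above, but the allowed value sets are assumptions inferred from the values that happen to appear in it:

```python
import json

# Dimension names are taken from the raw response above; the allowed
# value sets below are ASSUMPTIONS inferred from the values seen there,
# not a confirmed codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "ban", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "outrage", "unclear"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the assumed allowed set, so malformed model output
    is caught before it is stored.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {rec.get(dim)!r}")
        # Keep only the four coding dimensions, keyed by comment ID.
        by_id[cid] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# Example with two records shaped like the response above
# (the IDs here are placeholders, not real comment IDs):
raw = ('[{"id":"ytc_a","responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"},'
       '{"id":"ytc_b","responsibility":"developer","reasoning":"deontological",'
       '"policy":"ban","emotion":"outrage"}]')
codes = index_codes(raw)
```

A record that fails validation (for example, a dimension the model left out, or a value like `"anger"` outside the assumed set) raises immediately rather than silently producing an `unclear` row in the table.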