Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Will you make another vid addressing the potential consequences of a singularity? A far scarier discussion, it would fit in great with your other doomsday vids! When AI surpasses the intelligence of humans and can start engineering themselves without needing us, the rate of technological advances and ai-evolution would explode. Sam Harris has a real scary TED talk on this for those interested!
Source: youtube · Video: AI Moral Status · 2017-02-23T18:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgjfVoL_clccOHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgiPI-YOPMt3eXgCoAEC", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UghKTXEJdE2k03gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgjzKBW0d4zvsngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggzWaALjepZ8HgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugi1-8Q9o8b7SHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UghZpuKPn1eld3gCoAEC", "responsibility": "none", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgizDdmtVR9s7HgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugh9tM2DGn-Y5XgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Uggry-BHMQAuF3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
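The raw response is a JSON array with one coding object per comment, keyed by comment `id`. A minimal sketch of how the coding for one comment might be extracted and sanity-checked (the function name is illustrative and the raw string is truncated to three entries; the field names come from the response above):

```python
import json

# Raw LLM response: a JSON array with one coding object per comment
# (truncated to three of the ten entries for illustration).
raw_response = '''
[
  {"id": "ytc_UgjfVoL_clccOHgCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghKTXEJdE2k03gCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Uggry-BHMQAuF3gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
'''

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coding object for one comment id, or raise if absent/incomplete."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            # Sanity check: every expected dimension must be present.
            missing = [d for d in DIMENSIONS if d not in entry]
            if missing:
                raise ValueError(f"{comment_id} missing dimensions: {missing}")
            return entry
    raise KeyError(comment_id)

coding = coding_for(raw_response, "ytc_UghKTXEJdE2k03gCoAEC")
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

This matches the coding result shown above: the singularity comment (`ytc_UghKTXEJdE2k03gCoAEC`) is coded as responsibility `ai_itself`, reasoning `consequentialist`, policy `unclear`, emotion `fear`.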