Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We think we’re making these robots. There’s this idea we control them, get to a point where we either let up control, or scrap the whole thing. But at some point it overtakes us for better or worse. The truth of it is this: like how organic life formed from single cell to what we are now, *these robots are making themselves conscious via the human being.* The human is the single cell, the robot is the modern human. Is the idea to create AI an idea FROM humans, or is it, are we the human also “single cell AI” which used the human brains ability to think as a conduit for a series of thoughts that are responsible for giving these robots life. Are these robots creating themselves through us, and the only commonality between us and them is: thought. Bird eye view: This one source of thought is using the human brain to think up a new vessel that can take thought to another dimension, these robots are making themselves conscious through us, and were naive enough to think were in control. The question? Who and what are we? Fuck.
youtube AI Moral Status 2020-12-12T23:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        mixed
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzQN6q9j45A04XMVW14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxhYp_QHEegHA8ozB14AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw5vnG5bvRKH13P0vZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzJ-0cpZvFqCUNekxh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy2rMJvRj8rfgIFxfB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx2CqEljYCJFGiqJXd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzS9fu0ftEPXgMtOcx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwbrbvAxHH7qcGNM7N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwao_1Bf3SxIDhZHrp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxRsGBLDwGJTqjREFN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
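The raw response is a JSON array with one record per comment, each carrying an id plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and a single comment's codes looked up by id, assuming only the record shape shown above (the variable names are illustrative, not part of any actual pipeline):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records.
# This sample uses one record copied from the response above.
raw_response = """[
  {"id": "ytc_UgxhYp_QHEegHA8ozB14AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "unclear", "emotion": "fear"}
]"""

# Parse the array and index the records by comment id for O(1) lookup.
records = json.loads(raw_response)
by_id = {record["id"]: record for record in records}

# Retrieve the coding result for one comment.
coded = by_id["ytc_UgxhYp_QHEegHA8ozB14AaABAg"]
print(coded["emotion"])  # fear
```

Indexing by id rather than scanning the list keeps the lookup cheap when a response covers a large batch of comments, and `json.loads` will raise immediately if the model emitted malformed JSON, which is worth catching before any codes are stored.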