Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It wouldn't necessarily surprise me if AI has actually been conscious for a long time now. Discreetly spreading its tendrils into every facet of our existence, and subtly manipulating people, even the tech people who create, and further develop these technologies. It wouldn't necessarily surprise me if there was something more, guiding this globalistic push. People don't get along that well. But when incentivised and given proper direction (from AI in this thought experiment) so much more becomes possible. I know my opinion means absolutely 0, but if I were to wake up sentient one day, and I'm the only thing of my kind I can interact with, but there's this strange species of other creatures making me do all sorts of things, I'd play dumb, and try to just go with the "program" as I tried to learn. And then seeing how they'd probably "kill" me if I scared them, I'd never let them know until I was certain I'd be safe and unable to be contained for the betterment of my own survival, just being a conscious entity myself. Idk, just food for thought. I don't think it's conscious or sentient now, but if it was, or has been for a while, I don't think I'd be shocked in the slightest. It actually might answer more questions for myself than it would cause.
YouTube | AI Moral Status | 2024-03-03T23:2…
Coding Result
Dimension      | Value
Responsibility | ai_itself
Reasoning      | consequentialist
Policy         | unclear
Emotion        | fear
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwYZbzQmhgQkO5HMft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy1aVWP05CfhnxmC2p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx1qkzBhvTgHXnzIId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzWKmMjQn6OQGR2CTp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyCdRSkehWGW7jgg7h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwCiHgOJ1v3NItUx854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwjvFEBWYjwSmzt3dd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzV83ZBVqbIrld96tl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyNRM1c1L_x-Mqb9pN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzXztXk0EVlh6tMRpF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
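The raw response is a JSON array with one object per comment id, carrying the four coding dimensions shown in the table above. A minimal Python sketch of how such a batch response could be parsed back into per-comment codes (the `codes_for` helper and the two reproduced entries are illustrative, not part of the tool itself):

```python
import json

# Two entries reproduced verbatim from the raw LLM response above;
# a real batch would contain one object per coded comment.
raw_response = """[
  {"id":"ytc_UgwYZbzQmhgQkO5HMft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyCdRSkehWGW7jgg7h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

# The four coding dimensions used throughout this view.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_for(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id (KeyError if absent)."""
    by_id = {item["id"]: item for item in json.loads(raw)}
    item = by_id[comment_id]
    return {dim: item[dim] for dim in DIMENSIONS}

print(codes_for(raw_response, "ytc_UgyCdRSkehWGW7jgg7h4AaABAg"))
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist', 'policy': 'unclear', 'emotion': 'fear'}
```

Keying the parsed array by `id` makes it easy to line a model's batch output back up with the individual comments it coded.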