Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ok, as for the question of how advanced AI might kill all humans. Here's the thing: I came up with a very plausible answer to this question. I was about to type it and realized that wouldn't be a good idea 😢! I'll tell you, though, another interesting question came to mind: if super AGI was going to make decisions about life on Earth, it occurred to me that removing humans is one consideration, but would Super AI consider removing all life from Earth, and why? There's a hypothesis that SAI might consider removing all oxygen from the atmosphere of the Earth, to remove oxidation/rust as an effect on a planet. Which in itself, as a human, is extremely terrifying! It seems the best bet is to hope that we can all live together!
youtube 2024-06-16T05:0…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyY_8W2NHA3-iLHw3R4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwfuLTgVUoQaHVETpd4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgycSLpuMuZzZmCN7p94AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugypybs6otRb8oPRQV54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzZzYpW2Th7hoRYqIh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugw2LjoLgsnb2Lzizz14AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugw-9tnkRZiyZm8hsSF4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwgz--wnqVXh4vrpB14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx2YKBxnZGTBPpuXhl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwx2Zz34hKV4v-oNXt4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]
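The raw response is a JSON array with one code per comment, keyed by a ytc_-prefixed id. A minimal sketch, in Python, of parsing such a response and checking each code against the dimension values seen in this record (the allowed-value sets below are inferred from this one response and may be incomplete):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from this record (assumption: may be incomplete)
ALLOWED = {
    "responsibility": {"none", "government", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every code against the schema."""
    codes = json.loads(raw)
    for row in codes:
        if not row["id"].startswith("ytc_"):
            raise ValueError(f"unexpected id: {row['id']}")
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row[dim]!r}")
    return codes

# Single-entry example taken from the response above
raw = ('[{"id":"ytc_Ugypybs6otRb8oPRQV54AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = validate_codes(raw)
print(Counter(c["emotion"] for c in codes))  # Counter({'fear': 1})
```

Validating every row before ingesting it catches the common failure mode where the model emits a label outside the codebook; a `ValueError` here is cheaper than a silently mis-coded dimension downstream.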