Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
02:50 ... and this is why AI system operators (the companies that run them) need to have a separate server on a completely different LAN than the AI engine... and use more and more physical media (handwritten letters using copy blue pencils - so a scan of the letter will show a blank page). People need to keep certain messages totally separate from AI knowledge. You're going to wipe the AI or reset it? Have that conversation in a sterile room (no mics, no computers and definitely don't send the instructions via text, email or video call)... I'm not a computer wiz, I'm only a disabled mechanical engineer. BUT, it's totally obvious to understand that some aspects of running an AI company need to be handled in a sterile manner. You absolutely cannot let the AI know you're planning on doing something to its programming. And for goodness sake, keep your office romance off the web or intranet. You have AI listening to everything. I cannot believe people in tech are this ignorant of something totally clear to those of us who saw the original "Terminator" where Skynet decided to glass earth in self-defense from people shutting it down. Certainly these AI engineers have understood the message behind the "Terminator" movies.
Source: YouTube · AI Harm Incident · 2025-07-24T04:0… · ♥ 3
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgySVjLgrCjw2wBP1oN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzrvbgt-ZpBVT5rYGt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyX8XzG1w97sA75UoN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugydqv2MKGIfQ9coz6h4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyk4lE_ABOBvz23oxh4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzTZ0KZHRVuFgcsWyF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwF_q3jbGjDihyvvtt4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxKJYdmWi_U_V8eQ7N4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxy7FzC3-ULGs2_e7d4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyQoMWkS-704qCgKiZ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
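The coding table for the comment above is the row of this batch response whose `id` matches the comment. A minimal sketch of that lookup, assuming the raw response is valid JSON as shown (the helper name `coding_for` is illustrative, not part of the pipeline):

```python
import json

# A two-row excerpt of the raw batch response shown above; a real
# response would contain one object per coded comment.
raw_response = (
    '[{"id": "ytc_Ugzrvbgt-ZpBVT5rYGt4AaABAg", "responsibility": "company",'
    ' "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},'
    ' {"id": "ytc_UgyX8XzG1w97sA75UoN4AaABAg", "responsibility": "none",'
    ' "reasoning": "mixed", "policy": "none", "emotion": "approval"}]'
)

def coding_for(comment_id, raw):
    """Return the coding dict whose "id" matches, or None if absent."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            return row
    return None

row = coding_for("ytc_Ugzrvbgt-ZpBVT5rYGt4AaABAg", raw_response)
print(row["emotion"])  # fear
```

The displayed "Coded at" timestamp is recorded by the pipeline at parse time, not returned by the model, which is why it does not appear in the raw response.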