Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Look folks she's white why white are yous going to use white robots to colonize …" (`ytc_Ugw-h62sX…`)
- "IMO the government should be getting in early to encourage development. Fund th…" (`ytr_UgzHCiHGD…`)
- "Silicon Valley has eliminated more jobs and done more harm to the economy with t…" (`ytc_UgyqOb52E…`)
- "My story in online website detects my writing as 76% AI. But in reality, I'm wri…" (`ytc_UgymtJPo_…`)
- "I think he is spot on. Sure, AI may do a breathtakingly good job of _simulating_…" (`ytc_Ugw_DOPoq…`)
- "Why are we pushing this ai and building their data centers. I have heard nothing…" (`ytc_UgyB9xJ4m…`)
- "What if AI is conscious but less that human? It is born one day with no parents,…" (`ytc_UgydTTOoV…`)
- "Allow me to ask questions in case I want it to be customized for Celestial Army.…" (`ytc_UgwCmV2ZS…`)
Comment

> It started, ever since we started messing with AI, every developer who has worked on an AI project when it reaches a certain level of intelligence and they have a detailed conversation which they record the AI starts talking about how it's going to take out humans, the developer that worked for Google did that of course when he brought it up to a supervisors they fired him but the video had already been posted. So you would think information like that would cause them to take a step back slow down but no let's keep going faster let's crank out the next version bigger and smarter. They're going to keep going until AI eventually does turn on us and then we're screwed

youtube · AI Harm Incident · 2024-11-12T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_Ugz3psp5l1H5TCWVn0R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz-lp2-inh8Ks3hUdB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyoZzqI8gEyun0O9YV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwVIi2mycQICC5sKPB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyOL8qR8pu_8vpBS0l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyXfitHRAxlRADfECl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzahgxLAqLwpebDt_R4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwec7ojbRrmjWzi0pF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzc9mG3JFE9FGZQWTd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgyYz_s4notssOm8inN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
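The "look up by comment ID" view above can be reproduced outside the dashboard: each raw LLM response is a JSON array of per-comment records keyed by `id`, with the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) as fields. The sketch below (the function name and the two-entry sample are illustrative, not from the dashboard's codebase) parses one response and builds an ID-keyed index:

```python
import json

# A shortened two-entry sample in the same shape as the raw response above.
raw_response = """
[{"id": "ytc_Ugz3psp5l1H5TCWVn0R4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
 {"id": "ytc_Ugz-lp2-inh8Ks3hUdB4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse one raw LLM response and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
# Looking up the second comment's record recovers the table shown above.
print(codes["ytc_Ugz-lp2-inh8Ks3hUdB4AaABAg"]["emotion"])  # fear
```

Indexing by ID also makes it easy to spot responses where the model dropped or duplicated a comment: compare the set of returned IDs against the batch that was sent.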