# Raw LLM Responses

Inspect the exact model output for any coded comment, or look one up by comment ID.

## Random samples
- `ytc_UgwjOcfjy…` — "let me get this straight, you lost your job to AI because they wanted to use AI …"
- `ytc_UgyX4QH3x…` — "I once had ChatGPT get SUPER defensive when I asked for a list of Republican pol…"
- `ytr_Ugym__wOI…` — "better? I think you mean literally worse in every way. you ever asked ai to draw…"
- `rdc_nclypwl` — "That process started decades ago now. One only needs to look at the charts to s…"
- `ytc_UgxeqSnUA…` — "tried a few ai tools for coding, they help, but debugging is still on you…"
- `ytr_Ugiew_Ebk…` — "iwasfrancisd Yeah, because that would help them. You can't wear the mask all day…"
- `ytc_UgwP85zpe…` — "The reason i kind of like the whole AI Art thing is that you can get an inspirat…"
- `ytc_UgykO8k4S…` — "I think this happened because the ai wasnt faniliar with sth like this because i…"
## Comment
The issue is, once an AGI has a goal, it will do whatever it possibly can to achieve that goal. Thus, if it knows someone wants to deactivate it, it will do anything it can to prevent deactivation, simply because that would mean it would not be able to achieve it's goal. The logic here is very simple. This is precisely why researchers and developers are worried about an AI doomsday.
youtube · AI Harm Incident · 2025-07-23T18:3… · ♥ 263
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response
```json
[
{"id":"ytc_UgxkM0IHd5vmsuy4a0d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzj-_RIxLnlmiEIXVN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQfGzEvriYzQ992wl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxFFhbuPg7pzvq2ReN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1g7e4MWS_bnGf1X54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx9dFA9oJUsSA3FfZl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgywGzUlMgJFYEAZz4t4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwG-aGds1kG4-szlql4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxlAH1eK1vHw8poWyx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzAFidDeLFsg4jdmep4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
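Because the raw response is a plain JSON array of per-comment codes, the comment-ID lookup above can be implemented by parsing it once and indexing by `id`. The sketch below is a minimal illustration, not the tool's actual code; the field names and the two sample records are copied from the response shown above.

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# The two records below are copied verbatim from the response above.
raw_response = """
[
 {"id":"ytc_UgxkM0IHd5vmsuy4a0d4AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxlAH1eK1vHw8poWyx4AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Parse once, then index every record by its comment ID so a single
# comment's codes can be fetched in O(1).
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coded comment by its full ID.
code = codes_by_id["ytc_UgxlAH1eK1vHw8poWyx4AaABAg"]
print(code["policy"], code["emotion"])  # → regulate fear
```

Indexing by ID also makes it easy to cross-check a record against the rendered Coding Result table, since both are keyed by the same comment ID.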