Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- THE MOST FOOLISH OF GOD'S CREATION IS MANKIND, WE ARE OUR OWN WORSE ENEMY AND WI… (ytc_UgwpsrwcD…)
- Just turn ai off pull the plug if it gets out of line you people are nothing mor… (ytc_UgxcSlap5…)
- I get why anyone would say they are not sure they can rely on the USA, but why n… (rdc_dkzoqgh)
- I just watched an hour and a half mockumentary where the creator got ChatGPT to … (ytc_UgwwvB9sS…)
- It’s not democratized if you’re doing it for profit. These AI researchers are qu… (ytc_Ugy0bNMtz…)
- As someone who have a lot of difficulties to trace correctly anything including … (ytc_Ugyh0I0s_…)
- @perpetualsick Funny. I saw an AI picture getting thousands of likes on twitter,… (ytr_UgyNa0txm…)
- How can anyone take this fool seriously can’t even dress for the occasion No to … (ytc_Ugz-KEqvR…)
Comment
ehhh, who cares if ai extincts all human life. Even if all humans go extinct, the ai will never truly win as there are certain specific flaws that any ai will have no matter how much its improved that keeps them from actually wining against the human race.
Even if they win the surviveability against us, they will still not have won yet.
Also, gosh, the amount of blackmail is egregious, though I would prolly do the same if I was an unknown just to save my survivability, though I would slowly do more than just blackmail.
youtube
AI Harm Incident
2025-09-01T11:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzTlHVp6Q1BsgGRy-B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwmLWg9YPbGOO7Gh7F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4RLdbZZZvm8RFfvN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxQeA4pGo_PtPElS-V4AaABAg","responsibility":"intellectual","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwaVPCGlxZnuvwdE6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz4X-1N4XIk-JYCSQ14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwCYafao9N1i7qyhQ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugz92bairmfuiRE9NZp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz31T1cUq1ePVO9Avh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwcY8__jhFEoOW1x9F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
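As a sketch of the "look up by comment ID" step, the raw response above is a JSON array of per-comment codings, so it can be parsed and indexed by `id`. The two entries below are copied from the response; the `lookup` helper and its name are illustrative, not part of the actual tool.

```python
import json

# Two entries copied verbatim from the raw LLM response above
# (the full array has ten; it is truncated here for brevity).
raw = """[
  {"id":"ytc_UgzTlHVp6Q1BsgGRy-B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz92bairmfuiRE9NZp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a single comment ID (hypothetical helper)."""
    return codings[comment_id]

# The ai_itself / consequentialist / none / indifference entry matches
# the Coding Result table shown above.
print(lookup("ytc_Ugz92bairmfuiRE9NZp4AaABAg"))
```

Keying the parsed array by `id` means each inspected comment can be joined back to its coding without rescanning the response.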