Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples — click to inspect:

- "thats a righeous dude. smart and well spoken too. i think some robots will be ni…" (ytc_Ugz8cME4J…)
- "It’s such a shame because AI really is the most incredible tool!! But only when …" (ytc_Ugx9RSFjX…)
- "Weaponized mass Nazism is one of THE worst alignment scenarios ai safety researc…" (rdc_n22nb9t)
- "I saw a comment on a video about ai art from a teenager who said their dad asked…" (ytc_UgznNKyDq…)
- "As an artist, I think these guys are just showing off their robot fetish. My art…" (ytc_UgyZ6ZV3h…)
- "In other words, inanimate objects are more bias than humans. I can't wait for AI…" (ytc_UgwfZyFdy…)
- "Big fan of artificial intelligence. Anything that we can do to make a more solid…" (ytc_UgzfmVG2i…)
- "Good luck trying this on a kid with any problematic behaviors. This is a system …" (ytc_UgzTytAYx…)
Comment

> Palisade Research gets funded because of news like this. Any model or server is programmed to ignore such shutdown/destructive commands, like `sudo rm -rf /`. If OpenAI wants to shut a model down, it can do so in a second, even if you're running the model locally. Also, the first post is clearly just milking the news, and people in the comments are seriously discussing Terminator 🤦.

youtube · AI Governance · 2025-05-27T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxsg6MfA0zCbD4A7Vh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwA7h73MZYZzhGs3xZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwu63iPwaH814s9QcN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwhuZE66e0QAQ3Ern54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxqaEGp1JM5MCBAIqN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwM6R9gehaI0TRjzqV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw-t3FgTqOD1OPnYbZ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz9LSoHfZxzUgSa7Vd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxKGDXCUGjLqNQCCl54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwfUaQHChllG7JjHKF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
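Since the raw response is plain JSON, it can be checked before the codings are stored. Below is a minimal sketch of such a validation step; the allowed value sets and the `validate` helper are assumptions inferred from the sampled codings above, not a definitive codebook.

```python
import json

# Hypothetical allowed values per dimension, inferred from the sample
# responses shown above; the real coding scheme may define more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# A record shaped like the ones above (the id here is a placeholder).
raw = ('[{"id":"ytc_X","responsibility":"developer","reasoning":"mixed",'
       '"policy":"industry_self","emotion":"indifference"}]')
print(len(validate(raw)))  # 1
```

Validating at ingest keeps a single malformed record from silently corrupting downstream tallies of the four dimensions.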