Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.

Random samples
- "Point the algorithm to a cop and it will always tell you 10/10 probability to co…" (ytc_UgwkgR4k_…)
- "Hardcoding an AI to say that it is an AI is not a guaranteed fail on a Turing Te…" (ytc_UgwRXQItm…)
- "BYE I WAS USING AN AI AND DUDE I WAS USING AN OC THAT WAS A MINOR AND IS JS A SI…" (ytc_UgxSnqTyF…)
- "My partner and I trolling Andrew Tate on character AI, and then having him get m…" (ytc_Ugyg7rwYc…)
- "Let's see what you can do in 5 minutes. Write it in a comment. No libs…" (rdc_e7jc6xi)
- "I think the essence of the argument “I don’t wanna be like the people in wall-e”…" (ytc_UgwCRggt2…)
- "Hard to believe the AI is presenting these complex conundrums in such a concise …" (ytc_Ugwqtqosh…)
- "I worked at a company that trained AI, and I'm a mathematician. This video is ve…" (ytc_UgzO-RB8q…)
Comment

> 1:40 "AI's are generally not eager to cause harm, but will if it's necessary to achieve goals, protect their autonomy, and survive"
>
> Pretty sure every living thing on this planet does the same, even humans. They were trained off of us, after all.

youtube · AI Harm Incident · 2025-07-25T06:0… · ♥ 10
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyLBuYdeCxuuBHlokR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxbrB7JBFSU-p1YPg54AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwTtn2kdY-gI-CtAc14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwl4XgUoMar3Gd6Fzl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwBO8Bg1iyi30tSAel4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugx0C8d7xJsj6mmUCr54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw31tAS1ZvR5Mfsmx14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx9-1obd1zgF-_ylJd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzj4hC57AaUU0132Xx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwKFpN1r3YOl2EwAlt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
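Since the raw response is a plain JSON array with one object per coded comment, looking up the codes for a single comment is just a parse-and-index step. A minimal sketch in Python (the two rows are copied from the batch above; the variable names are illustrative, not part of the tool):

```python
import json

# Raw batch response in the format shown above: a JSON array,
# one object per coded comment, keyed by "id".
raw_response = """[
  {"id": "ytc_UgyLBuYdeCxuuBHlokR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwKFpN1r3YOl2EwAlt4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

# Index the batch by comment ID so any coded comment can be fetched directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment and read off its coded dimensions.
code = codes_by_id["ytc_UgyLBuYdeCxuuBHlokR4AaABAg"]
print(code["responsibility"], code["emotion"])  # ai_itself approval
```

The same index also makes it easy to spot IDs the model skipped: any comment ID sent in the batch but missing from `codes_by_id` was left uncoded.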