Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgyxMGNT1…`: Ask any IT company employee — people are working 12+ hours a day just to hold on…
- `ytc_UgykbmETz…`: It's the CEO of an AI company, of course he doesn't want us to regulate AI it wo…
- `ytc_Ugw1XoreP…`: Ik always was polite to AI. Lately, AI is responding with something like; I can…
- `ytc_UgxHVYaET…`: I just want people to actually think n look….a child sitting in a box talking to…
- `ytc_UgyZnO2-B…`: Robot gps or Star Link signal - suddenly glitches, robot turns 90 degrees to th…
- `ytc_UgyayfMN9…`: People don’t agree on anything / So who dominates > the company? Is AI a Capitali…
- `ytc_Ugz3N_zoQ…`: i go on facebook for 2 dang seconds to see why tf these random people are friend…
- `ytc_Ugz0T-N6D…`: asking the wrong questions, the main question, what happens to humans when AI an…
Comment
> Still don't understand why AI would have any reason to turn against whole humanity
> Why would it bother with us? Who says it would kill everyone of us and bit go for specific humans?
> What I mean with this is that with its intelligence it would realise how the world and the democratic system works and thus target only the corrupted motherfuckers who started the war in the first place
> When you for instance see two cats, ones super aggressive and would claw you to bits and then there's the other calmed one, friendly even, would you see it as a threat?
> So basically we don't know how exactly a super intelligence would think because we quite literally can not comprehend it
Source: youtube
Posted: 2018-04-12T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwIm_dRZr0_Uimi2kl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwhXV0iHf7A42bhkEB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzvlvnvgpcvgB4AXSx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyY9zLNl7E20JzEDnN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzg8jTERfTec0xqbuV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyDGW-tG_p8s-_gQQd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXvz05HiU8QWSFG9N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy26yf4GyBFElYfbsJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgymSMdpAgNXxQ1TLP14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1-aTeed8ThuJ-sJZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
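Each raw response is a JSON array with one object per comment, carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) keyed by the comment `id`. A minimal sketch, assuming this shape, of how such a batch could be parsed, validated, and indexed for lookup — the helper name `index_by_id` is illustrative, not part of any tool shown here:

```python
import json

# Two rows copied verbatim from the batch response above.
raw = '''
[
{"id":"ytc_UgyY9zLNl7E20JzEDnN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzg8jTERfTec0xqbuV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
'''

# Every row must carry the comment ID plus all four coding dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and index its rows by comment ID,
    rejecting rows that are missing any coding dimension."""
    rows = json.loads(raw_json)
    indexed = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing keys: {missing}")
        indexed[row["id"]] = row
    return indexed

codes = index_by_id(raw)
print(codes["ytc_UgyY9zLNl7E20JzEDnN4AaABAg"]["responsibility"])  # → ai_itself
```

Looking up `ytc_UgyY9zLNl7E20JzEDnN4AaABAg` reproduces the Coding Result table above (responsibility `ai_itself`, reasoning `mixed`, policy `none`, emotion `mixed`).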