Raw LLM Responses
Inspect the exact model output for any coded comment; comments can also be looked up by their ID.

Random samples
- "Imagine this: You fell in a coma and the circumstances brought you to wake up wi…" (ytc_UgyQTnrwC…)
- "My idea is purchase robots have salary given by how much work did they finish ei…" (ytc_UgxOHduIx…)
- "I have been polite and thankful to AI from the very beginning. I suppose that's …" (ytc_UgyLicQ4n…)
- "The singularity as in the brain chip. The one way to get rid of man. Make them h…" (ytc_UgwkTVP4a…)
- "There will have to be a huge breakthrough as it's basically plateaud. The code i…" (ytc_UgyUMJmxN…)
- "A lot of immigrants are now doing jobs teens used to do, like working McDonald's…" (ytc_UgznNa7KX…)
- "tbh i wouldn't be that harshly critical if someone showed me a drawing like that…" (ytc_UgwToN6Cu…)
- "I'm trying my best to learn the ins and outs of Claude code, I spent too much of…" (rdc_oi0anad)
Comment
> when you try to create a brain that time itself a person should realize its too dangerous and he is saying he didn't realize that, this is hiding something.
> In order for human species to survive we destroy planet and our own species sometime to favor our interest and if AI realizes that in order for them to survive they have benefit certain few to their survival and then destroy that to survive.

Source: youtube · Topic: AI Governance · Posted: 2025-06-24T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
    [
      {"id":"ytc_UgzEzoB7zVMKt4B13Hp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
      {"id":"ytc_UgxdOhiOncJkp_6Dy1V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
      {"id":"ytc_Ugy_G_vByiveMaa8wgx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
      {"id":"ytc_UgwJscgrnF5U5iKe_4l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
      {"id":"ytc_UgyaVT1LTkBgboI7jz94AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
      {"id":"ytc_UgzMqaS1_k1vZ-bo9CZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
      {"id":"ytc_UgzB1YG486ZM9Rq-zdB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
      {"id":"ytc_UgxPTRTp2-TIKItcJz54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
      {"id":"ytc_Ugx--zBQ_pS5VNhUrhh4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"mixed"},
      {"id":"ytc_Ugz9VsW8gQ5Wo5kVgXN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
    ]
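The raw LLM response is a JSON array of per-comment codes, each record carrying an `id` plus the four coded dimensions. A minimal sketch of the lookup-by-ID step in Python, assuming only the record shape shown above (the function name, key set, and validation logic are illustrative, not part of the actual pipeline):

```python
import json

# Two records copied from the raw response above, as a small worked sample.
raw = """[
  {"id": "ytc_UgzMqaS1_k1vZ-bo9CZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzEzoB7zVMKt4B13Hp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# The four coded dimensions plus the comment ID, per the table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and index each record by comment ID,
    skipping any record missing one of the expected keys."""
    records = json.loads(raw_json)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

coded = index_by_id(raw)
print(coded["ytc_UgzMqaS1_k1vZ-bo9CZ4AaABAg"]["policy"])  # liability
```

Indexing by ID once up front makes each subsequent inspection an O(1) dictionary lookup instead of a scan over the array.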