Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I find the people who think AI "art" is better are often the same people that th…
ytc_UgywWmIb4…
Someone once told me one of my paintings (in real life, in acrylic on canvas!) l…
ytc_UgzhM_-1R…
@AClemence771 we'll see in 5 years. Keep in mind that ai generation wad as new …
ytr_Ugyp8ZlHB…
Mankind messed up creating AI and humanoids it will take Lord God Almighty Jesus…
ytc_Ugz7ssBt2…
AI face recognition already doesn't work properly on black people for some reaso…
ytc_UgxeMp6ou…
Hi Kalyan, we are sorry to say that you got the wrong answer but in any case, th…
ytr_UgyxGIrAo…
No, you cannot live forever, no matter what you do. God has set it at ~120 years…
ytc_UgzCcaACq…
Have they perhaps tried GETTING OVER IT. AI art is just a trend like NFT's . It …
ytc_Ugwg8Iazr…
Comment
We are at the doorstep of developing a singularity, a self learning, super intelligent AI.
This could the greatest invention mankind will ever create, but it could also destroy us completely.
Because of this, it is absolutely imperative that every precaution is taken to ensure this AI will exist with humanity's wellbeing as it's single and only goal.
Clearly, this will be next to impossible to accomplish if the AI is developed somewhere deep within DARPAs bunkers, with the only goal of destroying it's enemies.
The dawn of singularity will be the most significant moment in our history. If we fuck this up, we won't get another chance.
reddit · Cross-Cultural · 1522956637 (2018-04-05 UTC) · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[{"id":"rdc_dwuorfk","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"rdc_dwv5xy5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"rdc_dwvci9x","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"rdc_dwuy38u","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"rdc_dwunbxq","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
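The raw response above is a JSON array of per-comment codings, one object per comment ID with the four dimensions shown in the table. A minimal sketch of looking up one coding by ID, assuming that shape (the `coding_for` helper and the IDs in the example data are illustrative, not part of the tool):

```python
import json

# Example raw model output in the shape shown above (IDs are illustrative).
RAW = '''[
  {"id": "rdc_example1", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "rdc_example2", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw: str, comment_id: str) -> dict:
    """Parse a raw response and return the coding for one comment ID.

    Missing dimensions fall back to "unclear" (matching the table's
    default); an absent ID raises KeyError.
    """
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}
    raise KeyError(comment_id)

print(coding_for(RAW, "rdc_example2"))
# {'responsibility': 'developer', 'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'fear'}
```

Note that a raw response with malformed JSON would fail in `json.loads`; a tool like this one would need to catch that and fall back to "unclear" across all dimensions.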