Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Fuckin stupid to make a robot that can kill I hope the ones who made them are th…" (ytc_Ugwc6t-x4…)
- "Can't we just make the dangerous places safe from AI? We definetly don't want it…" (ytc_UgzewE7Uf…)
- "The thing is, AI right now is just a huge set of data. So yeah what you're afrai…" (ytr_UgxxhWmmY…)
- "🎉 The only thing needed is AI that helps people run their businesses efficiently…" (ytc_Ugyu09IOK…)
- "Don't worry. AI stuff can't be copyrighted. So I don't think studios will ever r…" (ytc_UgzeKcqOb…)
- "Although this AI Artist is just using AI he shouldnt be hated doe he isnt even a…" (ytc_UgxWwQ3lr…)
- "It's ok, nothing that humankind doesn't already do or think about one another. I…" (ytc_UgzNoB4aB…)
- "Private LLM. Done. The majority of jobs are cooked no matter what we do. We will…" (ytc_UgyT58hX_…)
Comment
I'm sorry to rain on everyone's parade about A.I and it's potential to "take over the Earth " A.I needs huge amounts of data to exist particularly if it becomes sentient, we hunans supply the energy for those data centres, we throw the switch and A.I is dead, in the human brain there is an organ the hypothalamus that controls non conscious activity heart beat etc we simply create the same thing for A.I a point that it needs to exist,one particular data centre somewhere in a deep cave underground that we can turn off if things get out of control,
Source: youtube | Cross-Cultural | 2026-01-02T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzyL9_a15qPcXxB9_l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzl7I0SW1Oep3LrJNt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzvOR7ewj-51HV4rFd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxvoID0RXdNCoJxrKB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzvWvlhg3gZsl_cR4l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyo-lCM0UUx34mLxih4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxDNNX3Ve_pJL_5rBx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzRz5EtFhsf7YhlMBp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwUTZrOIefrz7Q5NNp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzTsecNgP3XyYcSb5x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
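A batch response like the one above has to be parsed and validated before the codes are stored, since the model occasionally emits malformed records or off-scheme labels. Below is a minimal sketch of such a check; the allowed value sets are inferred only from the codes visible in this sample (the real codebook may define more categories), and `parse_coding_response` is a hypothetical helper, not part of any library.

```python
import json

# Allowed values per coding dimension, inferred from this sample batch.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "unclear", "ban", "liability"},
    "emotion": {"approval", "fear", "indifference", "outrage", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every coding
    dimension carries a value from the known scheme.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage: one well-formed record passes, an off-scheme one is dropped.
raw = (
    '[{"id":"ytc_x","responsibility":"user","reasoning":"deontological",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"ytc_y","responsibility":"aliens","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
kept = parse_coding_response(raw)
print([rec["id"] for rec in kept])
```

Dropping (rather than repairing) off-scheme records keeps the stored codes clean; rejected IDs can be re-queued for a second coding pass.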