Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I guess not, if you consider rape and someone being called a homophobic slur the…" (rdc_cdlxxow)
- "I’m not sure if this will help that much, as I don’t know your friend, but if th…" (ytr_UgyNHzxFx…)
- "AI is potentially more dangerous than cloning. Cloning is legally regulated so A…" (ytc_UgxCVDFqc…)
- "So the deep fakes with Obama, Biden, and Trump on gaming is OK why exactly? The…" (ytc_UgyGaleXz…)
- "We could just build more trains. Those are self driving and have worked for >100…" (ytr_Ugwy2KMFU…)
- "“Without their knowledge” Good morning everyone this has already started/has bee…" (ytc_Ugxi7h-dv…)
- "Society in general must change with the introduction of AI into the workforce. C…" (ytc_UgwY5FH9E…)
- "I heard Grok went insane because Elon tampered with it to make it less \"woke\". I…" (ytc_Ugxxapv-_…)
Comment

> It's only a matter of time before the ratio of intelligence of a human compared to a snail will be the ration of AI to a human. The technology will, negatively or positively, have enormous impact on our lives. And even when we find out it will be mostly negative, the development will not stop, because it can be used for warfare and defeating others. And humans are so stupid to arm themselves with weapons of mass destruction that can destroy the whole of humanity ten times over. So our snail brains will likely push it further, no matter the costs.

youtube · AI Governance · 2023-05-04T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[{"id":"ytc_Ugw4NLFUyPDZ8zaunP14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwamWpQmxCsM6FbbkN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzifgaD7dv2EIG1_s14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxLJ8o4ePxUAvxZRHF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxZFj8y5_76YsG1YV54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8t9TCIjrLUMJXXfh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx2b6VEJALCRag5zep4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWQYGA4E0BZs9hU4N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwsRJLTtL-K0L3Xbtp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKp969AhBYCGI8psR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```
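A response in this shape can be parsed and sanity-checked before the dimensions are tabulated. The sketch below is a minimal example: the allowed values per dimension are inferred only from the responses shown on this page, so the real codebook almost certainly defines more categories.

```python
import json

# Allowed values per dimension, inferred from the sample responses above;
# assumption: the actual coding scheme may include additional categories.
SCHEMA = {
    "responsibility": {"distributed", "none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "industry_self"},
    "emotion": {"fear", "resignation", "approval", "indifference"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Usage with the first record from the response above:
raw = ('[{"id":"ytc_Ugw4NLFUyPDZ8zaunP14AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
records = parse_coding(raw)
print(records[0]["emotion"])  # fear
```

Records that fail validation raise immediately, so a malformed or out-of-scheme model output is caught before it reaches the coded dataset.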