Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Random samples:

- The EU needs to start investing into European SoMe and in tandem require that p… (rdc_nydbnta)
- Elon Musk has said on many occasions we won't need to work because robots and AI… (ytc_UgxfVFXrW…)
- One of their solutions to the problem would be global catastrophic events that l… (ytr_Ugw69yiVg…)
- Yes and no, look outside... do the deer, birds, or even rabbits have rights? No.… (ytc_UgwJYiZ-M…)
- Limiting our research will do nothing to limit the research of countries like Ch… (rdc_kvdv3yz)
- As a disabled artist, the whole "AI is more accessible" thing is just ludicrous.… (ytc_Ugw_pjmxe…)
- Hinton’s hidden grief is not just that we might lose control of AI. It’s that we… (ytc_UgzRtJ-z9…)
- I don't need university it is not worthy anymore. No student loan ! Nowadays yo… (ytc_UgzrQ02jX…)
Comment
AI race would end with a big distaster just like the nuclear race did. Some country or some terror group will turn half of world's wealth to zero. Our money wont be safe online anymore. Houses wont be safe either. Such a big fraud could cause another nuclear war too.
Alternatively robot armies might destroy a city automatically. Some big disaster will definitely happen before we wake up.
Or we can learn our lessons from the nuclear round. Proactively prepare for disaster avoidance without slowing down the AI race.
Again its the Human stupidity that only listens to explosions be it online or offline.
Source: youtube | AI Responsibility | 2025-05-22T01:4… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwG6EVp0ebYHSYeEL14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzITkFaWgclkXhuiml4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyaffFFNgaInKKH4wF4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxUECCHVsaRbVF6XiB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxG18gOOlravQe2SWZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxrupSM3gL46TWvxxZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzUjlA6D0vt-8YMD694AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyL-OvW5hOZY-4itrp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyDj_c0W-_4mAiHN7V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugxkjfmbq_aYfYUXmEN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
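Each raw LLM response is a JSON array of per-comment records coding the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and tallying one dimension; the field names are taken from the output above, while the helper name `parse_codes` and the validation logic are illustrative assumptions, not part of the tool itself:

```python
import json
from collections import Counter

# Example raw response, shaped like the model output above (two records shown).
raw_response = '''
[
  {"id": "ytc_UgwG6EVp0ebYHSYeEL14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxrupSM3gL46TWvxxZ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Map comment ID -> coded dimensions, dropping malformed records."""
    coded = {}
    for rec in json.loads(raw):
        # Keep only records that carry an ID plus all four dimensions.
        if isinstance(rec, dict) and "id" in rec and all(d in rec for d in DIMENSIONS):
            coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

codes = parse_codes(raw_response)
emotion_counts = Counter(c["emotion"] for c in codes.values())
print(emotion_counts)
```

Validating every record before use matters here because LLM output is not guaranteed to be well-formed; a record missing a dimension is simply skipped rather than crashing the tally.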