Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I’m also pretty sure we go from baby to adult super smart old people ai and us h…" (`ytc_UgzipBayO…`)
- "It’s all well and good to say that employment is the worst way to make money and…" (`ytc_UgxUQ-Bct…`)
- "Shoot the glass, you stupid robot 😂😂😂😂😂. Well, mo common sense and terrible accu…" (`ytc_UgzwlZ4S6…`)
- "🤣 Actually they should have seen it coming. A while ago, a bunch of companies we…" (`ytr_UgzlmHUmA…`)
- "If power goes up everything else should go up in price. Alot of the power grid…" (`ytr_Ugy5lR7gz…`)
- "AI is essentially a \"Trust Me bro\" industry, and American taxpayers are footing…" (`ytc_Ugy056FzT…`)
- "No matter who wins, we all lose. Yet we continue to let this spiral beyond contr…" (`ytc_UgxJQWR7Z…`)
- "they wont be able to replace drivers that carry heavy loads. whats the robot gon…" (`ytc_Ugzuj-nro…`)
Comment

> Irrespective of the immediate economic devastation AI will cause, the environmental impact in the medium to longer term will be absolutely catastrophic. As food and water shortages increase, and average temperatures become increasingly unbearable, millions of people will migrate and this will further exacerbate already bubbling political tensions in more developed countries. If a global agreement on regulating and limiting the use of AI isn't reached soon, and I mean in the next few years, widespread collapse will happen in the next 20 to 30 years. These tech billionaires are not our saviours; they are going to get a lot of people killed. We should all be terrified.

Source: youtube · AI Jobs · 2025-11-04T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyn_dloanOgQO47FO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuycdCQdgc_mW9h8Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy54v2GPU9dIF3AEAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSM2Lh63SAk21dCo94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzvjF-6hjLiopzYup94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxoo0EeCYLh-DmsKkx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyGfYd_qJ8fohltsTd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz4qOKk8TVajc19XNJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwVMdYsQMM8dLjYuCZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxY3w87R1Xt6MZyILh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
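A batch like the one above can be parsed, sanity-checked, and indexed by comment ID with a few lines of Python. This is a minimal sketch: the allowed label sets below are inferred only from the values that appear in this batch, not from the project's actual codebook, so treat them as assumptions.

```python
import json

# Raw model output, copied verbatim from the batch above.
raw = '''[
{"id":"ytc_Ugyn_dloanOgQO47FO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuycdCQdgc_mW9h8Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy54v2GPU9dIF3AEAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSM2Lh63SAk21dCo94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzvjF-6hjLiopzYup94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxoo0EeCYLh-DmsKkx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyGfYd_qJ8fohltsTd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz4qOKk8TVajc19XNJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwVMdYsQMM8dLjYuCZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxY3w87R1Xt6MZyILh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# ASSUMPTION: label sets reconstructed from this one batch; the real
# codebook may define more (or differently named) values per dimension.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "liability", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "outrage", "unclear"},
}

records = json.loads(raw)

# Reject any record with a missing dimension or an out-of-codebook value.
for r in records:
    for dim, allowed in ALLOWED.items():
        if r.get(dim) not in allowed:
            raise ValueError(f"{r['id']}: unexpected {dim}={r.get(dim)!r}")

# Look up by comment ID, as the panel above does.
by_id = {r["id"]: r for r in records}
print(by_id["ytc_UgzvjF-6hjLiopzYup94AaABAg"])
```

Looking up `ytc_UgzvjF-6hjLiopzYup94AaABAg` returns the distributed/consequentialist/regulate/fear record, matching the Coding Result table shown for the displayed comment.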