Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Because there's never been text out there talking about awakening which AI could…" (ytc_Ugww6S5dV…)
- "Not really, they're expecting to lose 14 billion this year and they would go ban…" (ytr_UgyT3nca-…)
- "@ratatouille7449 No, it is not just a program. AI evolves, it changes…" [translated from French] (ytr_UgzxFo5y8…)
- "great speech, historical. i agree with bernie on the attack on elites and ai, i …" (ytc_UgwX_qnYU…)
- "(tin foil hat time) I kinda wonder if another reason corporations are pushing AI…" (ytc_Ugyo9S5Hm…)
- "It won't be a cruise ship. Look at any area of humanity where people don't have …" (ytc_UgwIO4V10…)
- "The problem with all of this is there’s no way to confirm that the data hasn’t b…" (ytc_Ugx4KY6sb…)
- "Welp guys i got chatgpt to tell me about area 51 files so I'll try to respond to…" (ytc_UgxLJS3lc…)
Comment
AI has become several orders of magnitude more energy efficient just in the past 2 years. We aren't expected to hit an energy bottleneck for training AI systems until about 2030, and we're expected to have AI that can do AI research before then. Once AI systems can autonomously train their successors, we are well and truly cooked.
I recommend joining PauseAI or another such organization to push to slow things down now, rather than resting on the hope that the exponential curve will stop being so exponential very soon. (Like it was supposed to do last year, supposedly.)
Source: youtube · Video: AI Moral Status · Posted: 2025-04-27T04:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugxo2GjUyERtyIV9xA14AaABAg.AHORAZC0DOLAHQQtQNdBpB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHPT2Ppj-BH","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHQZVsu6sZu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHUZ97MRZWl","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOR6mKPKOh","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOTM_eqCs7","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOzGIUf6B2","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHP-2c4TwGz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQB9eajQfs","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQFM9hvrDb","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
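A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the tool's actual code; the allowed category values are inferred only from the records and table shown here (the full codebook may include more), and the sample ID is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the records shown
# above (assumption: the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) and
    index the records by comment ID, skipping any record that uses an
    unknown value for one of the four dimensions."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

# Hypothetical record for illustration only.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytr_example"]["policy"])  # regulate
```

Filtering on the allowed sets is a cheap guard against the model drifting off the coding scheme; rejected records can instead be queued for re-coding if dropping them silently is undesirable.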