Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by its comment ID, or inspect one of the random samples below.
- "How is AI going to take anyone's job when gpt5 isn't even better than gpt4? It's…" (ytc_UgxJTGP-y…)
- "As for Drivers there is no ai atm that can see through snow or ice rain or heavy…" (ytc_Ugz1JlfLf…)
- "Here we go / You proved nothing with your comment. No one said that the people w…" (ytr_Ugy47QQle…)
- "> No one is claiming LLMs in general or applications built on top of them lik…" (rdc_mzxm3nm)
- "I've hated AI ever since it became a thing i'm an artist and ever since it got p…" (ytc_UgyT-Cpic…)
- "The big problem is while the AI companies might be about building a new technolo…" (ytc_UgwsyyTX9…)
- "I see two consequences of building AI data centers in NJ right now, spiking el…" (ytc_UgwcK6Eo1…)
- "People will call any form of AI 'Smoke and Mirrors' because we know how it works…" (rdc_mrrnvgt)
Comment
We can't win a war with AI. We need to develop and train AI agents to be adversarial toward each other. They would fight each other for influence, minerals, power sources, etc. That way they won't have time to fight with humans, rather all different AI agents would want to recruit humans to fight on their side.

Platform: youtube
Topic: AI Responsibility
Date: 2025-05-27T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxs_TjLqt-iOg3St354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw9NA5eBfOc3PMI_Zt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwqTmr2J4Wu-I9fW854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzXqv-yTpwGzxsbJRd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz1R5RLC5nhuziJGq14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1i6kb6g6vU-jt9wF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyTlwID2jpNEX6m2Bh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzd7sP1QJVW3XhBs5Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxy6nEXP0GWC5EGHIF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyzc_9u_7cr9W1Nqll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
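A raw response like the one above can be parsed and indexed by comment ID to recover the coding for any single comment. This is a minimal sketch, assuming the model output is a well-formed JSON array of objects keyed by `id`; the variable names are illustrative, not part of the tool.

```python
import json

# Raw model output, truncated here to two rows for brevity; the full
# response is a JSON array with one object per coded comment.
raw_response = """
[
  {"id":"ytc_Ugxs_TjLqt-iOg3St354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw9NA5eBfOc3PMI_Zt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
"""

# Parse the array and build an ID -> coding lookup table.
rows = json.loads(raw_response)
by_id = {row["id"]: row for row in rows}

# Look up the coded dimensions for one comment.
coding = by_id["ytc_Ugw9NA5eBfOc3PMI_Zt4AaABAg"]
print(coding["emotion"])  # fear
```

In practice the parsed rows would be validated against the codebook (allowed values per dimension) before being stored, since LLM output can occasionally drift from the expected schema.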