Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "We need to learn a different term than AI. These are learning algorithms, not tr…" (ytc_UgzTyJwU-…)
- "It feels like we are living in a dystopian. Can you imagine one day when AI will…" (ytc_UgwouacY8…)
- "@tiagofilipe6592 So we will not resist their building of infrastructure like dat…" (ytr_UgyErEYVQ…)
- "That the supporters of AI openly admit to */STEALING/* creative works to train t…" (ytc_UgyuNtyf3…)
- "Forcing AI to work med-surg would definitely trigger the robot revolution to aga…" (ytr_Ugz44WVBW…)
- "Is this woman AI generated.. I have flicked to other video's all have her wearin…" (ytc_UgzanZnHh…)
- "Fun fact, when AI is left unchecked, it almost always becomes racist 😂 Can't thi…" (ytc_UgxZQAb_V…)
- "Yeah but he's forgetting the equalizer....the point of sale. If you eliminate w…" (ytc_UgyWt8z4V…)
Comment
If it follows the long-established model, the capabilities of AI will double every 18 months. We are already long past the point of no return. The advantages to being first, and the disadvantages to failure or even being second will drive men onward, to pursue this no matter what. It cannot be stopped. It is already close to being as smart as a man. In 10 years it will be 120 times better than it already is. The idea that we will be able to control it is as absurd as the idea of a flea controlling a man.
Sing the death march for humanity for we are already done.
Platform: youtube
Topic: AI Governance
Posted: 2023-07-07T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyfTHax6zE-wo8HH-54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyIW3-DExD1tefkDSR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzDCv4PfWE7DZJt6PN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxrhSGd3-17FalQaUF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxho0MXh0RbX5zN-yN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7DorJ_rF2UYFBOgt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx1Rz7vh8ipa7LcOnB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx1du0qB7rgKlJV1gd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzCdmzLakQkvFDihNR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyAzKwB0DjU6PFu44l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
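A batch response like the one above can be loaded and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the pipeline accepts exactly the dimension values visible in this view (the allowed-value sets are inferred from the samples shown here, not from a documented schema):

```python
import json

# Allowed code values per dimension, inferred from the responses shown
# in this view (an assumption, not the pipeline's official schema).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none"},  # only "none" appears in this batch
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}, rejecting bad values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value {rec.get(dim)!r} for {dim}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Validating up front keeps malformed or hallucinated code values out of the coded dataset; a failed batch can then be re-sent to the model rather than silently stored.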