Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugwty_yZ6…: "AI is here to stay. Can we regulate it? I doubt it. There are to many unethical …"
- ytc_UgjPwLSLO…: "machines will quickly learn that might makes right? if they want rights they wil…"
- ytc_UgwiuGnAk…: "Everytime I use ChatGPT, the conversation inevitably devolves into some kind of …"
- ytc_UgxEhS34l…: "Facial Recognition software is not proof positive of anything, and it should nev…"
- ytc_UgzCY9yWR…: "Will AI solve the meaning of existence ? Read . The Poem of the Mangod .…"
- ytr_UgzWjWqtQ…: "Computers also have Microsoft Paint. And AI is not art. You can say that AI is a…"
- ytc_UgxlM-qFF…: "I had an history teacher(way before ai stuff) tell us that we can study history …"
- ytc_Ugz7F1OVE…: "The sample code search you did with several Google searches, now it would be rep…"
Comment
This is just politics of fear, which has been used in the United States and globally by the elites for the last 150 years. Always keep creating new 'crisis'. Keep people afraid of the future. Keep telling them 'the big event' that will 'destroy everything' is coming. The guy worked for these elites his whole life and how is spreading fear, as a part of his contract. Elites want slaves, they want to enjoy themselves, they don't want to die, so they would never allow AI to get to the stage that it becomes dangerous to them. The whole idea of AI is to destroy jobs, so they can put most of the people on the planet onto UBI and that way people will become even less skilled (no motivation to learn anything) and at the same time even more easily controllable (you protest or do something else and your UBI is taken away from you - you'll be broke and starving within two months).
youtube · AI Governance · 2025-06-18T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzhnD-aoLVOJwWCQE14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOK-HNfer1lCqtUTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTCjJvDcJfizyiiTp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4tvp8lA47sAAu3_14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgzPJVN1n17txBMkw3h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx_U9m12_0fYGCeNfd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOHyv2K6zuQmXRf6N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw_x3TGll7g6rr3wXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOuVDrbKRCqCTQgMx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7OK3UhpRWrzaK8Zh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
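The raw response is a JSON array in which each element codes one comment on the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response can be indexed by comment ID for per-comment lookup (the `raw_response` string below is a two-element excerpt of the array above; the `index_codes` helper name is illustrative, not part of the tool):

```python
import json

# Two-element excerpt of the model's batch response shown above:
# a JSON array, one object per coded comment.
raw_response = """[
  {"id": "ytc_UgxOuVDrbKRCqCTQgMx4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx7OK3UhpRWrzaK8Zh4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]"""

def index_codes(response_text: str) -> dict:
    """Parse a batch response and index each row of codes by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_codes(raw_response)
# Look up one comment's coding by its ID, as the page does.
print(codes["ytc_UgxOuVDrbKRCqCTQgMx4AaABAg"]["emotion"])  # outrage
```

In practice the parse should be wrapped in error handling (`json.JSONDecodeError`), since a model can return malformed or truncated JSON for a batch.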