Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @Pickle_Rick007just saw a clip of a robot attempting to hang rock. It shut itse… (ytr_UgyKPaaae…)
- People should understand this is not a fully autonomous system. You should alway… (ytc_Ugz9Tm7Jb…)
- If AI art is considered to be real art and better by Twitter people, then a dumb… (ytc_UgwstAVf7…)
- Bernie, more than a warning, what I need from you is your vision of the future. … (ytc_UgyAaIo9H…)
- The gravity of this situation is not to be understated, as the advent of Artific… (ytc_UgwenGEd2…)
- ✔️ AI = deterministic routines with non- determistic out comes - But is manageab… (ytc_UgxZqMv0G…)
- Right now the human faced robots are wheeling around but Boston Dynamics has a b… (ytr_Ugys6dPXn…)
- What degree in art does this "professor" hold to claim things such as "ai could … (ytc_UgwII2maL…)
Comment
> I forsee a day when the "good people with their good AI" are forced against a decision that they have no adequate experience or frame to make. Reducing your opponents decisions and excessive pressure is a classic effective tactic. To sum my point: can the good people prevent themselves from being manipulated by maleficent forces. So far from the news it seems like strategies are already in play

youtube · AI Governance · 2023-12-31T02:1… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwB2dRmdVubEoEiQMB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTXkcbhhFSGL-v9GV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyJav0EqmmW9f66BK14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy-S0O59sCiTaYymuN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz-716WHy6Z0_p3dVB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw7hYg-LFlofVCy-vh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzRjAYdoOOnbV-zS9d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugx23gBma6j4Yn8VghV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwQQGatK4YrfjvHUMZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"disapproval"},
{"id":"ytc_Ugy-eVsD13se8GIQtrN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
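The raw response above is a JSON array with one object per comment, each carrying the four coded dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be indexed by comment ID for lookup — the `index_by_id` helper and the two-entry sample are illustrative, not part of the tool itself:

```python
import json

# Two entries copied from the raw model output above, for illustration.
raw = """[
  {"id": "ytc_UgwB2dRmdVubEoEiQMB4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy-eVsD13se8GIQtrN4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Map comment ID -> coded dimensions, skipping malformed entries."""
    out = {}
    for entry in json.loads(raw_json):
        if "id" in entry and all(dim in entry for dim in DIMENSIONS):
            out[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return out

codes = index_by_id(raw)
print(codes["ytc_Ugy-eVsD13se8GIQtrN4AaABAg"]["policy"])  # regulate
```

Entries that fail JSON parsing or lack one of the four keys would be dropped rather than partially stored, which matches how the tool codes unclear values explicitly (`"unclear"`) instead of omitting fields.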