Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- @laurentiuvladutmanea3622 AI works like a bunch of on off switches. A series … (ytr_UgxI_4gaC…)
- I hate that movie, I had huge expectations for the sequel, yet they dissappointe… (ytr_Ugycq1_ah…)
- Big techs cannot lose the AI war, they will take their facility to wherever with… (ytc_UgzWxslzp…)
- There is no ethics to figure out, the car should just stop if there is no way ar… (ytc_UgiQ91bEW…)
- This is the goal of the World Economic Forum: They want to establish an automate… (ytc_UgwUmxxx3…)
- @levacarvalho Yes. Art will become a hobby but the professional work will be don… (ytc_UgyRWS-aX…)
- I mean.. its not like the humans have had very different judgement in those situ… (ytc_UgwlQR4Bc…)
- I always find it funny how when AI "artists" try to defend AI, they almost alway… (ytc_Ugym6EN_p…)
Comment

> Basically an ai bot might not know what loyalty is so might not be predictable in battle if thirs a flaw in its logic.
> And that likely will be their as if it can reason it might detect who gave it the orders is the real problem
> Irony being if its intelligent enough it might also begin to not folkow orders.
> Their thir might be bots fighting bots and their creators for tge sake of it
> The same already seems to happen in politics:)

youtube · AI Governance · 2025-06-18T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
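The table above displays one coded record per dimension. As a minimal sketch, a tool like this could sanity-check a record against the allowed values for each dimension; the value sets below are inferred only from the sample output on this page, not an authoritative schema, and `validate_record` is a hypothetical helper.

```python
# Allowed values per coding dimension, inferred from the sample output
# shown on this page (an assumption, NOT the tool's actual schema).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "indifference", "disapproval", "outrage", "resignation"},
}

def validate_record(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, allowed in ALLOWED.items() if record.get(dim) not in allowed]

# The record shown in the table above.
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "unclear", "emotion": "fear"}
print(validate_record(record))  # an empty list means every dimension is valid
```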
Raw LLM Response
```json
[
  {"id":"ytc_UgzhnD-aoLVOJwWCQE14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxOK-HNfer1lCqtUTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyTCjJvDcJfizyiiTp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy4tvp8lA47sAAu3_14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_UgzPJVN1n17txBMkw3h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx_U9m12_0fYGCeNfd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwOHyv2K6zuQmXRf6N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw_x3TGll7g6rr3wXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxOuVDrbKRCqCTQgMx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx7OK3UhpRWrzaK8Zh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
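The "look up by comment ID" view above can be sketched as follows: parse the raw batch response (a JSON array with one object per coded comment, using the field names shown above) and index the records by their `id`. This is a minimal sketch assuming that response shape; `index_by_comment_id` is a hypothetical helper, and the two records below are abbreviated from the sample response.

```python
import json

# Abbreviated example of a raw batch response (two records taken from
# the sample output above; the real response carries one object per comment).
raw_response = '''
[
  {"id": "ytc_UgzhnD-aoLVOJwWCQE14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxOK-HNfer1lCqtUTh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
'''

def index_by_comment_id(response_text: str) -> dict[str, dict]:
    """Parse a raw LLM batch response and key each coded record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxOK-HNfer1lCqtUTh4AaABAg"]["emotion"])  # fear
```

Keying by `id` makes the lookup O(1) per comment, which matters when a project holds thousands of coded comments.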