Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples (comment ID in parentheses):
- "What are the odds of Euromaidan developing into full-on Syria-style civil warfar…" (rdc_cfks86a)
- "We are screwed in the future.. Nobody is financially safe.. This will be the sta…" (ytc_Ugwew2o8J…)
- "@nathanielbass771 Which company? Without permission? You realize human artists …" (ytr_UgwEEmeOA…)
- "Elon's pause in response to the question \"what will be of value?\" says everyth…" (ytc_Ugxa4gSJ7…)
- "\"I do not want AI to ...\" But it's irrelevant what you want. It's like saying y…" (ytr_UgzmfVmH1…)
- "When I saw the robot attack his developer, I already knew we lost control. You …" (ytc_Ugwq2y586…)
- "Now all of a sudden everybody in silicon valley is making rogue ai bots or progr…" (ytc_UgyQk-dIO…)
- "Oh my God ChatGPT sounds just like an annoying tech bro podcast host. everything…" (ytc_Ugxe2aYoD…)
Comment
This was a particularly disappointing piece from The Economist.
I have come to expect more analysis and less assertion.
AI is promising/threatening to be the greatest change to labour and productivity since the Industrial Revolution.
What has happened over the past three years is a poor indicator of what will happen in the next 30.
Job losses are mounting up every day. This piece is wrong.
Source: youtube · Topic: AI Jobs · Posted: 2026-02-25T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyCiS0Nj4ieZWv5Uqx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxGrx4xUhmo7mjKxDx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzrmAyeQWOZZqpKVnh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxiu8v-9F-2d9IBwV14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxXV2xEegmcbIzkL94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGdTujqvT8ZaUX1Hl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzatM4-GOFog0kAOR94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw39WcPOyizwxD-gY54AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUcMWKdS9JGO6iQSF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy8rWkQDQ6XJ1Tq6EF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
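The raw response above is a JSON array in which each element codes one comment by ID along the four dimensions shown in the table. A minimal sketch of how such a batch response can be parsed and indexed for per-comment lookup (the two sample rows are copied from the response above; the helper names are illustrative, not part of the tool):

```python
import json

# Raw batch response from the model: a JSON array, one element per coded
# comment. Two rows copied verbatim from the response shown above.
raw_response = """
[
 {"id":"ytc_UgyCiS0Nj4ieZWv5Uqx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxGrx4xUhmo7mjKxDx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
"""

# Index the codings by comment ID so a single comment's coding can be
# looked up directly, as the inspection page does.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgyCiS0Nj4ieZWv5Uqx4AaABAg"]
print(coding["emotion"])  # outrage
```

Indexing by ID also makes it easy to spot malformed batches: a missing or duplicated `id` key surfaces immediately rather than silently shifting rows out of alignment with their comments.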