Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Let's assume for a moment that a law existed that required companies to pay a li…" (ytc_Ugyp2Bc9K…)
- "Zagyex at some point is it reasonable ask for an improved (most likely expensive…" (ytr_UgwnAkU3l…)
- "> It also would suck ass, because AI is absolute dogshit at writing. Ah so …" (rdc_kzmdnda)
- "No, my precious AI please don't ban AI as my life depends on it because:Nowadays…" (ytr_UgwaiZ5AD…)
- "If you want to slow this down and stop this from happening to your town you need…" (ytc_UgyYST7qd…)
- "There's a reason why these models are called Large Language Models and not Large…" (ytc_UgzA3ubUQ…)
- "Everything sounds reasonable if you just use the word \"*because*\" and it doesn't…" (rdc_m6z1pg5)
- "Even if AI art could perfectly replicate the style of other artists, art is only…" (ytc_Ugwz1uk-Y…)
Comment (source: youtube, video: AI Jobs, posted: 2025-05-29T05:3…)

> Ok. Automaton is taking what were good jobs. At the same time, elected officials are cutting the social safety net that protects the very people who are being impacted. Those who think UBI is the answer are not seeing the very human need for meaning and purpose that working provides. If we allow automaton to strip us of the very things that support our sense of meaning, we are killing the human race.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | mixed |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw6mxl16m6KdC3keGl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgylWAtx6dyQR2GEi_54AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzh7LXxx-FwF-L0T9p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwzNp8X4syOqtScL8N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAY_nqzX7L2iiht_N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxbDr8mFjOrzET2VbJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyqU-eyQPiBHxjZDrF4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxN7jXcHcYY3ejXohp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwymg3p71OONUkIuWt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwWgqtLcvZ_FQKYXFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
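The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be parsed, validated, and indexed for the "look up by comment ID" view (the allowed values below are inferred from codes visible on this page, not an exhaustive schema, and the two-row sample payload is illustrative):

```python
import json

# Dimension vocabularies inferred from values seen on this page (assumption,
# not a complete schema).
ALLOWED = {
    "responsibility": {"none", "government", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "liability", "ban", "regulate", "industry_self"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

# Illustrative two-row payload in the same shape as the raw response above.
raw = '''[
  {"id":"ytc_Ugw6mxl16m6KdC3keGl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgylWAtx6dyQR2GEi_54AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"outrage"}
]'''

def index_codes(payload: str) -> dict:
    """Parse an LLM coding response and return {comment_id: codes},
    rejecting any value outside the known vocabularies."""
    by_id = {}
    for row in json.loads(payload):
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"unexpected {dim} value: {row[dim]!r}")
        by_id[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgylWAtx6dyQR2GEi_54AaABAg"]["policy"])  # liability
```

Indexing by ID first makes the per-comment detail view (like the Coding Result table above) a single dictionary lookup rather than a scan of the whole batch.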