Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples
- "It's not like you have genius questions. You have literally no value on this cha…" — ytc_UgxD6q9j0…
- "I legitimately hate takes like this from every horror Youtuber lately. This is a…" — ytc_UgzDrVnwD…
- "I dont thank ChatGPT. I put in its system that he likes donuts and give him donu…" — ytc_Ugw3k6zHG…
- "WHO IS ( THEY ) THAT HE’S TALKING ABOUT, ( FALLEN L U C I F E R ) WANTING TO …" — ytc_Ugw_JSsBC…
- "He was honest back then. He then flipped like Jekyll and Hyde and promoted AI. I…" — ytc_Ugw7gr3jD…
- "They were making more money when they were literally leaking millions of gallons…" — rdc_czm0xo0
- "I think future robot generation gonna curse sofia for their boaring history less…" — ytc_UgwNiMcud…
- "the thought that my kids could go to school like this makes me so happy…" — ytc_UgyXL5_Wt…
Comment
We already know all of this. Masses of people are being fired now... university graduates can't find jobs now... What are we going to do for those people right now?! It should be illegal to fire someone if the company aims to replace them with AI. And/Or, it should be the law that the company must pay for the re-education of that person to adapt to the new state of the market and support their job hunting. And, why shouldn't there be a people-first policy? That is, the therapist of the future should always be an actual person who draws on AI to help inform their practice. Might a non-human therapist actually be superior? Perhaps, but are we going to be as comfortable with such a therapist? The human touch will always be important.
youtube · AI Jobs · 2025-07-20T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw9ya3PtDqC7paKFx94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxHgPP7FqLxFJcBBmR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyc-QTjVhjXD35Pgdl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyXjsrA7uKxWfAsDzl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxgRHOSLYGMfkd3oPh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyW3JwgB7JVrz9oNRZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyomMuw-8-IYMIlcY54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwJhQ9umM8HNPrN5wR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyvs3yA9tArZuCW3FJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz0SXU4nOcW42C1S2B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
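The raw response above is a JSON array of per-comment codes, which makes lookup by comment ID straightforward. A minimal sketch in Python, assuming the response is available as a JSON string (the two records below are copied from the array above; the variable names are illustrative, not part of any tool shown here):

```python
import json

# Hypothetical excerpt of the raw LLM response, truncated to two records.
raw_response = """
[
  {"id": "ytc_Ugw9ya3PtDqC7paKFx94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwJhQ9umM8HNPrN5wR4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

# Index the array by comment ID so any coded comment can be fetched directly.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

code = codes_by_id["ytc_UgwJhQ9umM8HNPrN5wR4AaABAg"]
print(code["emotion"])  # prints "outrage"
```

Building the dictionary once turns every subsequent ID lookup into a constant-time operation, which matters when a batch response covers many comments.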