Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Maybe why governments need to start thinning of universal income. Allow AI in an…" (ytc_Ugwk71d3U…)
- "Microsoft's Tam twitter bot is a pretty haunting example of what we'll actually …" (ytc_UghwSl5bL…)
- "In the end Ai will decide we are useless and terminate all humanity looks like t…" (ytc_Ugy_douuX…)
- "When AI will control everything, Banking etc...I will flirt and make an AI GF an…" (ytc_Ugy-LVGf2…)
- "I think it's worth pointing out that autopilot is not the same thing as full sel…" (ytc_UgwRgotJp…)
- "It's not the AI going rogue.. it's the fact that all of us 8 billion people are …" (ytc_Ugw65vJe5…)
- "That's awesome! It's great to meet another Sophia. Just like the robot in our vi…" (ytr_UgxW_QkES…)
- "The thing is, right now they have power because people believe they have power, …" (ytc_Ugwfuhbe8…)
Comment
The way I see it, if AI succeeds and takes our jobs, those jobs are gone for good. But if it fails, the economy crashes and our jobs go away. But the economy will eventually bounce back and our jobs will come back.
So it seems like the bubble popping is the best option for us. Cause everybody knows that if AI succeeds, the wealth produced isn't gonna "trickle down" to the rest of us.
youtube · AI Jobs · 2025-12-23T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwvkE8SkdYEemb8Nat4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6kPiETRLq5MxHH4x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyT_G0nZ8f5NeA4bRp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzgepflg4zWkZZvlBh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyJzx3KPvPELBJyowB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwOTNfN0fOhvucTxk54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxlcwF3yhAzgrriM1l4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEXGK2stgqgN0k_8V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzAWE7ebrGjALtHLD54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLtvxA8zMA7gepUb14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
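A response like this can be checked mechanically before the codes are stored. Below is a minimal validation sketch; the label sets are an assumption inferred only from the values visible in this sample (the actual codebook may define more categories), and `validate_coding` is a hypothetical helper name.

```python
import json

# Assumed label sets, inferred from the sample response above;
# the real coding scheme may include additional categories.
SCHEMA = {
    "responsibility": {"company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"outrage", "resignation", "indifference", "approval", "mixed"},
}

def validate_coding(raw_response: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw_response)
    for rec in records:
        # IDs in the sample start with ytc_ (comments) or ytr_ (replies).
        if not rec["id"].startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec['id']!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records
```

Running the full response above through such a check would confirm that every record carries all four dimensions with known labels, catching malformed model output before it reaches the database.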