Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a response by its comment ID, or open one of the random samples listed below.
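For programmatic access outside the viewer, a minimal lookup sketch in Python follows. The storage path (`coded_responses.jsonl`) and the one-record-per-line layout are assumptions; each record is taken to be shaped like the raw LLM response entries shown at the bottom of this page.

```python
import json

def lookup_raw_response(comment_id, path="coded_responses.jsonl"):
    """Return the coded record for one comment ID, or None if it is absent.

    Assumes one JSON object per line, each shaped like the raw LLM response
    records shown below (id, responsibility, reasoning, policy, emotion).
    The file name is a placeholder, not a confirmed path.
    """
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: fetch the coding for one of the IDs visible in the raw response below.
print(lookup_raw_response("ytc_Ugw8mZpln6KfYEXT9CB4AaABAg"))
```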
Random samples — click to inspect
- "My startup’s Series A was $5M—couldn’t afford a $50M traditional data center. Br…" (ytc_UgwL7DYPE…)
- "What happened to art being subjective? I thought all art was good art and no one…" (ytr_UgwUu9B2Y…)
- "Or in other words you share your AI technology so that we can give it to the Mus…" (ytc_UgyQ8sNy8…)
- "Meh, what are you gonna do? TBH it's not AI that will kill us but the orange st…" (ytc_Ugw3BS8vY…)
- "And regarding the profitability of AI. I believe that we have already passed or …" (ytc_UgxBQ1Ftr…)
- "A.I. consistently lies. If you don't ask it why or if it's sure about something …" (ytc_UgxuBtfcg…)
- "Bad ethics wise, bad talent wise. It's a lose-lose / Literally even the worst, me…" (ytc_UgzwpPPJB…)
- "Their job is to build the damn thing. We need it. We need it badly. The governme…" (ytc_UgyKL_EmW…)
Comment
What is the aspiration of mankind? To have white collar jobs in the now contrived worlds of law and accountancy - both easilly to be taken over by AI completely, just for preserve some form of self-esteem, is meaningless. So let it happen, and that freed up human brain power can be re-tuned to poduce the next Mozart or Shakespear. But it will lead to economic collapse as we rip out busyness from the daily lives of humans. But 100 years ago, Henry Ford summed up the essence of a similar dilemma even then - shunning the use of too many robots in his factories, as they don't buy his cars.
We have created an artificial way of life - is it worth preserving - probably not.
Source: youtube · Posted: 2026-02-12T22:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzoGEzrZ04dH0QSKKl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwbIcGmgsKA7hhuvfx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw8mZpln6KfYEXT9CB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwhygt0NbliESaY0Vp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyifskYkxF13r9UCbd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyPFg3mI6ySGPUOb254AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyGzuwAAaFrosw9X7J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwa-R1JxLYIe496Upt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxHu-fRYwhE7h4YTKR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHmwZ5uJqK5vnHGmB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
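The model returns a single JSON array with one object per comment, using the same four dimensions as the coding result above. A minimal parsing and validation sketch follows; the allowed values are inferred only from the records shown here (the full codebook may include more), and the file path in the usage line is hypothetical.

```python
import json

# Allowed values inferred from the records above; the full codebook may include more.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def parse_batch(raw_text: str):
    """Parse one raw LLM response (a JSON array of coded comments) and
    flag any record whose value falls outside the inferred codebook."""
    records = json.loads(raw_text)
    problems = []
    for rec in records:
        for dimension, allowed in ALLOWED.items():
            if rec.get(dimension) not in allowed:
                problems.append((rec.get("id"), dimension, rec.get(dimension)))
    return records, problems

# Usage (the path is hypothetical):
# records, problems = parse_batch(open("raw_response.json", encoding="utf-8").read())
```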