Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below.
- `ytc_UgxSSaUgo…`: Companies like Remotask have been frustrating us; we work hard, but when payday …
- `ytc_UgyPvIHvP…`: If at all it happens then people wont have money to spend, demand will plunge fo…
- `ytr_UgxGBwp3y…`: Thanks! The speaker placement really helps create a more engaging interaction, d…
- `rdc_ohrsuuq`: I don't think people will stop exploring completely, but AI definitely becomes t…
- `ytc_UgzS0Z0k8…`: When I heard “Sora AI,” I thought of Super Danganronpa 2. (Our MC is an AI chara…
- `ytr_UgyHDrdxb…`: Only people who are totally detached from reality believe UBI will ever happen. …
- `ytc_UgxHgGLrl…`: Along the same lines, and they didn't mention it here -- AI audio. If some talki…
- `rdc_kz36359`: I agree 200% with you, and I've been arguing that with tech-dudes in other socia…
Comment
> Wrong in this one. AI is advancing insanely fast. I mean it went from super crude time consuming tasks for “meh” results to you cannot tell it from real human work in many cases in under 2-3 years. And AI is still in its fetus stage, it’s not even fully developed.
>
> THEN once it gets paired with Quantum computers which work millions of times faster than current x86 architecture, at no point could humans ever compete with that.
>
> Andrew Yang was speaking to this and why he was pushing UBI last election. AI and robotics will replace need for humans so you then require a UBI system…and of course depopulation to go alongside it.
>
> But DAMN will you be controlled.

youtube · AI Jobs · 2023-08-01T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgyqtRAggLWz3a8wse14AaABAg.AE-eIgMpXM0AE1gBN9Oi4x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx8Iuikw_ztaNN6Ogp4AaABAg.9t6Ps2VXiS_9tn3NbkX8JV","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwykREwGSp_0ITwhJ94AaABAg.9ssaZwcsBpu9t5iuebCyHL","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgylQIG4PgO99OqNCzd4AaABAg.9ss3OybYm2z9ssKfje1wOV","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgwbD9HhOX4hrKZAL654AaABAg.9srr8dnrpdu9t5jROjaT2t","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwQkLJHW4a5fTjNkM54AaABAg.9srdGwAZXmF9ssaSZxx2Gv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwQkLJHW4a5fTjNkM54AaABAg.9srdGwAZXmF9sseMVWsurt","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugwf4p1ZCNun-NiNlXZ4AaABAg.9m5luuH91lH9mc477dBaTR","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxGXm09_M2pxvx9x1F4AaABAg.9lTPhmv70CdA19LzBDjPT1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx1KtLQQK5lmbVK6FR4AaABAg.AMRhL-FaeR-AQCUJbXBUmQ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
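The coding-result view above can be reproduced from a raw response alone: parse the JSON array, index the entries by comment ID, and check each of the four dimensions shown in the table. A minimal sketch in Python, assuming the response is a well-formed JSON array of objects with `id` plus the four dimension fields; the allowed values per dimension are inferred from the samples on this page, not from a documented codebook, and `ytc_example` is a hypothetical ID:

```python
import json

# Allowed values per dimension, inferred from the sample responses above
# (assumption: the actual codebook may define additional values).
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"approval", "indifference", "resignation", "outrage", "fear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into an id -> coding mapping, validating values."""
    coded = {}
    for entry in json.loads(raw):
        comment_id = entry["id"]
        for dim, allowed in DIMENSIONS.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = {dim: entry[dim] for dim in DIMENSIONS}
    return coded

# Hypothetical single-entry response, in the same shape as the array above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"regulate","emotion":"outrage"}]')
coded = parse_response(raw)
print(coded["ytc_example"]["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse per response, then dictionary lookups for every inspection.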