Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I have so many question, if all of us were jobless because of AI, who would pay …
ytc_Ugx_crHJT…
I’m not sure if the human will have a house to live and car to drive if AI is go…
ytc_UgyfPzjZ6…
of course he said like one good thing, thats how they get you, and that thing wa…
ytr_UgypUzeDa…
The time is coming but it will have to be enough for everyone to live on and act…
ytc_UgwAJp5Bz…
Id like one , take my money and shut ..up .
Id like brown eyes , brown hair , an…
ytc_UgwFwTY8e…
You are correct, that is the formula for Variance used in statistics. In Machine…
ytr_UgwDDGS41…
We won't, but we can know this for pretty well certain, if an ai can be consciou…
ytc_UgxsexWjY…
I’d like to see the original script for that story that the AI wrote, before you…
ytc_UgxhCLfqT…
Comment
And ASI is good for ASI, and literally no one else.
"If any company or group, anywhere on the planet, builds an artificial superintelligence using anything remotely like current techniques, based on anything remotely like the present understanding of AI, then everyone, everywhere on Earth, will die." — Yudkowsky, Eliezer; Soares, Nate. If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All (p. 7). Little, Brown and Company. Kindle Edition.
youtube · AI Jobs · 2025-10-21T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytr_UgxkM5ihKMaCM70EvC94AaABAg.AOQREQIZAfSAOQw7ipX83r","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy-iFOMPe5LXxWJaSB4AaABAg.AOPQFAQg8scAOXo9eoE_zk","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugy-iFOMPe5LXxWJaSB4AaABAg.AOPQFAQg8scAOZpea-W1DC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytr_UgzB1yZL8mjI4wjLQYt4AaABAg.AOLh6koFyV2AOOzu_4SZz5","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxFkEkGZSD5dVTYVKt4AaABAg.AOLf721jOH4AQMQWNYCOy5","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugz7fCtvJAKrMnoNPu54AaABAg.AOLOXvYddJ1AOWuCy0wBlw","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy3uic6KOq9aRPxyaV4AaABAg.AOJHsVwsbrcAOLSLdewbyy","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgyuKZ6jBJfDr3cgP4p4AaABAg.AOGvk9iiapMAOQOMH9VcAy","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxUnheWbIhQa6x75qd4AaABAg.AOGg799kZ5HAOGiLJpxV_O","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytr_Ugz1CSNMZmw92hQuMMB4AaABAg.AOGf9cd2-gBAOGij8o70hA","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
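A raw response like the one above can be turned into the per-comment coding shown in the table with a small amount of parsing and validation. The sketch below is illustrative, not the project's actual pipeline code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the `ytc_`/`ytr_` ID prefixes come from the data shown here, but the allowed-value sets are inferred from this one sample and may be incomplete.

```python
import json

# Dimension vocabularies inferred from the sample response above;
# these sets are an assumption and may not cover the full codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"fear", "approval", "outrage", "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Rows with an unrecognized ID prefix or an out-of-vocabulary value
    on any dimension are dropped rather than stored.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        # ytc_ = top-level comment, ytr_ = reply (prefixes as seen above)
        if not cid.startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Indexing by comment ID this way is what makes a "look up by comment ID" view cheap: `parse_codes(raw)["ytr_…"]` returns the four coded dimensions for that comment directly.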