Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I think AI and robotics are going to automate what humans can do (provided everything goes as intended) until it bottlenecks at something the robots can't quite manage and that'll be our job and we'll go through that process stretching our capacity out until our productivity hits god-like levels and wants and desires start getting fulfilled all over the world and during that time humans will shift into a mode of governance over the distribution of what it produces. Many will want equitable distribution and some will want a more capitalist distribution to owners that disenfranchise workers but technically own the robots and the AI super computers. A possible conflict could arise on distribution but I think ultimately AI will fulfill everyone's needs in due time anyway and that we shouldn't start WW3 over it.
Platform: youtube | Video: AI Jobs | Posted: 2025-07-06T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzw9cGFAUa0NvPHq-54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwX5-HdEgGghZf2LrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz63lnPkLm6WC0i2Lp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxvP8xpBTNwIDHFbB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwqcLsgymp4iSyU5ah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzlvhxz9wXoa0I4Nhp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzyqolekaOBm2Dm6E94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwgwj69FRdJrwDjkux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5fgtIqjOSlW1bGh14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz9TlehTCkzBt4yASZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
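The raw response above is one JSON array per batch, with one object per coded comment. A minimal Python sketch of how such a batch could be indexed for per-comment lookup (the two example objects are copied from the response above; the `lookup` helper is hypothetical, not part of the tool):

```python
import json

# Raw batch response from the coder model: a JSON array with one object
# per comment, each carrying the four coded dimensions.
raw = """[
  {"id": "ytc_Ugzw9cGFAUa0NvPHq-54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzyqolekaOBm2Dm6E94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Index the batch by comment ID so a single comment's coding can be retrieved.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id):
    """Return the coded dimensions for one comment ID, or None if uncoded."""
    return codes.get(comment_id)

print(lookup("ytc_UgzyqolekaOBm2Dm6E94AaABAg")["policy"])  # regulate
```

Building the dict once makes each subsequent ID lookup O(1), which is what a "look up by comment ID" view needs when batches are large.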