# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Comment

> We are doomed but 99% in 5 years is absurd. It takes about 2 years to make a factory. We need factory because you still need a robot to replace farmers to pluck strawberries.
> Code can’t farm. Code can’t make free circuit boards from China. Code doesn’t make the government work faster. Code doesn’t force us to just lie down and take it. We will need a different factory to make bots to replace construction workers. Will the code force human construction workers to quickly build something that will replace them?
> In 5 years a robot will be allowed to shoot a human being? In 5 years a robot will be crawling through an attic to install pot lights and argue with a dumb customer?
> It’s not magic people.

Source: youtube · Topic: AI Governance · Posted: 2025-12-02T00:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
  {"id": "ytc_UgxV5hpcOz7mQZ4kwjt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyocCZyQWrASlSlnSZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzls8va6DbxUoBYCtV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwvDdFXP9WTlpz16FJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwKEmf_hR8-L5cWtSl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwvAHkFfH_i67Aa1md4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy_xecJ7KxteUP38UR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugybd6dCMFjhE2fCVzd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxwgS63o2eH8LPoDAx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzh2K-Kf4Cy4m5gs2l4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
```
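The raw response is a JSON array of coding records keyed by comment ID, so looking up the coding for a single comment amounts to parsing the array and building an index. A minimal Python sketch of that lookup (the two sample records are copied from the batch above; the variable names and the inline string are illustrative, not part of the tool):

```python
import json

# Illustrative excerpt of a raw LLM response: a JSON array of coding
# records, one per comment, each keyed by the comment's ID.
raw_response = """
[
  {"id": "ytc_UgwKEmf_hR8-L5cWtSl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzh2K-Kf4Cy4m5gs2l4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

records = json.loads(raw_response)

# Index the records by comment ID so any coded comment can be inspected
# directly, mirroring the "look up by comment ID" workflow.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_Ugzh2K-Kf4Cy4m5gs2l4AaABAg"]
print(coding["emotion"], coding["policy"])  # -> outrage ban
```

The same index also makes it easy to filter, e.g. collecting every comment coded with `"policy": "ban"` before reviewing them by hand.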