Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
He made the choice to post a comment, it was an easier choice than listening to …
ytr_UgwTjTYPF…
As a professional driver (trucks); yes it is, theoretically possible to have sel…
ytc_UgyW8fx9J…
I oftentimes think about it. What If we train an AI to be addicted to a virtual …
ytc_UgxOGLxhw…
Makes a 100million dollar drill rig while solar panels are less than $150 for 4…
rdc_ogsdhci
The AI can be prompted into saying something, that doesn’t mean it’s the AI’s op…
ytc_UgxEbr5TX…
AI has got no benefit for common masses ..infact it's like tool for elites and c…
ytc_UgxtfJiE3…
i JUST came back from an ai chat and this appeared
never question me about them…
ytc_UgwurNANu…
There is a profound difference between hyper-intelligent AI and artificial aware…
ytc_UgyXplVL5…
Comment
I've been using AI (ChatGPT) to assist with Windows c# programming. It's like working with someone who has a lot of knowledge but no common sense. Employees will begging thier workers to come back. Plus if everyones out of work, who will buy the robot made goods? This bubbles is going to hurt REALLY bad. AI is not what they are advertising it to be.
youtube
AI Jobs
2025-10-08T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwmoAyw3oTWcJed0Cd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugykstqx-j2OWNj-GF94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyC_3ZDNWYgDYOR9tl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyZWabh8h1J48f380B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyO0EW7iWN8G_AO95V4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxagdDz82V1JyqYlFZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzvRTOCkUj4M7mayTB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxAG4rJ0igHAK2oXyZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxugS1X7RF1smwel814AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwvy0yDbOksZr4voQB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
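The raw response above can be parsed into a lookup table keyed by comment ID, which is what the "Look up by comment ID" view needs. A minimal sketch: the field names come from the JSON itself, but the allowed-value sets for each dimension are inferred only from the samples visible here and may be incomplete.

```python
import json

# Allowed values per dimension, inferred from the visible samples (likely incomplete).
DIMENSIONS = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "mixed", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into an id -> coding dict."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        # Reject records whose values fall outside the known coding scheme.
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

# One record copied from the raw response above.
raw = ('[{"id":"ytc_UgwmoAyw3oTWcJed0Cd4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UgwmoAyw3oTWcJed0Cd4AaABAg"]["emotion"])  # fear
```

Validating against an explicit value set means a malformed or hallucinated label fails loudly at ingest time instead of silently polluting the coded dataset.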