Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "So stealing is bad, but what if I only steal a little bit? Is that still bad? Is…" (ytr_Ugyvb8j5B…)
- "What AI Should be: Hehe funny donald trump robbing king supers in a mario costum…" (ytc_UgzxIMspT…)
- "Small note. It's not that chatbots change throughout the day, it's that they use…" (ytc_Ugz7On4Ft…)
- "How the ai smarter then us when they go off what we discover and put into books …" (ytc_UgxWEN8RI…)
- "Well, self driving will be much safer that human driver. It does not get tired o…" (ytc_UgxbsV8Pm…)
- "What I just heard Elon musk .. I have heard once before. Someone I know is Psy…" (ytc_Ugxk_1ixl…)
- "Every AI-chatbot customer service experience I've had has been basically: "f off…" (ytr_UgyGa4H-V…)
- "Several issues with your arguments. Im not rage baiting, but pointing this out f…" (ytc_UgzV5LCTt…)
Comment
> robots can;t be nurses, or even nurse assistants, much less doctors They cant teach worth a damn, or guide hunts or fishing trips and maybe not even camera safaries.. They caan't be managers of boarding hourses, repair houses, or cars, Robots can't do research of some types, at least. Therre are many jobs that require too much judgement and/or interaction with people for AI to do them in an efficient manner. However, 90% of people are likely to die off.in the next 10-20 years.

youtube · AI Governance · 2025-09-04T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
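A coded record like the table above can be checked against the value sets that actually appear in this batch. A minimal sketch, assuming the allowed values are exactly those observed in the raw response below (the real codebook may permit more categories):

```python
# Value sets observed in this batch; assumed, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "fear", "mixed", "indifference"},
}

def validate(record: dict) -> list:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

# The record shown in the Coding Result table above:
coded = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}
print(validate(coded))  # → []
```

A check like this catches the common failure mode where the model invents an off-codebook label, so malformed rows can be flagged before they enter analysis.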
Raw LLM Response
```json
[
{"id":"ytc_Ugy79L6i7DlwsSVDpkF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzfwwy_Zl2mG-Osyi94AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx7sD417lCfm7e1tM94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpNsxF-EuYnUye2-R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyhmK1B2SOIehHgTxh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwf4y3Sg8wemDJBSrl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxhEBnQcsP8cHveG5B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBfWMP2VR1NQfMgSZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzObnjVR4u-9V2mwTZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9vCgT7g9P2T2hguJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
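The "look up by comment ID" view maps naturally onto indexing the parsed batch by its `id` field. A minimal sketch, using a two-row excerpt of the response above (the loading and storage details of the actual tool are assumptions):

```python
import json

# Two rows excerpted verbatim from the raw batch response above.
raw = '''[
{"id":"ytc_Ugy79L6i7DlwsSVDpkF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBfWMP2VR1NQfMgSZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

# Index the coded rows by comment ID for O(1) lookup of any single comment.
by_id = {row["id"]: row for row in json.loads(raw)}

print(by_id["ytc_UgxBfWMP2VR1NQfMgSZ4AaABAg"]["emotion"])  # → indifference
```

Because the model returns each comment's ID alongside its codes, the exact output for any coded comment can be recovered without re-running the batch.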