Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I never liked the thought on unregulated AI development and, apparently, neither…" (ytc_Ugw7XUD5Z…)
- "Not only Supreme court.. Ai should "rule" the whole planet.. when i say "rule"..…" (ytr_UgxiDG6C9…)
- "I used to have such high praise for your channel. Conflating Tesla’s Autopilot w…" (ytc_Ugw-ZdR--…)
- "Through embedded nanobots we'll also be linked to a global AI. Question is, who …" (ytc_Ugxu7mTB2…)
- "I dont know if some1 read my commnet, but still writing lol... Enywhay im 40year…" (ytc_UgwpWNrG1…)
- "It's important that we share this type of information. While we have the chance.…" (ytc_UgzDQ7Hb8…)
- "@Pugson_offical By design, the Ai can't have creativity, the image can be inter…" (ytr_Ugwbbzafz…)
- "Out algorithm has determined that you will commit a crime sometime in the future…" (ytc_Ugyw1hXiN…)
Comment

> Driverless trucks. Whats next? Pilotless commercial airlines? Engineerless trains? What could go wrong? A disaster waiting to happen. Technology is only as good as its input. Jets still fall out of the sky. Trains still derail. Trucks get cut off. What about winter driving? Snow,ice,heavy winds. Way too many variables to overcome. It may sound good in theory,but if its man-made,it can still break eventually and fail. The more complex,the more things can go wrong. You still need a human element.

youtube · AI Jobs · 2025-08-27T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgygB3QH1a4McS7zyMt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyJ9rozsr0U7zM4X3p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzkCu31Naz8vltkRCx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxzNjkvWtvd_N9hlzB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy0NHDdfGnWACC4y1l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwNFnOa5a2B1gLpNBJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzq55e9cWDQZ8xoxEh4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzvkvk_kOOLmcH_Xld4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxKRf2DAAMsfO1FCqJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwB8tFgnMirD_6cCZ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
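The "look up by comment ID" step above can be sketched in a few lines: parse the raw LLM response as a JSON array of coded records, index the records by their `id` field, and retrieve one record by its comment ID. This is a minimal sketch, assuming the raw response is always a well-formed JSON array like the one shown; the variable names here are illustrative, not part of any actual tool.

```python
import json

# Two records taken verbatim from the raw LLM response above.
raw_response = """
[
 {"id":"ytc_UgygB3QH1a4McS7zyMt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgxzNjkvWtvd_N9hlzB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
"""

# Index coded records by comment ID for constant-time lookup.
coded = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up the coding result for a specific comment ID.
record = coded["ytc_UgxzNjkvWtvd_N9hlzB4AaABAg"]
print(record["emotion"])  # fear
```

In practice the full comment IDs (not the truncated `ytc_Ugw7XUD5Z…` previews shown in the sample list) would be used as lookup keys.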