Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "If androids, or AI's ever get to the point of self-conscious state, they deserve…" (ytc_Ugwu5KhSa…)
- "As a software engineer, AI could honestly wipe us all out... Wanna know their pl…" (ytc_UgxSimrtY…)
- "There are already AI podcasters or single person talking on camera with a very c…" (ytc_UgyBBXvoX…)
- "Yeah but using AI correctly means uninstalling it so I’m not sure how that’s rel…" (ytr_UgwE3rd9A…)
- "As for the tar pits, the AI companies don't care if their crawlers will take a f…" (ytc_UgzbpPNrA…)
- "Robots dont deserve rights there robots there not real there basicaly slaves but…" (ytc_Ugza4NIVG…)
- "AI is far more dangerous! If anyone does not speak the real Dangers of using AI …" (ytc_UgwFmOzPL…)
- "The only thing human has this time is creativity, no AIs have creativity, all AI…" (ytc_Ugwlt89Ga…)
Comment

> Honestly, that's why I don't like working with AI. AI is mostly about dataset selection: garbage in, garbage out. So instead of coding stuff in the majority of the time, I'm mostly filtering data the best way possible followed by hours of waiting for the training to end. The only way AI is nice for me to work with is simulation based machine learning, because then I just simply generate the data with simulations and filter them out via some ranking system which is way more fun than labeling thousands of data points.

youtube · AI Jobs · 2024-06-15T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzeD0MQSLyHrWKbQQp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxB8EzvoHUkZtyZP2V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz220TZA-chxFBGh8B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_zCNV5ak9El6xUA14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugxcg-s6bILtsH5Q1kB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy0h2fPXT7SB5LCAMR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz0Y0ShEppm3MoR5oV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6Ad32WC0UeTsXP-94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz__jQon7ze4t_xerF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzRLQyiw0eRP0o31h14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
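The "look up by comment ID" step above can be sketched in a few lines: parse the raw model output as a JSON array and index the records by their `id` field. This is a minimal sketch, assuming only the record shape visible in this dump (the four dimension keys plus `id`); the sample records below reuse two rows from the response above, and nothing else about the real pipeline is implied.

```python
import json

# Two records copied from the raw response above; the real input would be
# the full array returned by the model.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzeD0MQSLyHrWKbQQp4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxB8EzvoHUkZtyZP2V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the raw model output and index coded records by comment ID."""
    records = json.loads(raw)
    out = {}
    for rec in records:
        # Skip malformed rows instead of failing on the whole batch.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if not all(dim in rec for dim in DIMENSIONS):
            continue
        out[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return out

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_UgxB8EzvoHUkZtyZP2V4AaABAg"]["emotion"])  # -> resignation
```

Indexing once and looking up by ID afterwards matches how this page is used: the table for a single comment is just the record at that comment's ID.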