Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
That's another AI, but well, after ChatGPT's concerning first few days of launch…
ytr_UgwFAMd4z…
ITS MATH AT SCALE APPLIED TO OBJECTIVES.
its accuracy - its about what humans p…
ytc_Ugx_cVaBA…
The robot just need an open hand 👋 slap to reset it .. i work with lots of mach…
ytc_Ugw4QYlR0…
thats mods being returds cos trust me a serious artist won't be a reddit mod 😹 a…
ytc_UgwoFjSTK…
Don't call them "AI artists"
They did not create anything. They just prompted a…
ytc_UgxwyHqYG…
Kids are already brainwashed and hypnotized on their iPads 24/7 cause parents ar…
ytc_UgzXF96cX…
This needs to be stopped. The people in charge of AI can hardly be trusted to va…
ytc_UgzkLe03N…
I wish for all the success for you and I hope the algorithm actually does its pu…
ytc_Ugwupc2na…
Comment
The main reason that artificial intelligence (AI) will take far more jobs than other technology is "human intuition". In almost every job, things go wrong, and the worker must find a creative way to fix the problem (like fixing a machine, cleaning up a mess, etc.). Because of this, an actual person was required for many jobs, even jobs that could be mostly automated. AI specifically seeks to exhibit human intuition, and in many ways AI has been successful at this. When robots can handle messy situations, then human workers will really be unnecessary.
youtube
AI Moral Status
2020-01-26T00:4…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
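The four dimensions in the table above mirror the keys in the raw response below. As a minimal sketch, the record shape can be modeled like this; the value vocabularies are inferred only from the responses shown on this page and the `unclear` fallback in the table, so the actual codebook may define more categories.

```python
from dataclasses import dataclass

# Value sets inferred from this page's raw responses (not an exhaustive codebook).
RESPONSIBILITY = {"none", "company", "ai_itself", "unclear"}
REASONING = {"consequentialist", "virtue", "mixed", "unclear"}
POLICY = {"none", "regulate", "liability", "unclear"}
EMOTION = {"approval", "resignation", "indifference", "mixed", "fear", "unclear"}

@dataclass
class CodedComment:
    """One coded comment, matching the JSON objects in the raw LLM response."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # True only if every dimension uses a known category label.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)

c = CodedComment("ytc_example", "company", "consequentialist", "liability", "fear")
print(c.is_valid())  # True
```

A validator like this makes it easy to flag model outputs that drift outside the expected label set instead of silently storing them.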
Raw LLM Response
[{"id":"ytc_UgxvRmbO-776mbs_gBN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzlYdKiGzbgH1qziiN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4y0hErdGKG11ait94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgynBrwiqp2SZH5Nlh54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyjaJSOuKzLKV58NYx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw56vZ9uqn3TBMnEOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1PzQusXdtijEHd5p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxzdH3TKRSU0iKLy9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzatEFpalIQlbEj5pF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxZoIni1WZneRueg5t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"})
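Note that the raw response above closes the array with `)` instead of `]`, so it is not valid JSON as emitted. A minimal sketch of a parser (a hypothetical helper, not part of this tool) that repairs that specific malformation before decoding and indexes the records by comment ID:

```python
import json

def parse_coded_batch(raw: str) -> dict:
    """Parse a JSON array of per-comment codes into a dict keyed by comment ID."""
    text = raw.strip()
    # Repair the stray ")" terminator seen in some model outputs.
    if text.endswith(")"):
        text = text[:-1] + "]"
    records = json.loads(text)
    return {rec["id"]: rec for rec in records}

# Shortened example in the same shape as the raw response above,
# including the malformed ")" terminator (IDs here are placeholders).
raw = ('[{"id":"ytc_abc","responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"approval"},'
       '{"id":"ytc_def","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"})')

coded = parse_coded_batch(raw)
print(coded["ytc_def"]["policy"])  # liability
```

Keeping the repair step explicit (rather than hand-editing the stored raw response) preserves the model output verbatim for auditing while still allowing downstream use.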