Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgwpKJDG2… — “@ art, in basically every way, is a secondary need. It’s just the easiest to ig…”
- ytc_UgxJXdczZ… — “Of course autopilot didn't "react" its job is to keep the driver safe, it's not …”
- ytc_Ugxb7ESMo… — “honestly as sad as it is but i think ai has brought artists together because we …”
- ytc_UgwNx8Elv… — “The creation of 3D illusionistic, 2D art, is a biochemical algorithm, that manki…”
- ytc_UgxUjc59x… — “Thank you! … I’ve noticed that the response speed on your video for AI is alot …”
- ytc_Ugx-6PXtz… — “Oh my God!!! , please don't do that .... she's Vicky .....the robot .....the mis…”
- ytc_Ugx62XveX… — “Nobody knows basically anything about how the human brain works. The failure of …”
- ytc_Ugxq0WLxv… — “Sorry, but 3 seconds or even 3 minutes are not enough data to train a voice clon…”
Comment

> Jobs everyone prefers humans over AI and machines: surgeon, hairstylist, nanny / babysitter, manicurist, "human" marketing roles, banker, financial advisor, retirement services manager, human resources manager, most medical roles (dentist, nurse, pediatrician), pretty much every construction role (who would want to live in a house slapped together by robots, or work in a high rise AI designed? Not safe!), teacher, professor, airplane PILOT, pretty much every commanding military role: officer, commander, etc. There's a huge trust curve that wasn't considered during the interview. Most human beings won't trust AI enough to replace all of these jobs - and many more.

youtube · AI Governance · 2025-09-07T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz2YTx7PKydcEb4R7t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugw0uAlfZIqE3qhoZsd4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"hope"},
{"id":"ytc_UgwA3ly0M60QYkHgZuJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyWe06-KjXHnQnC_md4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgysO94HSp7xV15tk0x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmbFgrqoeqWCk9EaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxLfi5EfkzgtFQQ_oB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNI2cH7vqZlYB0NA14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-DTBZpo3c_xXs_tJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgztMPYaWbj51VqY4Ml4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
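The raw response above is a JSON array of per-comment codes, one object per comment with the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for ID lookup — assuming only the structure visible in the sample; the two-entry `raw` string here is illustrative, not the tool's actual ingestion code:

```python
import json

# Two rows copied from the sample response above; a real response would
# contain one object per batched comment.
raw = """[
 {"id":"ytc_Ugz2YTx7PKydcEb4R7t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
 {"id":"ytc_UgxLfi5EfkzgtFQQ_oB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]"""

# Index the coded rows by comment ID so a single comment's codes can be
# looked up directly, mirroring the "Look up by comment ID" feature.
codes = {row["id"]: row for row in json.loads(raw)}

entry = codes["ytc_UgxLfi5EfkzgtFQQ_oB4AaABAg"]
print(entry["reasoning"], entry["emotion"])  # virtue approval
```

Because each object carries its own `id`, the coded dimensions survive reordering or partial batches; any row whose ID is missing or malformed would simply fail the dictionary lookup rather than silently mis-attach to the wrong comment.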