Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "No. That's not really how it works. The companies weren't negligent, and there's…" (ytr_UgxQQ0Cas…)
- "1) First Robotics Automation take over automobile, Mobile manufacturing etc 2) A…" (ytc_Ugwc4JX2e…)
- "NO, NO , NO... What people call “AGI” right now is mostly marketing. LLMs and “a…" (ytc_UgyAHji4y…)
- "A very good example is the predictions of cardiac arrhythmias which doesn't even…" (rdc_f1eh26o)
- "This remind me of a robot in call of duty 3 or 4 when it blows up with knifes…" (ytc_UgybSCsFK…)
- "11:16 well no one's going to have money to prefer another human do a job for the…" (ytc_Ugws6mCfw…)
- "A timely topic, I've been getting so many dumb AI software ads on YouTube lately…" (ytc_UgxhRtvQ6…)
- "I’ve moved away from having ai produce code and I’ve moved towards just using ai…" (ytc_UgyBz7POm…)
Comment

> People will still be needed because AI ultimately has no clue what it’s doing which is why occasionally AI makes literally epic mistakes that no human would make. And work will still pile up. Our organisation makes a certain amount of work. But if each of us could make 5x more we’d definitely choose to do 25x more work and work force would still stay the same.

youtube · Cross-Cultural · 2025-10-06T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwJJtJc2d5QmFHwiMx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx8ELXlRA6b3sf8Q1l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwuMoz6IW907w2KIM14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyocWHgULI0ftcH3Wx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwv-Mu33flRgqOJ-AR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxrj-7O9cGtJLihZaR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzUbrYprUADRDjoDsh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzWe6sicbNfMT_-lXN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxX_kXN49iIgCj7CMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwWFt47WYSpFSbP3oJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
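The raw response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a batch might be parsed and validated before storage; the allowed value sets below are inferred only from the codes visible on this page, so the real codebook may well contain more categories:

```python
import json

# Allowed values per dimension, inferred from codes visible on this page
# (assumption: the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response; reject rows with unknown dimension values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        # Keep only the coded dimensions, indexed by comment ID.
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the response above, `parse_batch(raw)["ytc_Ugwv-Mu33flRgqOJ-AR4AaABAg"]` would yield the same codes shown in the Coding Result table (responsibility `ai_itself`, emotion `approval`). Failing loudly on unknown values catches the common failure mode where the model invents a label outside the codebook.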