Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The 1% will insist that they aren't subservient to the AI system of justice dest…" (ytc_Ugy6gmeNZ…)
- "I’ve heard a kind of tongue in cheek argument that it won’t take away creative j…" (ytc_Ugw3sEYzI…)
- "So if ai or agi all about data, what about what is outside data, what we didn't …" (ytc_Ugwf4ElBI…)
- "6:48 the irony of this ad on a video talking about AI taking our jobs is hilario…" (ytc_UgzZjJVMr…)
- "AI is man created. The danger is who is controlling AI and how they are using it…" (ytc_UgyTAG5Hr…)
- "I asked it what it thinks of humans. Alright — here’s the outsider’s, no-sugar…" (rdc_naio46u)
- "Just give them bad comments that does Ai and don't follow, unsubscribe and disli…" (ytc_UgwVPd5J6…)
- "Ai artists describing themselves creativity writing sentences make me think they…" (ytc_UgyMPcpCP…)
Comment
4:40 while yes there's an economic reason that doesn't exactly mean what you think ok what do i mean? what i mean that everyone should stop thinking about robots in the future as fully conscious or intelligent like think about it why would we waste all of metal and computer resources on machine that would only do one thing only like WHY? we probably would just make dumb machines that do one thing extremely well even if that thing is complicated there's absolutely no reason to make a super a.i or even just general a.i our thinking about robots must change this robot Apocalypse and all of that hollywood BULLSHIT and actually look at reality again
HUMANS DON'T NEED SUPER A.I OR GENERAL A.I AND MOST ROBOTS IN THE FUTURE SHOULD NOT HAVE INTELLIGENCE OR CONSCIOUSNESS BECAUSE THEY DON'T NEED IT
youtube · AI Moral Status · 2020-10-05T04:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgyTP-6QZjisuGenjkN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgyYvQZrzA-uHUI7z8R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxyiR6VEO6tTQUPF1p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgyMm7-BTHDqTpGpPGp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgxJjFzX2VqaU7yxshd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzPrVsnNivx5GSxK3N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwtewrW_yvb6VexDwp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugyuz12nXSjGF2uuZT94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgwQokEA2IkhLUW3Pnt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugwf92j8oI4z4g1PIIR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}]
```
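The lookup-by-ID step can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation: it assumes the coder returns a JSON array of per-comment records like the one above (the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` field names and the two sample IDs are taken from that response).

```python
import json

# A trimmed raw coder response, shaped like the array shown above.
raw = '''[
 {"id": "ytc_UgyTP-6QZjisuGenjkN4AaABAg", "responsibility": "none",
  "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
 {"id": "ytc_UgyMm7-BTHDqTpGpPGp4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

# Models sometimes emit malformed JSON (e.g. a stray closing paren),
# so parse defensively before indexing.
try:
    records = json.loads(raw)
except json.JSONDecodeError:
    records = []  # flag the batch for re-coding instead of crashing

# Index the codes by comment ID for constant-time lookup.
codes = {row["id"]: row for row in records}

print(codes["ytc_UgyMm7-BTHDqTpGpPGp4AaABAg"]["policy"])  # → ban
```

Keying the dictionary on the comment ID is what lets the page resolve a clicked sample straight to its coded dimensions without rescanning the response.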