Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Many non engineer jobs were gone because of automation, it happened for years. N…" (rdc_m6ybls7)
- "48:39 I disagree with this take; it may not have taken long for chess algorithms…" (ytc_UgyNM8wkb…)
- "Those Greeks, such sages they were with their admonishing myths, though they see…" (ytc_Ugzub1364…)
- "Barely started the video but the thing that confuses me, as someone who has only…" (ytc_UgxmEpMY-…)
- "Our emails for tickets are automated now. We have 3 minutes post call to read th…" (rdc_ofi40ua)
- "I’m pretty sure that the robot can’t see things that an experienced driver would…" (ytc_Ugy1VMBLG…)
- "probably we should revise human-ai relationships, not like human-human relations…" (rdc_l3mxp7d)
- "Sure, but won’t corporations argue that it’s a single AI agent that’s serving th…" (rdc_oh2p1i3)
Comment
You know. You could have saved me 16 minutes and 42 seconds by saying this entire video was a pro-AI ad for some bullshit called brilliant, that also lies to your 24.4 million subscribers about what machine learning is, what AI is, and where we are on AGI. Good news folks! We are nowhere near AGI because not a single ML variant is doing "self" directed learning, as they all require human input to do anything!
If folks really want that "oh no the AI we treated poorly is killing us" fix, go read "I Have No Mouth and I Must Scream" by Harlan Ellison (with context that Harlan Ellison had...issues is putting it lightly). Otherwise, this video is a waste of time, and y'all should be ashamed at how your content has been spouting lies to sell products.
youtube · 2025-08-25T00:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyArbs3U6BE08ZQzdh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwrhIEOydrR-kmyquN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwB-TGAOuXHHGCMhJ14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzl-c0agTiTOnMlFL14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgymVRa_aMwFL1Cuys94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyT3XLigDd-fRv6_O14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz8jJHl7_N9L_Fwqm54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzgBXMqA10X7vw7bO54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz5yHtHcqqedy_lCiN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2WQrbKk8YUjVY8FB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
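As a minimal sketch of the "look up by comment ID" step, the raw model response above can be parsed as JSON and indexed by the `id` field. The helper name `index_by_id` is hypothetical, not part of the tool shown; it only assumes the response shape is a JSON array of objects like the one above.

```python
import json

# A one-row excerpt of the raw LLM response shown above (same field names).
RAW_RESPONSE = """[
  {"id": "ytc_Ugz2WQrbKk8YUjVY8FB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""


def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and map each comment ID to its coded row."""
    return {row["id"]: row for row in json.loads(raw)}


codes = index_by_id(RAW_RESPONSE)
row = codes["ytc_Ugz2WQrbKk8YUjVY8FB4AaABAg"]
# row now holds the coded dimensions for that comment, e.g. row["emotion"]
```

A real inspector would also need to handle malformed model output (e.g. wrap `json.loads` in a try/except), since raw LLM responses are not guaranteed to be valid JSON.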