Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "elon musk is doing this silence thing on interviews ALL THE TIME - whatever he s…" — ytc_UgxgOjkSt…
- "He isn't an artist, he is a curator. The ai is a machine that spits out random i…" — ytc_UgzcC3Edw…
- "Judd offers the only hope we have of surviving the AI risk, which is to invest g…" — ytc_Ugw0ciiLP…
- "Awesome video you've definitely got yourself a new subscriber. I agree with so m…" — ytc_UgwN3u-zT…
- "this is like the least interesting problem with AI, there are so many more impor…" — ytc_UgyyLR7Ct…
- "I’m not the only one who treats ai like it may one day be sentient? Good.…" — ytc_UgyyTC-HT…
- "@realdavestrider bruh... \"ai robots being made to sit in the back of the bus\" Th…" — ytr_UgwmaBYnq…
- "Writing code has consequences. Computers and AI can’t take responsibility for th…" — ytc_UgxzFpFjz…
Comment
Would you consider that the robot troops depicted in the Star Wars prequels were just advanced calculators or did these fictitious characters exhibited consciousness?
And what's the difference between a robot machine is less aware than a bio-mechanical "robots" as depicted in the blade runner movies?
What about a combination of the two.. or like in some sci-fi stories where biological and synthetic systems are combined?
youtube
AI Governance
2025-07-16T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugxe3i3-I84L1v6WkTt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyNe47VIeo0z32NiuJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw-5BCPmTL8DWyYNYp4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz8He1xAsHAo7lJgxp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyK13VTVE6tRJwxfJ54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyE5wX3cFfw6_X6IL94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgyZlF8dluKheFFNkeh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyeJLaFMWyhNas_XaR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugzz8WYjxBOPN_6YBIZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxWXmn27NBJ1sAWi5p4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
```
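The lookup-by-comment-ID described at the top of this page can be sketched in Python, assuming the raw LLM response is a JSON array in the shape shown above. The function name `lookup_coding` and the two-entry sample payload are illustrative, not part of the tool:

```python
import json

# Hypothetical sample payload in the shape of the raw response above;
# the real tool would pass in the full JSON array emitted by the model.
raw_response = '''[
  {"id": "ytc_Ugxe3i3-I84L1v6WkTt4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyNe47VIeo0z32NiuJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw coding response and return the dict for one comment ID.

    Returns None if the ID is not present in the response.
    """
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgyNe47VIeo0z32NiuJ4AaABAg")
print(coding["responsibility"])  # developer
```

Each dimension (responsibility, reasoning, policy, emotion) is then just a key on the returned dict, which is how the Coding Result table above can be rendered from the raw output.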