Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @0:45 "The only connection you have with an AI is the connection to your own stu… (ytc_Ugz6ehFIh…)
- In case they don't mention it in the report: He said, though I can't remember wh… (ytc_UgxUPOjNO…)
- The way I see it, you are actually helping AI scrappers.... We are in an era of … (ytc_UgzNy-hfZ…)
- [translated from Hindi] AI is a massive tool, just like electricity, computers, and the internet were before it. They… (ytc_UgxONS00Z…)
- i don't know how they can't see how ridiculous the argument they're making is. s… (ytc_Ugwxeld5b…)
- I cannot stress enough how much I hate this. I cannot stand having to deal with … (rdc_m5lzn2b)
- You say that like we want an AI revolution. The only people who want this are te… (ytr_Ugy7-tO-Y…)
- sam there is a website called civitai and they still have loads of ai models of … (ytc_Ugz3a4jru…)
Comment
The staged 2016 demo mentioned at [05:42] is the ultimate example of how "Silicon Valley marketing" can outpace actual engineering reality. While the world focuses on the legal fallout, I’ve been analyzing a deeper trend on WORLDONIFY: how the transition from multi-sensor redundancy to 'Vision-only' systems is a cost-cutting gamble that shifts the risk directly onto the consumer. It’s a fascinating look at the 'Move Fast and Break Things' era hitting a hard legal wall. If you want to see how these tech-governance battles are reshaping global safety standards, you’re looking at the epicenter right here.
Source: youtube | Posted: 2025-12-25T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
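The four coding dimensions above draw from small closed vocabularies. As a minimal sketch, the values observed on this page can be used to sanity-check a record before accepting it; note the vocabularies below are inferred only from codes visible here, so the full codebook may contain additional values.

```python
# Dimension vocabularies inferred from codes visible on this page;
# the actual codebook may define more values (an assumption).
ALLOWED = {
    "responsibility": {"company", "none", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "fear"},
}

def validate_code(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the known vocabulary."""
    return [
        (dim, record.get(dim))
        for dim, vocab in ALLOWED.items()
        if record.get(dim) not in vocab
    ]

# A record taken verbatim from the raw response shown below the table.
sample = {
    "id": "ytc_UgzRT3nNKg4hv1MHv6x4AaABAg",
    "responsibility": "company",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "outrage",
}
print(validate_code(sample))  # [] means every dimension is in-vocabulary
```

An empty list means the record is usable as-is; anything else flags a dimension the model coded outside the expected scheme.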
Raw LLM Response
[
{"id":"ytc_Ugxikh5LUCycf_vzKIh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoCiAf1UM33UbttvJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyyRryq-o4ScH2C1l14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRT3nNKg4hv1MHv6x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx5mjgQrgr-SAeERg14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx3AeyWPxSwA_A0IRl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwRVSlfLRlKVHH4_MF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy4h8xzj1CQ-HuoVfh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPTj2HnjfBtKpWpjV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxJTAAVe5ZP_TRsXut4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
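The "look up by comment ID" view above can be sketched as a small parse-and-index step over a raw response like this one. The function name and the dictionary shape are illustrative assumptions, not the tool's actual implementation; the two record IDs are copied from the response array above.

```python
import json

# A fragment of a raw LLM response: a JSON array of per-comment codes
# across the four dimensions (responsibility, reasoning, policy, emotion).
raw_response = """
[
  {"id": "ytc_UgzRT3nNKg4hv1MHv6x4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx3AeyWPxSwA_A0IRl4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and index each record's codes by its comment ID."""
    records = json.loads(response_text)
    return {
        rec["id"]: {k: v for k, v in rec.items() if k != "id"}
        for rec in records
    }

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgzRT3nNKg4hv1MHv6x4AaABAg"]["policy"])  # regulate
```

With the codes keyed by ID, rendering a "Coding Result" table for any inspected comment is a single dictionary lookup rather than a scan over the whole array.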