Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I don’t buy it. People thought computers were going to end all jobs, people thou…" — ytc_UgzJGnxpY…
- "Elon's greatest fear for the future is A.I. he stated in the Rogan interview. …" — ytc_UgzbywtiT…
- "Until AI is proven to be legitimate, then start investing on the competition. I …" — ytc_UgwVD1fwf…
- "Maestro University is offering a wide variety of degrees with AI pathways. Inter…" — ytc_UgzDOnJoM…
- "People trying to defend AI art is gonna tell is the same as the cameras because …" — ytc_Ugz_nZBEK…
- "To be fair Lavender, you really shouldn't put things in your thumbnail and expec…" — ytc_UgzdO_9mh…
- "Customer service applications? None of this would have started had it not been f…" — ytc_Ugz43h9oE…
- "Another friendly reminder that Section 174, interest rates, and corporate tax cu…" — rdc_oacpxpj
Comment
I haven't even watched the video yet, but oh my god, who thought "we should let these 80,000 lb death machines ride around with no human intervention on the road with other humans while carrying massive loads of cargo that could potentially come lose"
AI is NOT to the point of self driving yet, it wouldn't even be legal if it wasn't lobbied into our government already- there are so many accidents with Automated cars even BEFORE adding 16x more deadly vehicles onto the road.
Kids are on these roads, our elders, single moms, a hardworking family man, a road tripping family, a young couple, automobile accidents are already one of the leading causes of death in America, we don't need MORE death
youtube · AI Jobs · 2025-09-24T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
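The coding-result table above can be rendered directly from one coding record. A minimal sketch (the `render_coding_table` helper is illustrative; only the dimension names and values come from the record schema shown in this page):

```python
def render_coding_table(record: dict, coded_at: str) -> str:
    """Render one coding record as the two-column markdown table shown above."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)

# Example record, using the values coded for the comment above.
record = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "outrage"}
print(render_coding_table(record, "2026-04-26T23:09:12.988011"))
```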
Raw LLM Response
```json
[
{"id":"ytc_UgyBXd0Hvx0Dqy8mwHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy1HFeWBedwd_kVYkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzeioZji-bI6KxJ77B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzyO1qDPA9d_iJtsIt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy615q30ipQsE1IxP94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzesVC8KgFvpsRRO0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwhfnWKMRsdIAXFFXV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz8uDBh6XmGJ7Hlb114AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwt2VqYlncrNssZDfJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyxyew5NxFPgMaH5ld4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```