Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Strong arm the Dept of War? Yea ok. You don't need it or want it? That's why y…" (`ytc_UgwaqwYPm…`)
- "I got emotional watching this. WTAF are we doing here?!😳😖 I'm a sculptor in the…" (`ytc_UgzG4BTlm…`)
- "Humans are mostly driven by selfish genes which makes us cantish, so bring on AI…" (`ytc_Ugxv6EGF3…`)
- "Why do you let the car drive on surface streets? Don't be so lazy. I think the…" (`ytc_Ugwi94Ww1…`)
- "https://climateactiontracker.org/countries/usa/ Will recommend checking for cou…" (`rdc_gtdnajk`)
- "I love how he mentions that they can't make something better than the ai anyway,…" (`ytc_UgygzV5Z0…`)
- "As a machinist I have thought about getting into programming but within the next…" (`ytc_UgyEYhi-T…`)
- "And how do we know this is AI?? These cars can be remotely driven by Waymo staff…" (`ytc_UgxIO-Cxa…`)
Comment
If the goal is purely safe transportation from point A to point B, be it freight or people, eliminating the driver from the equation is clearly the way to go. The Phoenix-Tucson run has been running driverless semis for years. Waymo taxis have been shuttling people around Phoenix-Mesa area for years.
I’m surprised people aren’t clamoring for the return of men to the coal mines, where automated huge machines have long since replaced most men.
And what about when the Star Trek-esque transporter beam eliminates the need for trucks, trains, and warehouses altogether?
youtube · AI Jobs · 2025-12-05T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwWFL9S3sOGzPmL56p4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwDH4ZCO5zfWwgOnP14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwr4vSCHxpOAFuRqD14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx4tVDh7nWWkME7GbZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzqVF-d46WYuBFLj7Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwU2AOWwBvBITkIduN4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz3hSXo1kOUX0yUnG54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzuwrTJ2emPnYiI37R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxrg8l-0vUc-x5QGIh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxtohdZlIvp50aK6Y54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
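A raw response like the one above has to be parsed and sanity-checked before the per-comment codes can be stored. The sketch below shows one minimal way to do that in Python. Note the allowed values for each dimension are inferred only from the codes visible on this page; the actual codebook may define additional values, and the function name `parse_llm_response` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page (assumption: the real codebook may define more).
ALLOWED = {
    "responsibility": {"none", "unclear", "ai_itself", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"fear", "indifference", "approval", "outrage", "mixed"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-codebook values."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}]'
print(parse_llm_response(raw)[0]["emotion"])  # → mixed
```

Rejecting the whole batch on the first bad value keeps partially coded batches out of the store; a gentler policy would collect the errors and re-prompt only the offending records.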