Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
This might be a dangerous road for human drivers - but that seems to be mostly down to people going too fast and over-estimating themselves. I don't think this road is particularly taxing for self-driving cars though. They usually don't feel tempted to go too fast.
I think the big challenges for self-driving cars lie elsewhere entirely - in weird intersections, bad or missing street markings, confusing lane changes, bad behavior of other drivers, and visually challenging conditions (think night driving in heavy snow fall, or something). Things like odd angled crossings of more than just two roads - or unmarked pedestrian crossings - or nearing a crash-site on a highway, etc.
youtube
2022-07-07T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxdP0EGbd2Wl3jPYSd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgycVT7Ucpf_4N3qteN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwaAsvJZU34Y4K5Pih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxcTz4z5IvaS6M7qOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
 {"id":"ytc_Ugw1OnYwOnNzasPCFD94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxKA6rKlMNK4GWNGe94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxSl1-MCkydfjPmphd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzE8KaJoBTqz84QerB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgyCfhi0vpWVIrcAakt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
 {"id":"ytc_UgxVteF9r2N6NjwU7GF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
```
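A minimal sketch of how a downstream script might parse a raw response like the one above and index each coding record by comment ID. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown; the `index_by_id` helper and the shortened sample string are hypothetical illustrations, not part of the actual pipeline.

```python
import json

# The raw model output is a JSON array with one object per coded comment.
# This sample is abbreviated to a single record for illustration.
raw = ('[{"id":"ytc_UgycVT7Ucpf_4N3qteN4AaABAg",'
      '"responsibility":"none","reasoning":"consequentialist",'
      '"policy":"none","emotion":"indifference"}]')

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse the raw LLM response and map comment ID -> coding record,
    defaulting any missing dimension to "unclear"."""
    records = json.loads(raw_response)
    return {
        r["id"]: {d: r.get(d, "unclear") for d in DIMENSIONS}
        for r in records
    }

coded = index_by_id(raw)
print(coded["ytc_UgycVT7Ucpf_4N3qteN4AaABAg"]["emotion"])  # indifference
```

Indexing by comment ID is what makes the "look up by comment ID" view possible: each inspected comment can be joined back to its row in the batch response.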