Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm really going to dislike everything AI. This is absolutely unwanted. Google i…" (ytc_UgwaZl8tx…)
- "Is Ai a byproduct of humans themselves- something smart enough to think for itse…" (ytc_Ugw4jKbb4…)
- "@19:11-.-Okay, as I said on another comment, I'm assuming you're going to elabor…" (ytc_Ugw9_hKgf…)
- "1:03:50 onward: I'm sorry Master Hinton, but that might be highly confusing for …" (ytc_UgxQl3Qd1…)
- "Good, we won’t have jobs, therefore we won’t have money to buy the goods and ser…" (ytc_Ugxs5lsJK…)
- "Enjoyed this episode, in particular, looking at the biases of different AI syste…" (ytc_UgzxJtjg9…)
- "Anthropic is way better at \"safety\" than Musk's Grok and no worse than OpenAI. T…" (ytc_UgysDTfOE…)
- "AI can do everything. Except it can’t deliver packages. It can’t Buy packages.…" (ytc_UgzoovAkm…)
Comment
Factors I think you missed:
- some riders may share vehicles, especially if discounted.
- once self driving cars become more prevalent there will likely be much less traffic, which can be responsible for much of the driving time in dense cities.
- also, self driving cars will probably be able to drive faster in some instances.
- the difference between trains and individual vehicles is always going to be a trade off.
- You mentioned the cost of driving with a self driving vehicle versus owning one’s own car, once the self driving cars become ubiquitous the price will decrease substantial making owning a car more expensive.
Source: youtube · 2025-06-24T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzRaZobZ90HdH8tHhl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyNdu5z0-knTwdJbhN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQevq-cRUTi8AAiHt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwfVpdQKxakicc_mqp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyCLtr2HXzJBIrllRh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwNWLDoJYwqSqBGVE54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzItZgV7-8ALx1CpUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzAGhhfqPj-CpM6-HF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy3RtIM5OI_qLNpu-Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwlb85Lejf73FstI7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
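Since the raw response is a JSON array of coded rows, the "look up by comment ID" step amounts to parsing the array and indexing it by the `id` field. A minimal sketch in Python, assuming the model output is held in a string (`raw_response` is a hypothetical variable; the two rows and their field names are copied from the sample response above):

```python
import json

# Hypothetical variable holding the raw model output; the rows below are
# copied verbatim from the sample response.
raw_response = """
[
  {"id": "ytc_UgzRaZobZ90HdH8tHhl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyNdu5z0-knTwdJbhN4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

row = coded["ytc_UgzRaZobZ90HdH8tHhl4AaABAg"]
print(row["reasoning"], row["emotion"])  # consequentialist approval
```

In practice the parsed rows would be joined back to the comment table on `id`, which is how the per-comment "Coding Result" view above is produced.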