Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI doesn't have emotions and personal experiences humans have to create art. You…
ytr_UgxqBte6v…
The issue though is AI in its current llm forms Is not profitable. So it really …
ytc_Ugwa9sHAi…
Robots, AI, all of it. Maybe the government should start thinking of the end res…
ytc_UgzGr_El4…
I used AI to create AI solvable problems. Then once management was happy that I …
rdc_ofhtufc
6:07 atp it sounds like the believer is just regurgitating the Athiests respons…
ytc_Ugw0Vc_3J…
AI will kill us all due to human nature. Humans will push and push to be more po…
ytc_UgwylY94m…
Thats a stupid comparison. Horses were never the intelligent beings replaced by …
ytr_UgyrIkmrC…
Put a law in place RIGHT NOW! 60% of everything AI generates goes to support hum…
ytc_UgzlrZVNk…
Comment
These self driving cars are an absurd idea. They're not cost-efficient and won't reduce road use by much. Public transport just works far better in cities where it's properly implemented. The biggest problem is the resistance to publicly funded infrastructure projects in the US, and also resistance to the idea of public transport, as most Americans have never experienced that level of convenience and efficiency. It's automakers who have created car dependence in the US right from the early stages of car ownership. These self driving cars are simply a further development of that. Ultimately, automakers have convinced Americans to give over a huge proportion of their paycheck for the pleasure of sitting in traffic on ridiculously long commutes. Self driving cars will only contribute to this problem.
youtube
2025-08-01T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzvuPIG05YX4saxr3p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzAM2iKt9ticue1hU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-JNiN8G6hCBqhKqZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxiFl9R2z6f9u_agrx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy87XmIswCusq4NmIt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPT9vYi1yowEYNnW54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlMTGzSQfdvussC3F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwIyGq_w0DLtJFZISB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRLXl0f38uWeHuJjJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyj0VG4M_o8o2Oe8dR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
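A raw batch response like the one above can be parsed and indexed by comment ID for the lookup described at the top of the page. Below is a minimal sketch in Python; the allowed values for each dimension are inferred only from the samples shown here (the real codebook may include more categories), and the function name `parse_batch` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the visible samples
# (assumption: the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "government", "developer"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"approval", "indifference", "outrage", "mixed",
                "resignation", "fear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Rejects records whose dimension values fall outside the inferred
    codebook, so malformed model output fails loudly instead of being
    silently stored.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim!r} value {rec.get(dim)!r}"
                )
        # Store everything except the ID itself under the ID key.
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

raw = '''[
 {"id":"ytc_UgzvuPIG05YX4saxr3p4AaABAg","responsibility":"none",
  "reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''
coded = parse_batch(raw)
```

A lookup by comment ID is then just `coded["ytc_UgzvuPIG05YX4saxr3p4AaABAg"]`, which returns the four coded dimensions for that comment.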