Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Klarna is god awful I had a store card app that they bought over. Everything jus…" (`rdc_m2am559`)
- "AI is NOT ready for use. Im flat out refusing to engage with it in any way. Wh…" (`ytc_UgyLfH0IG…`)
- "Of course, AI reads minds, intentions, history, and that of surrounding folks. C…" (`ytc_Ugx-4CVgL…`)
- "Musk's predictions are always way off. Musk predicted Tesla should have self-dri…" (`ytc_UgyYz-bIb…`)
- "This would be a great time for anyone to binge watch the series "Person of Inter…" (`rdc_icgcj3r`)
- "I don’t mind AI for things like horror and sci-fi but not when they try and make…" (`ytc_UgxB6Wktq…`)
- "Looking at this podcast, and thinking about Geoffrey's responses I realised, he …" (`ytc_UgxMTIVX_…`)
- "Companies will go bankrupt...that AI just accepted a return for a defective prod…" (`ytc_Ugw_YlZIh…`)
Comment
I think you made a mistake in not showing the accident location.
Elsewhere on the internet the pictures show that the location fools people thinking it was a proper place to cross the street. There was a brick walkway that gave the appearance that this was a valid crossing point. Why that walkway is there is not clear.
Bad infrastructure will mess drivers up regardless if they are human or robot.
Source: youtube
Posted: 2018-03-21T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyBwr53QSrFwsgZ6Fx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzFdYKkc3EPrgIw4QB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxyqDNXW822gUmjenp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwILlhd8deTJLVCDhV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzyAX9tJXO7m_YMoOZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwmTB7eP-BaIKS3dZl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyv__Up7HC5I2xbfa14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdjWT7OaG50E2OsJR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw0hEIGcJirB1lT3Ut4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw92ZW-Q11YB0JOI9Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
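A raw response like the one above has to be parsed and checked before its codes are trusted. The following is a minimal sketch of that step, assuming the allowed values per dimension are the ones visible in this sample (the real codebook may include values not seen here):

```python
import json

# Dimension names and allowed values inferred from the sample response above;
# the actual codebook likely defines more values than appear in this one batch.
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "approval"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if the JSON is malformed, a record lacks an id or a
    dimension, or a coded value falls outside the inferred codebook.
    """
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Indexing by comment ID is what makes the "look up by comment ID" view cheap: once validated, `coded["ytc_…"]` returns the four coded dimensions for that comment directly.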