Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgzSB9dqr…: "Yeah good luck hating on ai, atleast learn from past you know what happened to B…"
- ytc_Ugzjv6s5A…: "For being smart these idiots are creating the worst enemy of our future. Serious…"
- ytr_UggAeEyGO… (tetrapack24): "Humans (or any other logical being) doesn't have to create such thin…"
- ytc_Ugz9eHjSF…: "Self driving cars will be very helpful, and will do wonders for peoples drivi…"
- ytc_Ugx8Kcl0b…: "That's why you make you follow the law that says three to six car lengths behind…"
- ytc_UgzY9g0k9…: "With zero minutes thought — you easily discover that your AI is a meaning from …"
- ytc_UgzA-gebz…: "That, and also that large language models reflect back the input we (as a societ…"
- ytc_UgymQejqP…: "We don't know how to make AI safe...because we don't know how to make human's sa…"
Comment

> There is no need for these things, they are super stupid! They won't reduce accidents unless it is in a closed system with only self driving vehicles that talk to each other, with no animals, no people walking, no bikes etc. Meanwhile they will reduce jobs, and in the end if you don't want to drive your car...then take the transit!. Sooner governments clue in to this and not allow these things the better.

Source: youtube · AI Harm Incident · 2018-03-23T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyBCRn2IMzSnm26o6t4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw_YLyqmyO8jrFbq3t4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxTNDuFrvpXaO2J7K54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyT9erG4D0vG72BgEl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzE0Q-sD4etuwQMMod4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxJYz7fniZehiT1Klp4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwYDdnL4FjzfKW8QDp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzfEAy_t5LjQcK1nB14AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyVpmKUOg-LEZ1EzMt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyCnkacnYHmMddOB454AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
```
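A raw response like the one above can be parsed and checked before the codes are stored. Below is a minimal Python sketch; the vocabulary sets are inferred from the values visible on this page (an assumption, not the authoritative codebook), and `parse_coding_response` is an illustrative helper name, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the labels visible
# in this dashboard -- an assumption, not the authoritative codebook.
VOCAB = {
    "responsibility": {"company", "government", "developer", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Raises ValueError if a value falls outside the known vocabulary,
    and KeyError if an expected dimension is missing from a row.
    """
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        codes = {}
        for dim, allowed in VOCAB.items():
            value = row[dim]
            if value not in allowed:
                raise ValueError(
                    f"{comment_id}: {dim}={value!r} not in vocabulary")
            codes[dim] = value
        coded[comment_id] = codes
    return coded

# Example on one row of the response shown above.
raw = ('[{"id": "ytc_Ugw_YLyqmyO8jrFbq3t4AaABAg",'
       ' "responsibility": "government", "reasoning": "consequentialist",'
       ' "policy": "ban", "emotion": "outrage"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugw_YLyqmyO8jrFbq3t4AaABAg"]["policy"])  # prints "ban"
```

Validating against a fixed vocabulary is what makes a non-conforming model output (a misspelled label, an invented category) fail loudly at ingest time instead of silently skewing the coded dataset.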