Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgyOAJdWk…: "Well said. Only some Nobel Prize winners seem to understand that. That speaks to…"
- ytc_UgwAF1Nur…: "The only problem with AI is that AI needs humans more than humans need AI. Witho…"
- ytc_Ugzj13IoX…: "I also see deep fakes as being closely related to fair use. Because when you thi…"
- ytc_Ugy-yPkrv…: "the sad part is, youre gonna be layed off and told you were replaced by Ai, but …"
- ytc_Ugzdz4atW…: "I realized how dangerous to humanity, AI will be 30 years ago. I started my care…"
- ytc_UgwwNyr24…: "Elon and I are going to establish a third party with the silence Majority around…"
- ytr_UgzS3qTNP…: "@DazieArtCounterpoint, at least for marketing specifically, is look at the backl…"
- ytc_Ugy43tr5v…: "If you think China has a well regulated AI framework in place, you are out of yo…"
Comment
> I dont think companies would just reach barely safer than humans and leave it at that because of the bodies that test car safety today which are created and sponsored by insurance companies who want to make more money. These companies will test and score self driving systems and rate them like they rate car crash safety.

Source: youtube · Posted: 2024-11-17T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugz_tlf1zqGDRHfwuat4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugx8sd7xjbFoEz1jXKF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw98rTG2dQPLRO0UDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw-CNCsS0PNg8bXRtp4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxIptcZX-37hlpoEq14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyUT6TZRyfObS_vE5N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxxlCAtvSsaTd6fUbd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwy7Q0VSL0Z4bNziTd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxcJ6ksDlOryapG9XV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz3XLbWukP0Q77v_jF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
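A response like the one above can be checked before the coded values are stored. The sketch below parses a raw LLM batch response and drops records whose dimensions fall outside the allowed categories; the category sets are inferred from the values visible on this page (an assumption; the actual codebook may define more options), and `validate_batch` is a hypothetical helper, not part of any pipeline shown here.

```python
import json

# Category sets inferred from values seen in this view (assumption: the
# real codebook may include additional options).
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and drawn from its allowed set.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"approval"}]')
print(len(validate_batch(raw)))  # 1
```

Rejected records can then be re-queued for a second coding pass rather than silently stored with invalid labels.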