Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Yeah the problem with self-driving vehicles isn't the bugs or lack of parts that can ONLY be provided to you by Tesla.
It's the other drivers. Something the A.I. can't do squat about. A self driving program should always include a safe protocol procedure.
Being it deploying airbags before the impact or blow the other car out the road. . . which would be fun.
But yeah there's still too many red flags with Tesla products, i wouldn't get involved until i see two or three Tesla brand mechanic shops? Tech shops? whatever.
Until i see those being stream lined, i wont get a microwave with wheels.
Source: youtube · Posted: 2022-07-01T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxdP0EGbd2Wl3jPYSd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgycVT7Ucpf_4N3qteN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwaAsvJZU34Y4K5Pih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxcTz4z5IvaS6M7qOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
 {"id":"ytc_Ugw1OnYwOnNzasPCFD94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxKA6rKlMNK4GWNGe94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxSl1-MCkydfjPmphd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzE8KaJoBTqz84QerB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgyCfhi0vpWVIrcAakt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
 {"id":"ytc_UgxVteF9r2N6NjwU7GF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
```
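A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, not the tool's actual pipeline; the allowed value sets are inferred from the records shown on this page rather than from a documented schema, so treat them as an assumption.

```python
import json

# Allowed values per coding dimension — inferred from the sample
# responses above, NOT an authoritative schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "skepticism"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record's fields."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: unexpected {dim!r} value {rec.get(dim)!r}"
                )
    return records

# One record from the raw response above, used as a smoke test.
raw = (
    '[{"id":"ytc_UgxcTz4z5IvaS6M7qOt4AaABAg",'
    '"responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"mixed"}]'
)
codings = validate_codings(raw)
print(codings[0]["emotion"])  # mixed
```

Validating against a closed vocabulary catches the most common LLM failure mode here: a response that is well-formed JSON but uses an off-schema label (e.g. "anger" instead of "outrage").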