Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- you would know what I'm talking about if you say the rick and morty robot… (ytr_UgjW7NZxn…)
- If you understand how AI works... It is pulling information from vast Databases … (ytc_UgwnTHuyq…)
- Most losers in AI futures are Indians, as they work in tech companies and are th… (ytc_UgxEHTaom…)
- My guess is either: A) Interview incentivizes bullshitting and lying (e.g., no … (rdc_oae9uzh)
- Thanks. All the best. God bless you and your family too..be healthy. Be strong … (ytc_UgyXKk5wc…)
- The issue is that people are putting art into the Ai and the Ai steals parts of … (ytr_UgxOQSTn2…)
- Navjot Singh Exactly. What I am saying is, the argument regarding AI replacing r… (ytr_UgyoPvNlw…)
- W O W...this modern ai world 🌎 jabber jabber jabbering; not having a human that … (ytc_UgxgJCnGx…)
Comment
Worked for GM in the late 2010s, they kept telling us they'd have full self driving cars ready for purchase by 2019. Everyone thought I was a fool when I kept telling them there was no possible way us or anyone would have them other than in very limited and restricted areas, and definitely not for consumer purchase anytime close to 2019.
They would literally laugh at me and tell me I'm and tell me how close we were. I'm like look at the current level of consumer tech theres no possible way. Your phone can't even do what it's supposed to without glitches at times. A self driving car glitch means someone dies.
Until/unless we have actual AI that can think at least as good as a person and react to real world situations as good as a person, we will never have it. We'll get 99% of the way, but that last 1% is a huge deal. Even if they're safer then people driven cars, people will not tolerate a robot causing deaths. They need to be 100% reliable.
youtube
2025-07-30T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgweanYgi1cMkl3kXXd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8s8wfF715qAxELwV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwSYprJ1vU9YDlWsad4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzaK1g-0gLPm_aP8iB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLYWZOn5p4Rb5Gn954AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz_DZ-PxXbQw6TxiOF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxbApOyaC1noq2hm-x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVWVL5uU1qX6k0GTZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy2qhCzYuHBIlpHYkV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4sh8eeWAbooWilSl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
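A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the codes visible in this page's samples (`responsibility`, `reasoning`, `policy`, `emotion`) and may be incomplete for the full codebook.

```python
import json

# Allowed values per dimension, inferred from the samples shown here
# (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"none", "company", "government", "user"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "mixed"},
}


def validate_coding(raw: str) -> list[dict]:
    """Parse the model's JSON array and check every coded dimension.

    Raises ValueError on a value outside the inferred codebook, so a
    malformed or hallucinated code fails loudly instead of being stored.
    """
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim!r} value {value!r}"
                )
    return rows


# Usage with one entry shaped like the response above (hypothetical id):
raw = (
    '[{"id":"ytc_example","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]'
)
rows = validate_coding(raw)
```

A validator like this is cheap insurance when coding at scale, since LLM output occasionally drifts outside the prompt's enumerated labels.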