Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgwIZrGwu…: Nevermind AI. I thought, for a moment, that YUDKOWSKY would kill WOLFRAM, and al…
- ytc_UgySZNOHj…: Yes, they are....I took a couple of rides in a waymo, months ago and it was ver…
- ytc_Ugyhzf1IP…: So Waymo has 300 taxis in SF and 60-70 service workers on shift locally, plus mo…
- ytc_UgyOxEIKs…: I've seen current tech and robotics and most of them are modeled after humans, i…
- ytc_UgyxresGd…: Nice job, Uber, you've finally managed to kill someone all by yourselves. May y…
- ytc_UgxQG-Mz4…: I'm always hearing about AI, but the thing with AI is all it can do is repeat in…
- ytc_UgwgSp_er…: If you wouldn't use chatgpt for brain surgery than you shouldn't use it to keep …
- ytc_UgwANKf9f…: Autopilot shouldn't be allowed in any vehicle….this includes all of those driver…
Comment
I hear you, Jimmy. When shopping, I always go through the checkstand rather than those *&^%$#@! self-checkout stands, simply because I know that the latter exist to only replace PEOPLE and increase PROFITS for the greedy company. That said, I do think that autonomous driving is inevitable. Search YouTube for Tesla's "Autopilot Full Self-Driving Hardware" and you'll see where we're headed. Yes, I know that "West World" seems like a frightening nightmare, but honestly: I'd trust a computer behind the wheel more than the idiots we currently have on our highways, neanderthals so thick they keep buying cars that burn fossil fuels! Doh-oh!
youtube · AI Jobs · 2016-12-26T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UghN_HAXF5SN9HgCoAEC","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjNHF79_h2T93gCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugj2e77TqEpuS3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjAr7A9MQNMCngCoAEC","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UggI9jnKgh1QOXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggKJ8TszaQus3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjpF7-FVyrAgHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj7JxjogguqIHgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjUeNrg9ENO_XgCoAEC","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgjR7oYWIaxbU3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
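A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the model returns a JSON array of objects with the four dimensions shown; the allowed category values are inferred from this one sample, not from the project's full codebook, so unseen values are flagged rather than rejected.

```python
import json

# Category values observed in the sample response above; the real codebook
# may include more (assumption, not a definitive schema).
OBSERVED = {
    "responsibility": {"company", "ai_itself", "government", "distributed", "none", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"resignation", "indifference", "fear", "outrage", "approval", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> list:
    """Parse one raw LLM response into a list of coded records.

    Raises ValueError if the payload is not a JSON array of objects,
    or if a record is missing its id or one of the four dimensions.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            raise ValueError("record missing comment id")
        for dim, allowed in OBSERVED.items():
            value = rec.get(dim)
            if value is None:
                raise ValueError(f"{rec['id']}: missing dimension {dim!r}")
            if value not in allowed:
                # Unseen value: warn instead of failing, since the full
                # codebook may be larger than this sample.
                print(f"warning: {rec['id']}: unseen {dim} value {value!r}")
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
coded = parse_raw_response(raw)
print(coded[0]["responsibility"])  # → company
```

A malformed payload (for instance an array closed with `)` instead of `]`, as sometimes appears in raw model output) fails inside `json.loads`, which is exactly where such a record should be caught and re-queued rather than silently coded as "unclear".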