Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Please hardwire into the core program the 3 laws of robotics 1) a robot may not…" (ytc_UgykTtgu3…)
- "Rn ai is just a gimmick for art, it doesn't allow you to create your own thing l…" (ytc_UgwVwjkVX…)
- "I like everyone who talks to them seems so uncomfortable. I welcome our robot ov…" (ytc_Ugwh0JNP-…)
- "So when are these things going after real crimanals-Bankers, news papers that tr…" (ytc_Ugzg1Nt4X…)
- "Why would you ban AI that is able to satisfy fantasy of so many people? Why woul…" (ytc_UgyGa0de8…)
- "People worried about AI taking jobs have the wrong mindset. If companies can mak…" (ytc_Ugzmmtgax…)
- "Real time projects are different,they are dynamic,robust and lot of thinking is …" (ytc_UgzoAzsXl…)
- "I mean why can't we spend our time enriching our lives and cultures instead of w…" (ytc_UgxCCyp9h…)
Comment
It is still rather impressive for a new technology. But it does still feel like owners are being used as the AI trainers for the system. And for myself, while it still isn't as good as a competent human driver, I don't see the point, as the humans still have to have oversight, and humans are TERRIBLE monitors. We get bored and distracted, especially on long dull highway trips. Autopilot in planes is an example where 90% of the time it does actually relieve workload from the pilots, following detailed navigation, but there are radio beacons and GPS and detail for airways where it works well. Cars have almost none of those things on a regular unmodified road. The autopilot and navigation rely almost completely on GPS and poorly updated maps.
Source: youtube, 2025-10-07T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyvioQAt8s4wERK0kd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyNu2TWxXgcLx90Xtl4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw8AoPGsOFJkA37_Pl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwqINwrswsEg2uSZRt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgybwvuZCVMWkAGNTTx4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgxtA2JWB0rveASe3f94AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugyq8TKlHe9hI5bpUD14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxJX70E4IK9jRHAvq14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzJGGodmUl_6HJjUKF4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy_81W7O44o18eDinh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"}
]
```
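A raw response like this can be parsed and sanity-checked before the values are written into the coding table. The sketch below is a minimal example, assuming the response is always a JSON array of objects with exactly these four dimension fields; the allowed value sets are inferred only from the values visible in this response, and the real codebook may define more categories.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: derived from the values
# seen in the sample response above, not from the full codebook.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "liability", "industry_self", "regulate", "none"},
    "emotion": {"indifference", "outrage", "resignation", "mixed", "fear", "approval"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and reject out-of-vocabulary values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows
```

Validating up front catches the common failure mode where the model invents a label outside the codebook, so the error surfaces at ingestion rather than as a mystery category in the results table.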