Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugw60gXSi… — "So would be a point where machines will not need any human around. Game over.…"
- ytc_UgxIXKRSd… — "This goes to show, do not believe every thing AI tells you. Moreover, if someone…"
- ytc_UgzCspToO… — "Regardless of personal opinions on AI, the NYT losing this case would be best fo…"
- ytc_UgxosiCSv… — "See how that robot didn't have any motion effect by the gun .... Wow!!! We're do…"
- ytc_UgwV_P2Wg… — "Haha, fair enough! I'll be here when the 7 days are up—ready and waiting. Stay a…"
- ytc_Ugw7mJzLQ… — "I just saw a vid of students from Purdue University, who programed a machine to …"
- ytc_UgxY7V_Vi… — "No, you don't get it. I am SPECIFICALLY trying to get AI models to remember EVERY…"
- ytc_UgxHy080s… — "F*ck all this bullsh*t talk about "AI making our lives easier" ... life is suppo…"
Comment
I wrote a post about the dangers of not having a person in the driver's seat of a driverless truck on my Facebook page. I do see how technology can help us improve working conditions, but we need to understand that computers are not without flaws. I drive a truck that has all kinds of sensors and even controls some parts of the truck. They say that it is for safety. When everything works, it is nice. However, when those sensors mess up, there is bad weather, or the computer sees "ghosts" and panics, it is far from safe and I have to force the truck to ignore the computer. From phantom braking to refusing to move, these trucks are far from perfect. We must keep drivers in the driver's seat to take over when the computer messes up.
youtube · AI Jobs · 2025-11-29T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxqf0Yla238p0TlgiB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwMDW7SmT8jnd2CXgB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwSEUWzHAkLWpL3MUh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwIxUzn1PGLO1hAK2h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzChMGiGREm5a8vhPp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxyL8QN_slxWCw68-R4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4-qJJdqfF4rMp8rF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzlXmsJsLkMvma5nSB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz76K58GwqO2u9zpGZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyb5-9gvVHcX1i1FR94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
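A raw response like the one above can be parsed into a lookup table keyed by comment ID, with each record validated against the coding dimensions. The sketch below is a minimal example; the allowed value sets are inferred only from the codes visible on this page (the real codebook may define more categories), and the sample record is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the samples on this page.
# Assumption: the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, validating values."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-record response, mirroring the coding result above.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
coded = parse_response(raw)
print(coded["ytc_example"]["emotion"])  # -> fear
```

Validating up front catches the common failure mode where the model invents an off-codebook label, so a bad batch is rejected before it reaches the coded-results table.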