Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
What if the false pretense had something to do with the support they'd be given …
rdc_cjooja7
I've been trying to tell folks. This shit was all AI come on who is gonna record…
ytc_UgzxJn7pg…
Bad news for France and Germany. Now they have no chance in winning the AI race.…
ytc_UgxsVg0_5…
No code and low code has been around for a long time. I use these tools every da…
rdc_jab2u24
How about the economy is being badly managed, costs too much to employ people, …
ytc_UgzGF9E5H…
They view AI as efficiency because efficiency brings currency efficiently, thus…
ytc_UgzKcgRPD…
In my experience ai is pretty good when you feed it bite sized instructions, gen…
ytc_Ugwdwsd0R…
Can self driving cars be more dangerous than drivers with cell phones?
Wanna g…
rdc_dmp43kq
Comment
Truck dispatchers tell drivers to do dangerous crap every day, and their drivers as professionals have to tell them “no.”
A lot of people are going to get hurt when somebody in a warm office 2000 miles away can send a truck over an icy mountain pass at the push of a button. An autonomous truck doesn’t fear for its life or career. It does exactly what the company tells it to do, and that’s a problem.
youtube · AI Jobs · 2025-05-29T03:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyeFZg1pyYyGH0b0wd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzcfL4AEG978upHNj54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw08YXEkacYADtXa514AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwu5aVHITgnPY6X4Ep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-Rdno-6gaVeVBDzx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw3rByoJ_kEoUzXRol4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxi6YTmGjP8zUKNCwV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1DJrtQeCUlK4kyDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzPPuNYeWKtpT-rZ4Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFHb-8PQmDZ6Hn3qB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
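The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a payload could be parsed and sanity-checked before populating a Coding Result table — the allowed values per dimension are inferred from the sample rows shown here, not taken from the tool's actual codebook:

```python
import json

# Vocabulary inferred from the sample response above (an assumption,
# not the authoritative codebook used by the coder).
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"resignation", "fear", "outrage", "approval",
                "indifference", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: dimensions},
    rejecting rows whose values fall outside the expected vocabulary."""
    coded = {}
    for row in json.loads(raw):
        dims = {k: v for k, v in row.items() if k != "id"}
        for dim, value in dims.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{row['id']}: unexpected {dim}={value!r}")
        coded[row["id"]] = dims
    return coded

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability",'
       '"emotion":"outrage"}]')
print(parse_response(raw)["ytc_example"]["policy"])  # liability
```

Validating against a fixed vocabulary catches the most common LLM coding failure: a model drifting into free-text labels that the downstream dimension table cannot display.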