Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Connection strings.. so it’s averaging up data with weights and approximating an…
ytc_UgzyFweHr…
It was a great presentation. However, how can we learn more about the Ethics in…
ytc_Ugy6HM8Rp…
LUDDITES is the name of one of the very first organizations that fought automati…
ytc_UgwO_MoKR…
if i could send my AI to call customer service lines that would be great…
ytc_UgzXS-t9A…
Autonomous cars just aren't the future of mass transit. Trains, busses, walkable…
ytc_Ugz56FDni…
I couldn’t think of a less satisfying activity than ai art. There is no effort, …
ytc_UgyHJMLo_…
You never coded a day in your life and see the latency handshake between sensors…
ytr_Ugy_6aFmU…
Bro the second I tell any one else that secret that AI will be alive and at my d…
ytc_Ugx3IHUNa…
Comment
Driver less no. "Self driving trucks", many of these rigs will still need a person behind the wheel to take over for different tasks. Who will unload them? who will wright a com check, who will wash them who will change the tires? Who will tow them? These guys are idiots. Its the same attitude you will have to deal with from management weather the trucks are self driving or not. What about security , is it going to know the difference between who is supposed to unload the trailer and who isnt? Your freight is being unloaded at the side of the road, alert alert lol. What you will need is a robot behind the wheel.https://youtube.com/shorts/gOYAfEOeg1Y?si=V16mB7nqZ1O6z-0P
youtube
AI Jobs
2025-06-18T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
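Each of the four dimensions above takes one label from a small closed set. As a hedged sketch, here is what a per-record validator might look like; the allowed label sets are an assumption inferred only from the responses visible on this page, and the real codebook may include values not shown here:

```python
# Validator for one coded record. The label sets below are an assumption
# inferred from the raw responses shown on this page, not an official codebook.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems
```

Running this over a parsed batch response flags any record where the model drifted outside the expected labels before the coding is stored.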
Raw LLM Response
```json
[
  {"id":"ytc_UgwG9LdvcG3LOtGk3E94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwxx48QJglexOKKyKd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwe8R64k7D5kmxeCJF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwUqEoGDPHGCI0_sqF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwN6BkAqZbM8GSCoct4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxpgiS9WPeLC585Xad4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyrXPLLGjub0s8jnnd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz0TpfRuTl2hD0dJtZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxYa6-_6s9ZGpoFCF54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyrHvcRZ6dOxzq19Wh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```