Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I'll be the one to get emotionally attached to a pet robot and cry if it stops w…
ytc_UgyQLeUXl…
The danger from AI, will not originate from AI itself, but from the few humans w…
ytc_UgwTU-tt8…
Meta ai : 9+10 is 21
Siri: GodDamn meta is so stupid
Gemini: agree
Calculator: j…
ytc_UgxI99gus…
Most of those jobs AI is taking over are tedious and unrewarding. Think of Sever…
ytc_UgzZNdLIN…
Grim sceolon ða geonge leornian,
On bocum and on reccan stafum,
Ac ðæra innoða f…
ytc_UgwvI-b0w…
Bring traditional gender roles and nucleus family structure back. Due to automa…
ytc_UgxJa-B6z…
AI will be a another big mistake as the globalisation! it will impact 70% of hum…
ytc_Ugx5IFSTP…
I practice law, it’s not the same assistant. It could write entire contracts and…
rdc_jhdu6al
Comment
Driverless trucks are a huge mistake; in fact, anything automated is a horrid idea. There may be several defensive systems that can prevent a hacker from taking over control of the vehicle; however, if the hacker is determined enough, they can turn an 80,000-pound vehicle into a deadly weapon. Do you want a fully loaded diesel blasting through a School? What do you think about loading and unloading? How is the government going to regulate the hours of service of an automated truck? What do you think about the scheduled service? How many mechanics know anything about the new systems on the truck? What if there is an Electromagnetice pulse that stops the truck? You would have an 40, ton paperweight. No! We are not ready for driverless or electric vehicles.
youtube
AI Jobs
2025-10-05T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzEurzaC7BBQYJdWJ54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy1A0y-oOuLHtvwI6J4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy4Ir8LuPySas31bk54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxQivGUU5Tni_5Y0mN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxNNIAgAezJDLBS_9d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzI4wFXxKO1yhpOkp94AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxeAwGlyU9IAGn99Ed4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxyMm-DBf853Io-_vN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgxZUeloHRUIuy0GG6N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzUCMy_ou59NigT42J4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]
```
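The raw LLM response is a JSON array with one coding object per comment, keyed by comment ID, which is what makes lookup by ID possible. A minimal sketch of parsing and indexing such a response — the allowed value sets below are inferred from the samples on this page and may be incomplete, and `index_codings` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from the samples shown on this
# page (assumption: the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    silently dropping entries with missing IDs or out-of-vocabulary values."""
    index = {}
    for row in json.loads(raw):
        comment_id = row.get("id")
        if not comment_id:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            index[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return index

# One entry repeated from the raw response above, for illustration.
raw = ('[{"id":"ytc_UgxQivGUU5Tni_5Y0mN4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_UgxQivGUU5Tni_5Y0mN4AaABAg"]["policy"])  # ban
```

Looking up `ytc_UgxQivGUU5Tni_5Y0mN4AaABAg` in the full response reproduces the Coding Result table above (developer / consequentialist / ban / fear); entries that fail validation are dropped rather than coerced, so a malformed model output cannot corrupt the index.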