Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI is a tool. Kind of like a hammer. Is it evil?. No. Can it hurt someone. Yes, …" (`ytc_Ugz6YOxv5…`)
- "How about Artificial intelligence replacing the wealthy? Hhhmmm? Why cant we r…" (`ytc_UgyKzwiDh…`)
- "AI is saturating the net now and is evidently becoming a real problem, i think y…" (`ytc_UgyBrsQS3…`)
- "It is already happening. They are shipping jobs over seas to pay for AI. I am c…" (`ytc_UgwZ0ubSG…`)
- "Yeah but the overarching subject matter was regarding AI/Simulation and the prob…" (`ytr_Ugzv_Ztbz…`)
- "Want to insure you and your family from AI taking over ? Cut your monthly expens…" (`ytc_UgxlLiWGB…`)
- "But since not everyone is a responsible driver it seems logical to have self dri…" (`ytr_Ugw4J4inM…`)
- "@Elon__Bust Correction: *Skynet* is to soldiers as AI is to artists. I didn't s…" (`ytr_UgyxaG4AM…`)
Comment

> I think one big thing to consider as well is the societal impact of mass unemployment. Okay great, AI has replaced all workers, green line goes up, stock prices go burrrr….
> Now you have hundreds of millions of people with no income stream, and no incentives to be productive. Left unchecked you are going to have basically a civil war on your hands. If people feel like the no longer matter, truly, then don’t be surprised when you see mass looting, riots, and utter chaos.
> I don’t understand why all these CEOs think normal people will just take this lying down.

youtube · AI Jobs · 2025-12-23T15:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
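A coding result like the one above can be checked programmatically. The sketch below validates a coded comment against the per-dimension label vocabulary; the allowed label sets here are only inferred from the values that appear in this page's raw responses, so the real coding schema may include labels not listed.

```python
# Label sets per coding dimension. These are an ASSUMPTION inferred from
# the raw LLM responses shown on this page, not the authoritative schema.
ALLOWED = {
    "responsibility": {"distributed", "company", "investor", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "confusion", "approval"},
}

def validate(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} label: {value!r}")
    return problems

# The coding result shown in the table above passes:
print(validate({"responsibility": "distributed", "reasoning": "consequentialist",
                "policy": "regulate", "emotion": "fear"}))  # → []
```

A check like this is useful because LLM coders occasionally emit labels outside the requested vocabulary, and those rows should be flagged rather than silently counted.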
Raw LLM Response
```json
[
{"id":"ytc_UgwinwHsSDUzXRE00ZB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxnp3IvHpf07FPPEmZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyBMPZO3hjuoPNWQLZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxUrmCRr84scBZiJLh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyHxaBh-ZHsoADzW4p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxo16QbSgRAyD7OaWh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUXXPVbGROKHMhXm14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugydfmzj8xUmFbcAGdR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"confusion"},
{"id":"ytc_Ugxzj96hyXKlxDkKPqd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzP_HY5jPOfvcBnpm14AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
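The raw response is a JSON array of per-comment codings, so the "look up by comment ID" feature amounts to building an ID index over it. A minimal sketch, using two entries copied verbatim from the raw response above:

```python
import json

# Raw LLM response: a JSON array of codings, one object per comment,
# keyed by the comment's ID (two real entries from the response above).
raw = """[
  {"id":"ytc_UgyBMPZO3hjuoPNWQLZ4AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzUXXPVbGROKHMhXm14AaABAg","responsibility":"ai_itself",
   "reasoning":"deontological","policy":"ban","emotion":"outrage"}
]"""

# Index the parsed rows by comment ID for O(1) lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_UgyBMPZO3hjuoPNWQLZ4AaABAg"]
print(coding["policy"], coding["emotion"])  # → regulate fear
```

This is a sketch of the lookup idea only; how the actual dashboard stores and retrieves codings is not shown on this page.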