Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples:

- "First mistake is talking about this publicly in the news, the AI machines are wa…" (ytc_Ugz9adVOO…)
- "All AI does is looks at datasets inputted into it and then learns that informati…" (ytc_UgzDlRbQd…)
- "I genuinely hope you realize the difference between using a stylus on a screen t…" (ytr_UgxZ1H6fn…)
- "Not to be an edgelord, but people fanatically believe into what was written in a…" (rdc_munu7o2)
- "This is really accurate insight into the big changes coming with AI. I believe …" (ytc_UgxIHMC7m…)
- "AI has already passed human intelligence seeing as you have come this far and ye…" (ytc_UgyqFhZji…)
- "Nick Fuentes is definitely not a Trump’s support lol. Also, the narrator sounds …" (ytc_Ugwx3pmKN…)
- "Very interesting views. Though I agree that some jobs will disappear in certain …" (ytc_UgwNZUDj5…)
Comment
The belief that ‘new jobs will always replace those lost to automation’ overlooks key economic principles. Historically, technological advancements have created more opportunities than they’ve eliminated. While it’s true that new jobs might not immediately offer better pay, they often evolve, leading to higher productivity and eventually higher wages. Instead of fearing automation, we should focus on fostering skills that complement technological advancements. This approach will ensure that we can all benefit from the efficiencies and innovations AI and automation bring. It leads to overall better resource allocation anyways, because I assume you wouldn't want people to be the ones constructing cars physically haha.
The belief that more regulation is going to help is utterly inconvenient for technological advancement and as it has been proven, more state regulation only causes more wage stagnation and other unintended consequences, it just doesn't work.
You wouldn't want to ban printers because they take the jobs of writers would ya?
youtube · AI Harm Incident · 2024-07-28T16:2… · ♥ 32
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugy5woM8iA1mrULVIR94AaABAg.A6RioFj-lG9A6RtR3_EZKs","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugy5woM8iA1mrULVIR94AaABAg.A6RioFj-lG9A6RvO80o1KM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx32Qcd6vKs7bIG4-14AaABAg.A6RiK79Pz_LA6Rm4LfqDZg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwzYqRyrtR4qdPxjhd4AaABAg.A6Ri5KXvN8IA6Rnvgh5PLs","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgxFy_65s-AiZYOJW6R4AaABAg.A6RhQxvJrBSA6RkTpORRrS","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxFy_65s-AiZYOJW6R4AaABAg.A6RhQxvJrBSA6S33ERmuVl","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwJIAJN6J_-qx453J54AaABAg.A6Rh8xPKEgfA6Rn3bgKHIj","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwuIg8n90y8q9fn4sx4AaABAg.A6Rgv6feTCCA6RikvNa-QX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwuIg8n90y8q9fn4sx4AaABAg.A6Rgv6feTCCA6RoEc_DyeL","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgxPlwp3f1o2PNKbOSB4AaABAg.A6Rgmm9tFXeA6RvsZN2sws","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
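The raw response above is a JSON array in which each record carries a comment ID plus the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such an output could be parsed, validated, and indexed for the "look up by comment ID" workflow — the field names come from the response above, while the sample IDs and the `index_codes` helper are purely illustrative:

```python
import json

# Hypothetical raw model output, shaped like the JSON array above.
# The IDs "ytr_abc" and "ytr_def" are made up for this sketch.
raw = """[
  {"id": "ytr_abc", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytr_def", "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "outrage"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Parse the model output and index each record by its comment ID."""
    records = json.loads(raw_json)
    coded = {}
    for rec in records:
        # Fail loudly if the model omitted any expected dimension.
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

codes = index_codes(raw)
print(codes["ytr_abc"]["emotion"])  # fear
```

Indexing by ID up front makes each lookup a constant-time dictionary access, which matches the inspect-one-comment interaction the page describes.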