Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Sorry, but AI won’t clean a sewer line or take trees down or even plant trees. T…" (ytc_UgxhhLhya…)
- "AI is already demonstrating Darwinism. It is foreboding to know it already mutat…" (ytc_UgwPU6CmJ…)
- "I think it's pretty funny how now we complain about AI art when it takes away fr…" (ytc_Ugz1cqb5z…)
- "No way, I think AI is overhyped. The internet changed the world but not in a pro…" (ytc_UgxguKmdf…)
- "Private equity buying up housing, hospitals, newspapers, veterinary clinics, nur…" (ytc_Ugzoeyvd1…)
- "Just goes to show you, humans are better than AI at everything, even being AI.…" (ytc_UgxTu2lVF…)
- "As a PSA, the best cover for this, regarding a mask, is anything that will cover…" (rdc_g177qja)
- "Prompt: Speak to me from the perspective of an artificial intelligence that is …" (ytc_UgwR0lU42…)
Comment
The problem with academia is often not being in touch with the reality of people's day to day life. And I never thought it would be Neil's case. But his take on jobs and AI was hard to watch.
Really? One should be creative and find a new ways to make money because some tech billionaires weren't properly regulated by their governments? That's a very narrow, neoliberal and meritocratic thinking, which could work in Narnia, maybe. Not in the real world. Let's talk big tech regulation and universal income instead of telling people to be creative and find new ways to make money just because some dudes that could never spend all of their fortune, even in 50 generations of jobless heirs, think it's a good idea.
youtube
AI Moral Status
2025-08-05T12:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyrhqqjhCJrzclwV-h4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwcdFgY2WglAAehkvp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw37CTH40v442OwJMp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFHtangKcgYMVZb7F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgYTpfr2kdCGaLzBZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy3v47MSxw5KMiapqN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1YoZRjaNunr9_5Zt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzwD9RY9I7S6T58sIh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxvDl_kwB5ashMYHxF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx2bTriMLH3IpyR_sl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
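Each raw response above is a JSON array with one record per comment, keyed by `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and checked before it populates the coding table; note that the allowed values below are only those observed in the responses shown here, not necessarily the full codebook:

```python
import json

# Values observed in the raw responses above; the actual codebook
# may allow more (assumption).
ALLOWED = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed",
                "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    keeping only records with an id and in-codebook values."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id"):
            continue  # a record without a comment ID cannot be joined back
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(parse_coding_response(raw))
```

Filtering rather than raising keeps one malformed record from discarding the whole batch, which matters when a single response codes ten comments at once.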