Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "one thing that bothers me about all these conversations is that humans are alway…" (ytc_UgwbhKyWy…)
- "Rather than seeking to create AI leading to an "us and them" type scenario we sh…" (ytc_Ugxm_8_T4…)
- "Waymo’s argument: autonomous cars are safer than human drivers. Reporter’s argum…" (ytc_UgzO3WiQX…)
- "No one in my family of 5 is working past this week so I geuss pretty fucking luc…" (rdc_fn5y6ca)
- "The way the other artworks seem more lively and creative than the AI art 💀…" (ytc_UgxkpDZmY…)
- "Man, fuck robots, Robots literally have no empathy, it's all algorithm with a fi…" (ytc_Ugj4vS6AR…)
- "As someone who used different models (SD 1.5, SDXL), it feels like generating an…" (ytc_UgyjkRZLc…)
- "True, plus the Take It Down Act was signed into law so this means that victims w…" (ytr_UgyrcI4Sh…)
Comment
I am not concerned about AI in the sense of it taking over as our overlords. I am worried about AI in the fact that its going to put tons of people out of work very quickly and more than half the people in the USA would implode if we even suggested universal basic income. The problem with AI is that its in the hands of greedy, insatiable corporations that have no regard for human life. On top of the fact that we are still very much reliant on fossil fuels and data farms take MASSIVE amounts of power. It is an ecological disaster and a tool that the elite will continue to use to exploit or replace the poor while they get even richer. Until we can figure out more efficient and clean energy solutions and also guarantee that our people are actually taken care of, AI is not the solution to any problem and in fact will just cause more problems down the line.
youtube · AI Moral Status · 2025-07-28T07:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy87oekTgX2Qh8jSdx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyxqrMznj9Dt2jPnVd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRWAsBg1n5NB3nft54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzwJhd0qqJF-l6G-XR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxLBgEMjhI9sb8OXTR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyfadnV-7kUPqs9k0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxQ_GyXT6Top7t6qyZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwpaE37W_c9PXl_TiN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugy0Pfk3RRKA3gHdrqF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwV8ThJGibI0xXQd4h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
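A raw response like the one above can be validated before its rows are stored as coded comments. The sketch below is a minimal example, assuming the allowed values per dimension are only those that appear in this response (the full coding scheme may define more categories, and `parse_codings` is a hypothetical helper, not part of the actual pipeline):

```python
import json

# Allowed values per dimension, inferred from the response shown above
# (assumption: the real codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"company", "user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "approval", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # A usable row needs a comment ID and a known value for every dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
print(parse_codings(raw))  # the single well-formed row passes validation
```

Rows with out-of-vocabulary values are dropped rather than guessed at, so a malformed LLM output never silently corrupts the coded dataset.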