Raw LLM Responses
Inspect the exact model output behind any coded comment. Look up a comment by its ID, or pick one of the random samples below.
- "I agree. Further, my thought is that if a conscious self is a combination of mem…" (`rdc_mdjnfdi`)
- "Our current method of schooling is absolutely soul sucking. I’m all for trying s…" (`ytc_Ugxotz9dY…`)
- "Who is goi g to buy a cabinet if they don't even have money for food?…" (`ytr_Ugzw9ExRZ…`)
- "Disney gonna roll out its own generative AI modle for its own ips, for Disney + …" (`ytc_UgwSkiF_P…`)
- "As an artist, I don't really have a problem with an AI... when it's utilized pro…" (`ytc_UgzBJGMg_…`)
- "Don't bash DeepSeek, pls. They made fools out of you Americans 😂. Spare me your …" (`ytc_UgxqwhOhR…`)
- "Ironic that he won a Nobel prize for one of the worst inventions in human histor…" (`rdc_lr7py00`)
- "Where is the system to support the millions of people working jobs that will be …" (`ytc_Ugxb3k3cH…`)
Comment
Humans created machines to help us get things done faster so we could do our jobs faster and fulfill consumer wants. At this point, the tech world and shareholders are really not thinking things through. Sure you make money now. Eventually AI will replace a majority of jobs. With no safety, this will happen soon. When this happens, inflation will only get worse and the job market will be even worse than it is now. Why give a wage people can actually live on when you can pay AI? It’s already happening. When this happens a majority of people will be suffering to meet basic needs and the consumer market will cease to exist. What will happen when there is no more consumers?
youtube · AI Governance · 2026-04-16T21:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwXu_juwU6nqVgApKB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxRqGx28NGPaFeTyAJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzlidWA8eRtBJ0DqfJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1mxS9NJO27dT8cOR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKqDgBycZhFwOJK7F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwj00gSEUDmjL7MaM94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzN-UcIWyG9WiXRwYV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyEV9xxClYiVYRWu3d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"skepticism"},
{"id":"ytc_UgzuV1JZQ2xRbzl2OH54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxnucDc7OWdhdz3Pxx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
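A raw batch response like the one above can be turned back into per-comment codings for lookup by ID. The sketch below is a hypothetical helper, not the pipeline's actual code; the schema (keys `id`, `responsibility`, `reasoning`, `policy`, `emotion`) is taken from the response shown, and `parse_codings` is an illustrative name.

```python
import json

# Abbreviated raw model output, in the format shown above (one entry for brevity).
raw = """[
  {"id": "ytc_UgwXu_juwU6nqVgApKB4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

# Keys every coded entry must carry, per the response format above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM batch response into {comment_id: dimensions},
    silently skipping entries that do not match the expected schema."""
    out = {}
    for entry in json.loads(text):
        if not isinstance(entry, dict) or set(entry) != EXPECTED_KEYS:
            continue  # malformed entry: wrong keys or not an object
        out[entry["id"]] = {k: v for k, v in entry.items() if k != "id"}
    return out

codings = parse_codings(raw)
print(codings["ytc_UgwXu_juwU6nqVgApKB4AaABAg"]["policy"])  # → regulate
```

Keying the result by comment ID is what makes the "look up by comment ID" view cheap: one parse per batch, then dictionary lookups.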