Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- With how advanced AI art’s getting, having an AI image detector just makes sense… (ytc_UgxXrdZUO…)
- This is what ignorance looks like. THE LLMS have started showing emergent proper… (ytc_Ugystg-io…)
- Ai art is not real art. It hurts us artist who. Are trying to stay in it… (ytc_UgzWqEH-Q…)
- It is quite possible that today's AI is very brittle, and when it gets to inputs… (ytc_Ugx2vf9Wb…)
- You can't guarantee safety, just like you can't prove anything (given you can't … (ytc_Ugxgog57T…)
- The sad part is Ai could of already taken over and we wouldn’t even know… (ytc_Ugz90pgTw…)
- "Robots will make our lives easier and do the jobs that humans don't want to do.… (ytc_UgxIB3zLt…)
- So, AI isn't quite there yet, but the reality is that AI very much can do the jo… (ytc_UgwIO3EMZ…)
Comment
It IS worth noting that, once automation of work makes it “across the line” and becomes the norm, it wouldn’t be necessary to pay folks anywhere near 40K a year of UBI as full automation of work would have the capacity to drastically reduce operating costs, theoretically lowering prices significantly on most goods.
Of course, I have zero faith in the “guiding hand of the market” or whatever they call it these days- the profit motive rules all and leads to price fixing more often than any real competition- I’m just framing a purely monetary perspective on the point that UBI wouldn’t need to be as big of a scary multi-trillion-dollar yearly expense for the government.
youtube · AI Harm Incident · 2024-07-28T23:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwdI6AfT9Z93xNOpUx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyFV6Tc6mNj5Lxgwz14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzlrF6maznShhbM9M14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxBQLsamdcfSzmeNEh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQyNAjvj9A5wOFLtN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGM3aHmVHwi6yFm1Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw4OJsixUOFk5rK7iZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwH_xpYT5znGhxlllF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwDZF-QRpQBfpMlLdt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzMYEpl7ydY8IS7JMF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
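Since the raw model output is a JSON array of per-comment codes, it has to be parsed and checked before the values can be stored. Below is a minimal sketch of that validation step, assuming the allowed category sets inferred from the sample above (the full codebook may define additional values, and `validate_batch` is a hypothetical helper name, not part of any pipeline shown here):

```python
import json

# Category values observed in the sample response above; assumed to be
# (a subset of) the codebook -- adjust if the real codebook differs.
ALLOWED = {
    "responsibility": {"none", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"resignation", "fear", "outrage", "indifference",
                "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records.

    A record is kept if it carries a YouTube comment id (the "ytc_"
    prefix used throughout this page) and one allowed value for each
    of the four coding dimensions.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict):
            continue
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# One record from the sample output, as a usage example.
sample = ('[{"id":"ytc_UgwdI6AfT9Z93xNOpUx4AaABAg",'
          '"responsibility":"none","reasoning":"consequentialist",'
          '"policy":"none","emotion":"resignation"}]')
print(len(validate_batch(sample)))  # 1 valid record
```

Dropping malformed records rather than raising keeps one bad item in a batch of ten from discarding the other nine codes.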