Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- I work in developing AI for younger children. We already see bugs and errors we … (ytc_UgwIBLEyR…)
- I was wondering what you were up to today when you just uploaded this video, so … (ytc_UgxjBom2H…)
- We may as well have AI politicians then, at least they wont get blackmailed via … (ytc_UgwmXSwoF…)
- SETI failure to find ET is thus explained : Techno-Lifeforms invent AI not long… (ytc_UgyHPdpNx…)
- I know the new talking function of chatgpt, but can this be edited? I mean, he d… (ytc_UgxzVEbYg…)
- Well if you want an actual automated car, you'd have it do rough terrain, heavil… (rdc_cpno3yg)
- FOMO (fear of missing out) is driving this craze. Technology is a brutal process… (ytc_UgxC1Hrkj…)
- Cartel who created such projects as "Our Adolf", "Garbage off" and "Put-in" will… (ytc_UgyvbXDri…)
Comment
AI and automation are GOOD, though. It's annoying when fellow leftists get luddite just because of worker rights or something. I am a leftwinger because I want to improve the lives of everyone. I don't think working to death is any kind of good goal for anyone.
The problem is not automation. It's greed, centralization, and power concentration to a select few.
Don't let them regulate away open source and hiding behind safety bs. It's a liberatory tech that needs to be decentralized.
A big problem everywhere is that no one has society's best in mind. They think about how their own little table of numbers should improve. No thought on how that impacts society.
youtube
AI Harm Incident
2024-07-30T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyV4QEKIjBEC-x6wiF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAa6rytLq4UQZZKD54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyY8FTV0wPom4K1ZAl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxftiUIJVc70ODF0vt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHvmCWZueeSr1wz3t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwWHnvh9APQX5U8ieB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw837boFl6rxtSRxB54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyRIz9WSCHdINkQ-_94AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz8IkrxQyk7D5UJxRB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKtrICs-WgMhcXGjp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
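A response like the one above needs to be validated before the codes are stored, since the model can emit values outside the codebook. The sketch below is a minimal, hypothetical validator; the allowed value sets are inferred only from the codes visible in this sample, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above; the actual codebook may include more categories (assumption).
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "mixed", "indifference"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with out-of-schema values."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this tool appear to be prefixed by platform
        # ("ytc_" for YouTube, "rdc_" for Reddit).
        if not rec.get("id", "").startswith(("ytc_", "rdc_")):
            raise ValueError(f"unexpected comment ID: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Example with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(len(validate_coding(raw)))  # 1
```

Failing closed like this (raising on any unknown value) makes it easy to catch model drift: a batch with an unrecognized code is re-queued rather than silently stored.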