Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I asked Claude to perform an analysis on potential excess morbidity per annum du…" (ytc_Ugwz0sDaW…)
- "AGI is a myth. LLMs (with or without recursive self-improvement), have ABSOLUT…" (ytc_UgyRG9LJC…)
- "At first I thought they were referring to Turing completeness. I don't think Tur…" (ytc_Ugz7kGT6S…)
- "Ai will be smarter than any human in 5-10 years. Be wary. The end is nigh.…" (ytc_UgxSJ42in…)
- "I really do think that it’s unfair that people’s art is getting stolen. We put s…" (ytc_Ugyxxiyy7…)
- "At 77 years old I would have thought this bloke had some real insight into AI is…" (ytc_UgyeLai4f…)
- "This is super interesting, but my Gosh, 10 minutes and already 2 ADS. And if you…" (ytc_Ugw-u1wPG…)
- "Cry me a river. Like you don't know it will happen. And they think the whole wor…" (ytc_UgwuO0oYZ…)
Comment

> Why the hell are we green lighting experiments like working human neurons into AIs. Most ppl struggle to get jobs, we dont need to advance technology for the sake of advancement alone. Tech is here to better our life! Why develop tech when it stops mattering to the average person welfare? We dont need super intelligent computers, but we do need more compassionate leaders. We have the priorities all fked now and the financial incentives is driving this every bit in the hoarders of big tech owners. Speak up guys before no job is left for us.

youtube · AI Harm Incident · 2025-09-11T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_Ugyssvt_SsowG0Jyup94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3HOkZ5ptP_KNsnFV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy7kIenXrL9WgdFsEt4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwVeECo2XCjgv6Y-fZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9dYXjtUhOT5sfDfx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzOETw8f9WPniYu3FV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwkxxcAkmc_LuO3Pnd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZHF9kT0cn6pdtbZR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugw_OABAnkYuo_S-Cgt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzDFiKQ_2-OI4z3SGJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
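A raw response like this should be validated before the labels are stored or indexed for lookup. The sketch below parses the JSON and rejects any record whose value falls outside the expected vocabulary for a dimension. The `ALLOWED` sets are inferred from the sample output above, not from the actual codebook, so treat them as assumptions:

```python
import json

# Category vocabularies inferred from the sample response above; the real
# codebook may define additional or different values (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record with an unknown label."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
    return records

# Index validated records by comment ID so a full ID can be looked up directly.
def index_by_id(raw: str) -> dict[str, dict]:
    return {rec["id"]: rec for rec in validate_coding(raw)}
```

Validating per-dimension rather than per-record makes the error message point at the exact field the model got wrong, which helps when re-prompting.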