Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_ohs92k1`: "I imagine the more systems are automated, the more decisive hacking is to battle…"
- `ytc_UgyyaQHdo…`: "I dont want to be disrespectful, this is a tragic loss, but this is most likely …"
- `ytc_UgzYto5M1…`: "I'm absolutely despising AI at this point. Like the only valid use of AI is in t…"
- `ytc_UgzvSXAMf…`: "AI needs anthropomorphic robots to take over. James Cameron nailed it with the "…"
- `ytc_Ugx4FUrrI…`: "Our greatest problem is not AI, it is our own personal ssin gainst a Holy God, w…"
- `ytc_UgwNCG_D5…`: "Trump "trust me bros ai wont makevthe electricity go up a lot just like tariff h…"
- `rdc_mcqa6n4`: "Putin sends troops into Ukraine on electric scooters, motorcycles and regular ca…"
- `ytc_UgxavxUlR…`: "Meh it is what it is, why should humans change their ways? We are Destined for w…"
Comment
I recommend reading about the work Joy Buolamwini, a researcher, computer scientist and founder of the Algorithmic Justice League has done on racial recognition software. Her research results got Amazon to issue a moratorium on Rekognition, got IBM to pledge to stop developing facial recognition software, and Microsoft to withhold selling their systems to police departments until regulation was in place.
youtube
AI Harm Incident
2020-09-03T14:5…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzLfkwLeu8UzDmdbid4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwTsEBjA9AHZFv8aj14AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwtvFbw_-Vko2ZHxkl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgySCRV5oIsEsX_12KB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzZQyrC1ghCxe9z30t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_0sh6vnFhoXUdlRN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx7deQLaoF5J_ExAuR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwL04CQ6QumqtfItKR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwQecE-FoTS-7H_C6t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyu3h0kYGZKiLifqEp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
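The raw response above is a JSON array with one coding object per comment ID. A minimal Python sketch of the lookup-by-ID step, assuming that array shape (the helper `index_by_id` is hypothetical, not part of the tool; the sample records are copied from the response above):

```python
import json

# Two records copied from the raw LLM response shown above,
# illustrating the assumed shape: a JSON array of coding objects.
raw_response = '''[
  {"id": "ytc_UgwQecE-FoTS-7H_C6t4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugz_0sh6vnFhoXUdlRN4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

def index_by_id(response_text):
    """Parse a raw LLM response and index the coding records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgwQecE-FoTS-7H_C6t4AaABAg"]["policy"])  # → regulate
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matches how the page's "Look up by comment ID" feature would behave.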