Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "software problem, solution is to make them incapable of caring/asking about that…" (ytr_UggnjeyPz…)
- "Quite funny to realize just how much CO2 is used by just people thanking AI bots…" (ytc_UgwsE6Elf…)
- "Yes dear, AI reduces stress but can result in misconduct because a tutor can cat…" (ytr_Ugw1pD2NH…)
- "@crazando hahaha how funny. I guess you never read any copyright laws, I created…" (ytr_UgxKyAqdM…)
- "This is a losing battle, AI is the future of humanity, we can do something today…" (ytc_Ugy1ox4-F…)
- "Please bring Geogrrey back. Wow.... what an interview. Amazing seeing all these…" (ytc_UgxyCnrmr…)
- "IMAGINE THAT THEY DECIDE THAT WE ARE NO LONGER USEFUL AND START TO ELIMINATE …" (ytc_Ugyr7v9Lj…)
- "Can governments ever make laws that will safeguard us from AI, they can’t keep u…" (ytc_Ugx2ECU4G…)
Comment

> In real life, spotting that woman would've been easier as the camera is limited to its dynamic range. And if it's a Go Pro or any thing else like that, then it's much brighter in reality. That said, what a stupid way to cross a street in the dark spot (we assume, could be the camera) and not even looking. But, a self-driving car should obviously not be limited to visual light but rely on multiple systems such as radar, cameras and perhaps lasers for obstetrical detection. People riding in these cars must be pretty dumb to think we have a working system. It's more like 5-10 year away until regulations and technologies are up to speed.

youtube · AI Harm Incident · 2018-03-22T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzxHy7uvJZhpcmxlll4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyRLMUoUAjClat7CG94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_UgyL8WBj8DHdQk_Igr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx_fxnOM5I7wYFz85l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxlunewTn_ITmIHzPN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzfw1Jw6CnMajwDjZB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgweCGqcYlfeYk4zsgZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgypaD3yhSCq87jelDp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwbg15AM1VjNdzADvx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy59dP3dxKmQBJahuF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
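The "look up by comment ID" view above can be sketched as a small parser over the raw LLM response. This is a minimal illustration, not the tool's actual implementation: the allowed label sets below are inferred from the sample output and the dimension table, and the real coding schema may include additional values.

```python
import json

# Label sets per dimension, inferred from the sample output (assumption:
# the real schema may define more values than appear in this batch).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "indifference", "mixed", "approval", "resignation"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw model response (a JSON array of coding records) and
    index the codings by comment ID, skipping any record whose values
    fall outside the expected label sets."""
    by_id = {}
    for record in json.loads(raw_response):
        values = {dim: record.get(dim) for dim in ALLOWED}
        if all(values[dim] in ALLOWED[dim] for dim in ALLOWED):
            by_id[record["id"]] = values
    return by_id

# Usage with one record from the response above:
raw = ('[{"id":"ytc_Ugx_fxnOM5I7wYFz85l4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"approval"}]')
codings = index_codings(raw)
print(codings["ytc_Ugx_fxnOM5I7wYFz85l4AaABAg"]["policy"])  # regulate
```

Validating against fixed label sets at parse time is what makes a coding run auditable: a malformed or hallucinated label is dropped (or flagged) rather than silently counted in downstream tallies.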