Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The dawn of AI bullshit is just another way to fill article space for the uneduc…
rdc_iohsd5g
You are conflating consciousness with intelligence. And also conflating intellig…
ytr_UgxVT3wlt…
Tips: don't look at their faces, AI is getting better at it, but look at small d…
ytc_UgwLn3a0Y…
Altman is also a right wing fascist billionaire pushing AI and aligned with Trum…
rdc_m8jcn68
I don't like people using avatar for their online presence in serious aspects of…
ytc_UgxGcm66e…
Ex machina, the matrix, I robot and black mirror all told us this is a possible …
ytc_UgzeE4s8k…
Companies of all types are obsessed with replacing whatever workers they can whe…
rdc_m6xmn4a
Meet Ava Credit Building (Reports to ALL 3 Bureaus):
https://thecreditexpert.ti…
ytc_Ugyf-CbDp…
Comment
This happened in 1979. Robots did not exist in 1979. The internet wasn’t even public until 93. A piece of mechanical equipment such as a press, a steam roller, or furnace, all which could end the life of a human easily are not autonomous. So this man was killed due to a human error when controlling the equipment meaning that to this day there has not been a human killed by a Robot. Also, not a single country out there has hardly even figured out how to make robots that can self stabilize, the technology is so early in its days that it’s hard to consider anything we have out a true Robot. It doesn’t think, it doesn’t formulate, it doesn’t make conscious decisions, it does only as it’s programmed to do. We don’t even hardly understand the human brain so the idea that we have come far enough to have autonomous robots that are making the decision to kill humans on their own is laughable at best.
youtube
AI Responsibility
2025-12-21T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgwFcIhmPBmzyBG3Tud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNPvsIHMO0cUzw9SB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugyn3b6hw7hOcXdae8R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFsRRwIid_EZ3zA2F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyYl1dPUYCHOoXoy3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyxrXRiIfnyfJoYJgZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz2lHsIl7RWse1adnl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyFyN3SwIo-PupZvZJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfJ0tPTF0K4Ny3SsF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxOqEnBCCIFcPNAnYB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]
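The raw response above is a JSON array of per-comment codings. A minimal sketch of how it could be parsed and indexed by comment ID, assuming only the field names visible in the response (this is illustrative, not the project's actual parsing code):

```python
import json

# Two entries copied from the raw LLM response above; the full array
# parses the same way.
raw = """[
  {"id": "ytc_UgwFcIhmPBmzyBG3Tud4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz2lHsIl7RWse1adnl4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]"""

# Index the coded dimensions by comment ID for lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

print(codes["ytc_Ugz2lHsIl7RWse1adnl4AaABAg"]["policy"])  # liability
```

Note that the response must be valid JSON (closing `]`, not `)`) for `json.loads` to accept it; a truncated or malformed array raises `json.JSONDecodeError`, which is one way a batch can end up coded as "unclear" across all dimensions.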