Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "A.I. is very annoying and should be regulated somehow. It's one thing to use it …" (ytc_UgwCDpu2i…)
- "Loved that analysis. Make sense that AI is a tool to push down the value of the …" (ytc_Ugy6NhAZW…)
- "The weird thing about the AI discussion is that all the experts are in total agr…" (ytc_UgyLS1peW…)
- "The problem is not ai trash replacing human art. The problem is ai trash takin j…" (ytc_UgyWyck7x…)
- "This plus the Lorcana ai situation give me a bit of hope in the battle against a…" (ytc_UgzEORUUp…)
- "Lily Jay stop asking Chatgpt because you do not want to look for real informatio…" (ytc_Ugz9_19Fv…)
- "Driverless cars are the wave of the future, much like the horseless carriage or …" (ytc_UgjQ-5m5o…)
- "As a teenager who is the same age of the victim, it all depends on safeguarding …" (ytc_Ugzzrdq5V…)
Comment
Just did some Googling; turns out most of this isn't true at all. There's no such thing as hospital AI, it'd be highly illegal to implement such a thing. Also, after the Robert McDaniel incident, the AI was never used again, and no police department since has since made another attempt.
The first thing is only a LITTLE true. The AI was created by YouTuber Yannic Kilcher, not a university. It was a simple chatbot AI, and all its data was taken exclusively from the 4Chan group /pol/, which is notorious for being the most disgustingly hateful space on the internet. It was made for fun, never given to the public, and promptly deleted.
TLDR
This is all made up, and this guy is a scumbag trying to scare people. Always look things up when people make wild claims without sources.
Source: youtube | Topic: AI Bias | Posted: 2023-01-02T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyjwn2ueW_3WhCZdQF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRTpkf1BCITxrY3t14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwU_a4gtlt6vX8_DJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbdrOCN3zU2hvi7nR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzBuUQfyFO_JgPVvml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxs8JSW4bxzQn8VD0V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw_iD_GQx44LFBMw6Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7vWRyQdnBuV-c5H54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxHMIl4z8iuvSihgZV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzNoVjLRHe_ITKYc_x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
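A raw response like the one above can be turned back into per-comment codes with a small parser. The sketch below is a minimal example, assuming the response is a JSON array of objects with an `id` plus the four dimensions shown in the coding-result table; the allowed category sets are inferred only from the values visible in this sample, so the real codebook may include more.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (assumption: the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a
    mapping from comment ID to its coded dimensions, rejecting any
    value outside the allowed categories."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row.pop("id")
        for dim, value in row.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = row
    return coded

# Usage with the first entry of the sample response:
raw = ('[{"id":"ytc_Ugyjwn2ueW_3WhCZdQF4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = parse_llm_response(raw)
print(coded["ytc_Ugyjwn2ueW_3WhCZdQF4AaABAg"]["emotion"])  # → indifference
```

Validating against a fixed category set at parse time catches the common failure mode where the model invents a label outside the codebook, so malformed batches fail loudly instead of silently polluting the coded data.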