Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples

- "The unincluded issue is AI is terrible for the environment, indefinite Ai expans…" (ytc_UgzOrtJQT…)
- "Imagine answering your door and a robot standing there doing that with your pack…" (ytc_UgyF61xSh…)
- "Already saw someone on Reddit posting on how these tools are NOT doing what they…" (ytc_UgzFB3ebh…)
- "Y'all don't understand the true danger: If AI can be the exception to copyright,…" (ytc_UgyE7J2Fx…)
- "Whats disgusting was AI "prompter" (those who called themselves as AI artists ar…" (ytc_Ugyo76K6b…)
- "Relax. There is zero evidence that Artificial General Intelligence (which is a s…" (ytc_UgxvztazT…)
- "The headline is completely sensationalized. Here's what they actually said…" (rdc_n3nwgve)
- "Publicly available information shouldn’t get someone sued. Even the verbatim sta…" (ytc_Ugx_s3REV…)
Comment
You have to be very careful with learning algorithms, it's impossible to know what it is that they're looking at.
An example of this is the x-ray machine calibration. If hospital A and B are used as training data, but hospital A sees sicker patients while hospital B sees healthier patients, then the algorithm will learn that x-rays that look like they're from hospital A are more likely to be sick.
youtube · AI Jobs · 2020-03-09T04:5…
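The pitfall this comment describes is a spurious correlation: when the label (sick vs. healthy) is confounded with the data source (hospital), a model can score well by learning to recognize the source rather than the pathology. A minimal synthetic sketch of that failure mode, where the hospital identity stands in for scanner artifacts the model could pick up from the image (all names and rates here are illustrative, not from any real dataset):

```python
import random

random.seed(0)

# Synthetic training set: hospital A sees sicker patients than hospital B.
# "hospital" stands in for calibration artifacts detectable in the x-ray.
def make_patients(hospital, n, sick_rate):
    return [{"hospital": hospital, "sick": random.random() < sick_rate}
            for _ in range(n)]

train = make_patients("A", 1000, sick_rate=0.70) + \
        make_patients("B", 1000, sick_rate=0.20)

# A naive learner that conditions on the only feature available -- the
# hospital -- by estimating P(sick | hospital) from counts.
def fit(data):
    stats = {}
    for row in data:
        pos, tot = stats.get(row["hospital"], (0, 0))
        stats[row["hospital"]] = (pos + row["sick"], tot + 1)
    return {h: pos / tot for h, (pos, tot) in stats.items()}

model = fit(train)

# The model now flags any hospital-A scan as likely sick, regardless of the
# patient's actual condition: it has learned the confound, not the disease.
print(model["A"] > 0.5, model["B"] > 0.5)  # → True False
```

Nothing in the training signal distinguishes "looks sick" from "looks like hospital A," which is exactly why the comment warns that you cannot tell what a learned model is actually looking at without probing it.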
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgysDAyUWrIYwKzzmyl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxqCs6PUbAmtQAX1WR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwHQKtik2vdVq5pVAl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw8Cl47BPtal5YVjgh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwhOkBZQ3kvtfGVN-Z4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLEdDgOsTO9RyUphJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxiFcJRPmzyWVvxmY14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzaF7IWGfeu2LR2FSB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgycSM9hDY5-pDanYJ14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwbJWRgZEl4Dn-n8UF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
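Since the model returns one JSON array per batch, a validation pass before ingesting codes can catch malformed rows or labels outside the codebook. A minimal sketch; the allowed values below are only those visible in this response and the result table above, so the real codebook may define more (the `SCHEMA` dict and `validate` helper are illustrative, not part of the tool):

```python
import json

# Allowed labels inferred from the visible response; the actual codebook
# may include additional categories (assumption).
SCHEMA = {
    "responsibility": {"developer", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
rows = validate(raw)
print(len(rows))  # → 1
```

Rows that fail validation can then be queued for re-coding rather than silently stored with an out-of-vocabulary label.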