Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If we wanna bring the portrait and photography industries into this, how about t…" (ytc_UgwBQyiMo…)
- "1. Sure, if social media dies, it's good riddance—except it won't die, it'll mor…" (rdc_le6itnc)
- "the one real issue i have with this episode is that it's focusing on ai abilitie…" (ytc_Ugz44jWKc…)
- "Turned video off at 1:28. Existing structure and conventions is exactly what you…" (ytc_UgxlnP-GV…)
- "Use to have 5 mph bumpers, mandatory seatbelts, daytime running lights, and back…" (ytc_Ugxf0ITWd…)
- "honestly i cant overstate how much i love this video, particularly what you said…" (ytc_Ugx60DJVQ…)
- "Using Olovka for essays is a lifesaver. Keeps my work original and well-cited, s…" (ytc_UgwWM8mM8…)
- "Because if it can be proven that LLMs are conscious, then they are persons, and …" (ytr_UgzTZCLIF…)
Comment
Proof scribblenet + openpose > loras 😮 (everyone knew this already)
Also of course we're going to use an already trained model rather than retraining, you realize that for a trained model the water has already been spent? Making your own model has a lot of problems, the yoink and the powah and da watah, nobody is training ais willy nilly because of the immense price.
And don't even think about eating an almond (one almond) because 1 almond = 337 ai images at 1088x2160 😮
Platform: youtube · Video: Viral AI Reaction · Posted: 2025-08-27T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxXkfQhWJDWwxX5Mv54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_Ugzy35XA3kQqHC98X-R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_UgzscNZobwY2gyYEpkl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgyAHwfFrJPjEjK_M-p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},{"id":"ytc_UgxRnzXUIiMgO8QJnaN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},{"id":"ytc_UgzyGNGr7qgrTjnnhjB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgzvTj2SJYPkdNTeSpd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgxilFAmLY4fR8PH1c54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},{"id":"ytc_UgzpH4kZqrtLYLAPBPd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgzjkapHuET4QhshBI54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"]}
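Note that the raw response above ends with `"approval"]}` rather than `"approval"}]`, so a strict JSON parser rejects the entire batch; that is consistent with every dimension in the coding table reading "unclear". A minimal sketch of how such a fallback can work, assuming a hypothetical helper name and the four dimensions shown in the table (the pipeline's actual parsing code is not shown in this page):

```python
import json

# The four coded dimensions from the "Coding Result" table above.
UNCLEAR = {"responsibility": "unclear", "reasoning": "unclear",
           "policy": "unclear", "emotion": "unclear"}

def parse_coding_response(raw: str, comment_ids: list[str]) -> dict[str, dict]:
    """Parse a batch-coding LLM response into {comment_id: codes}.

    If the JSON is malformed (e.g. transposed closing brackets in the
    last object, as in the raw response above), the whole batch is
    unusable, so every requested comment falls back to "unclear".
    """
    try:
        rows = json.loads(raw)
        return {row["id"]: {k: row[k] for k in UNCLEAR} for row in rows}
    except (json.JSONDecodeError, KeyError, TypeError):
        return {cid: dict(UNCLEAR) for cid in comment_ids}
```

With this behavior, a single stray bracket at the end of an otherwise valid ten-item batch wipes out all ten codings; a more forgiving variant could attempt per-object recovery before giving up.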