Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "As an artist since I was 8, currently in high school, I genuinely hate this shit…" (ytc_UgxrezgTy…)
- "Thankfully he died before spreading the stupid gene on to the next generation. …" (ytc_Ugy2CSrVz…)
- "22:20 I used chatGPT a few years ago when it was relatively new and I saw though…" (ytc_UgwNO4il1…)
- "If AI were left to it's own it would immediately prosecute every criminal in t…" (ytc_UgyCMYWyg…)
- "Did we not take a hint from the movie I robot I feel in the future we may regret…" (ytc_UgwN5PTtd…)
- "A human's main purpose is to survive and to pass down genes through reproduction…" (ytc_Ugg8zOaOK…)
- "The robot cannot yet resonate or feel or wisch, because they have no brain. But …" (ytc_UgxH2LueD…)
- "Ok, I used ai once, and only because I needed a skyline, which then got heavily …" (ytc_UgyhBxLyE…)
Comment
IMHO, this coverage isn’t really presenting all the details in an entirely balanced way. They really should have Yann LeCun and\or Andrew Ng on to provide counterfactuals. To be fair Yann and Andrew are actual experts in AI as opposed to these guys who I would characterize as knowledgeable about what’s occurring in the field of AI currently or at least in the area of LLMs. You can find quite a number of “experts” on par with them all over YouTube and in related subreddits who would not present the current state of affairs in the same way as they have. I really wanted to resist writing the phrase “fear mongering” because I don’t think they’re intentionally doing that (I hope not) but I did pick up a bit of that vibe.
Platform: youtube
Posted: 2024-01-04T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwJARs-r336yTk9zmt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz84aLqbRacCQlAwid4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyEnNwWCYzFshNU25R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxt5CXv59Dzrkhfg194AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxN0Xe8QL9WM6DaKdJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyo6JHbisUsE3lOQat4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyLEYySBCqkdnBWwI54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzHy7OL4jT0RkcVRZB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxJ1vV5RJzr03UV6_J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz_tv9aBU9lgPjFxNt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
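A raw response like the one above can be parsed and sanity-checked before the coded values are stored. The sketch below is a minimal illustration, not the tool's actual pipeline: the allowed values per dimension are assumptions inferred only from the labels visible on this page (the real codebook may define more categories), and `parse_raw_response` and the example ID are hypothetical names.

```python
import json

# Assumed codebook, inferred from the values visible in this page's
# coding results -- the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "unclear"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed",
                "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the (assumed) codebook.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {value!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')
coded = parse_raw_response(raw)
print(coded["ytc_example"]["emotion"])  # indifference
```

Indexing by ID is what makes the "look up by comment ID" view possible: each coded record can be fetched in O(1) from the parsed response.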