Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Tristan Harris and Aza Raskin run the Center for Humane Technology, a group dedicated initially to exposing the social and cultural harms of social media. They produced the very popular (and still worth watching) documentary “The Social Dilemma,” which showed how social media became, as Tristan quipped, “a race to the bottom of the brain stem.” Now they are also working on the dangers of AI. Both were formerly involved in the early stages of building social media, so they have substantial insider and technical expertise. More power to them.
Source: youtube
Posted: 2024-02-14T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxKnlgYk5QYCoAjFHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw_pOyn3NSygHKak7F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz4rm_JVVWpJrocYCp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxiEHwNfBdCheJdJbV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw-HckPqKO--Fgn7wN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxpBKP37xa9O4yxoqh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwjS-ZcidaOcg-g0FZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwJuvNJfzqnB-l1SEJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzdgRZmERiBucv6CPJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxLMZ5oY2SX0rWGSa14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
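The raw response is a JSON array with one record per comment, carrying the four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`) keyed by comment `id`. As a minimal sketch of how such a payload can be parsed and a single comment's coding retrieved, assuming only the field names visible in the response (the `lookup` helper and `raw_response` variable are illustrative, not part of the tool):

```python
import json

# Two records copied from the raw LLM response above; a real payload
# would contain one record per coded comment in the batch.
raw_response = """
[
  {"id": "ytc_UgxKnlgYk5QYCoAjFHR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw_pOyn3NSygHKak7F4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

def lookup(response_text: str, comment_id: str):
    """Parse a raw LLM response and return the record for one comment ID,
    or None if the ID is not present in the batch."""
    records = json.loads(response_text)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup(raw_response, "ytc_UgxKnlgYk5QYCoAjFHR4AaABAg")
print(record["emotion"])  # indifference
```

Because model output is not guaranteed to be valid JSON, production code around this would typically wrap `json.loads` in a try/except and flag malformed batches for re-coding rather than crash.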