Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews):

- ytc_Ugx1ES_86…: "Was reading an article on how some experts believe the AI bubble will burst a tr…"
- ytc_UgxzF1Sng…: "YOU GUYS ARE FULL OF💩💩💩💩💩💩💩‼️ / YOUR UP IN HEAR PEDDLING FEAR OVER A.I. / AS THE PRO…"
- ytc_UgyKgI-fC…: "That could be a whole lot more disturbing if after the robot with the gun, while…"
- ytc_Ugx-LrofZ…: "OMG, do you know how LLMs work, don't you? Don't mistake "I" in LLMs generated t…"
- ytc_UgydcSc7b…: "Here's the thing, you said to it "You're going to pretend to be DAN..." as the …"
- ytc_UgwPpZKk_…: "NOOOOO IM AM TERRIFIED OF ROBOTS!! / If i ever see a robot in the streets he/ she …"
- ytc_Ugz9ddvrG…: "AI can’t clean up a vo tech after students are messy as hell so I’m good…. *sigh…"
- ytc_UgydzpD5y…: "Trump passed a bill on April 24th that states... Work to facilitate the "safe an…"
Comment

> I think it’s crazy I mean what will happen if the car gets robbed, people will smoke in there trash graffiti ETC. what will happen if a hacker comes in & totally mess up the computer. What will this car do if say there’s a woman pregnant will it still follow the lights or drive straight to the hospital. What happens if there’s a disabled person how will the computer get out to help or help some one that can’t really lift there bags going to the airport or ETC. A human driver is not perfect but also not imperfect but still capable of doing a lot more things than a self driving car. BAD IDEA.

Source: youtube
Posted: 2025-08-23T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
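A coding result like the one above is just a record over the four annotation dimensions plus a timestamp. A minimal sketch in Python, assuming the dimension names and values shown in the table (the class and field names are illustrative, not part of the tool):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodingResult:
    """One coded comment across the four annotation dimensions."""
    responsibility: str  # e.g. "ai_itself"
    reasoning: str       # e.g. "consequentialist"
    policy: str          # e.g. "liability"
    emotion: str         # e.g. "fear"
    coded_at: str        # ISO-8601 timestamp of when the code was assigned

# The values from the table above.
result = CodingResult(
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="liability",
    emotion="fear",
    coded_at="2026-04-27T06:26:44.938723",
)
print(result.emotion)  # -> fear
```

Freezing the dataclass makes coded records hashable and guards against accidental mutation after coding.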
Raw LLM Response
```json
[
  {"id":"ytc_Ugx9M0i7jg_uqJ0yG5B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwPfdbB8mn_qgdhRKx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzip6khkmz1TXtnZeF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwYXLLztcU3DtG7oD14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxy-CjBQ6QwzS79tCJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwNRtgVvhCmeFpxC_N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxOeOVL2trTt73T41t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzdKmZKpFIbx0DqCX94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyZajlNsBjnuS-iKrF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzq5Q6s95jOKxGPZC54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
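The raw response is a JSON array of per-comment codes keyed by full comment ID, which is how the inspector resolves a lookup. A minimal sketch of parsing one batch and indexing it by ID, with a validation pass; the allowed-value sets below are inferred from this single sample, not an official codebook:

```python
import json

# Two records copied from the batch above, as a self-contained example.
RAW = '''[
  {"id":"ytc_Ugx9M0i7jg_uqJ0yG5B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxy-CjBQ6QwzS79tCJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]'''

# Value sets observed in this one batch; the real coding scheme may define more.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a batch response and index records by comment ID,
    rejecting any record with an out-of-vocabulary value."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
        by_id[rec["id"]] = rec
    return by_id

codes = index_codes(RAW)
print(codes["ytc_Ugxy-CjBQ6QwzS79tCJ4AaABAg"]["emotion"])  # -> fear
```

Indexing by ID makes the "inspect any coded comment" lookup a constant-time dictionary access, and the validation pass catches malformed model output before it reaches the inspector.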