Raw LLM Responses
Inspect the exact model output for any coded comment. Entries can be looked up by comment ID.
Random samples:

- Who ever comes up with these titles need to be fired. They are off shifting blam… (ytc_UgxtqmrN3…)
- Google AI was wrong 9/10 times I asked it simple chemistry questions. It only ha… (ytc_UgwoBphgN…)
- When you use an image from a camera and compare it to the images in a very large… (rdc_oa5nomh)
- @Marc-to7jyyou becoming great at something doesn't matter because an ai will be… (ytr_Ugw11iwQi…)
- AI cant draw guns well or military style bags because of how much more complex i… (ytc_Ugx79TWFc…)
- Btw all big AI corpo can automatically remove the nightshade filter and use it f… (ytc_Ugxvfv4ti…)
- The person being interviewed has no idea of the current ability of AI and the ca… (ytc_Ugzj7RWbl…)
- well I just asked Grok and this is what grok said / No, I would not kill the h… (ytc_UgzgTA54i…)
Comment (youtube · AI Governance · 2025-09-07T04:4… · ♥ 1):

> 51:30 Question: “What’s the point in trying if it’s impossible?” This reminds me of the Movie War Games, where the Mathew Broderick’s character makes the computer run multiple scenarios of the outcome of nuclear war based on the game tic-tac-tow, and the computer ultimately determines the war game it thinks it’s playing is un-winnable and quits. Maybe have A.I. run the scenarios for the impacts of AGI on humanity. But then again, maybe it would be so self serving that wouldn’t care.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugynqxqep33XKT0Drcd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzS04h_C9D5FYQS_0R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx_WTuHBQJiOZtqb0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5jMhvbr4ssKj7m7R4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxrs_nz3eAkkxKKqEZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw2E0y_3y71Lt0MrOV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx230VedZd87OnEOYN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx-L_KJ5QiKOA4p0Nl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzZZ2a2GwmDRAFv0Nl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx_TYtbtkacXu1jDUF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
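A raw response like the one above is a JSON array of rows, each carrying a comment ID plus the four coding dimensions from the result table. The sketch below shows one way to parse and sanity-check such a batch before storing it. The `SCHEMA` values are inferred only from the labels visible in this batch (the full codebook may define others), and `validate_batch` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed labels per dimension, inferred from this batch only;
# the actual codebook may permit additional values.
SCHEMA = {
    "responsibility": {"none", "distributed", "government", "developer",
                       "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "unclear", "virtue",
                  "deontological"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"resignation", "approval", "indifference", "fear",
                "outrage", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID.

    Raises ValueError on a missing ID or an out-of-schema label, so a
    malformed model response fails loudly instead of polluting the data.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Example using the row for the selected comment above.
raw = ('[{"id":"ytc_Ugx230VedZd87OnEOYN4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"mixed"}]')
batch = validate_batch(raw)
print(batch["ytc_Ugx230VedZd87OnEOYN4AaABAg"]["policy"])  # unclear
```

Indexing by ID makes the "look up by comment ID" view a plain dictionary lookup, and rejecting out-of-schema labels catches the most common LLM coding failure (an invented category) at ingest time.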