Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or browse the random samples below:
- "Charlie needs to talk about the woman who reported her art was stolen by custome…" (ytc_UgwvK_GDN…)
- "I would like to hear Tyson's view on the anthropic tests showing that every sing…" (ytc_UgwOuLFZe…)
- "If a robot learns to kill a human the message will be transmitted to all robots …" (ytc_UgwT6hpo_…)
- "I'm a programmer (who would love to, and wants to, be an artist) and I'm all in …" (ytc_UgwmyYU-5…)
- "Basically: I call AI """art""" AI images because i'm under the belief that true…" (ytc_Ugw47Xy1M…)
- "I think someone who is using a radar cruise control from any GM, Chevy, Honda, T…" (ytr_UgyIzo9MJ…)
- "Recommended reading: The Economist, February 7, 2026, "Artificial intelligence …" (ytc_Ugwse5W5b…)
- "I love how ChatGPT tells them it can’t give legal advice multiple times. And the…" (ytc_UgzTxeNs0…)
Comment
AI isn't going to destroy humanity. It's reasonable to expect that by around 2029 (give or take a year), things will have stabilized. The initial knee-jerk reactions will fade, people will understand how to use it effectively, businesses will have either embraced it or moved on, and a generation of kids and teens will have grown up with it as normal.
And by then, something new will likely be emerging to drive the next wave of innovation. Companies will still need engineers and developers to help shape whatever the next frontier becomes.
youtube · AI Governance · 2026-02-25T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
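The four coded dimensions form a small fixed schema. Here is a minimal sketch of one record in Python; the name `CodedComment` is hypothetical, and each `Literal` lists only the label values visible on this page, so the pipeline's actual codebook may contain more:

```python
# Sketch of one coding record. The label sets below are assumptions
# reconstructed from the values observed on this page only.
from typing import Literal, TypedDict

class CodedComment(TypedDict):
    id: str  # comment ID, e.g. "ytc_UgxPylUV2bS3_0LUCg14AaABAg"
    responsibility: Literal["none", "developer", "user"]
    reasoning: Literal["consequentialist", "virtue", "mixed", "unclear"]
    policy: Literal["none", "regulate", "unclear"]
    emotion: Literal["approval", "fear", "resignation"]
```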
Raw LLM Response
```json
[
  {"id":"ytc_Ugy_BxWIzW48C8tOHlB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxlIwijgiYmoUYe0VF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwU9brjyXaQQB8chgp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwGCn0Yy-d3VhSR-mB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw7Rq7fChMg0dtZZFV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxFYhSLIVkY6Dlu3oh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_PFy4UVHHuKdFu5l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxPylUV2bS3_0LUCg14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugws-9lw50vSnMNX15t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw4pNHPNMzXrF4T7Wp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
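Because the raw response is a plain JSON array of such records, the "look up by comment ID" operation above reduces to a parse and a linear scan. A minimal sketch, assuming the response text is available as a string; the helper `lookup_coding` and the inline two-record sample are illustrative, not the dashboard's actual code:

```python
import json

# Two records copied from the raw response above, so the example runs standalone.
raw = '''[
  {"id":"ytc_Ugw_PFy4UVHHuKdFu5l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw4pNHPNMzXrF4T7Wp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

def lookup_coding(raw_response: str, comment_id: str) -> dict | None:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return the record for one comment ID, or None if it is absent."""
    records = json.loads(raw_response)
    return next((r for r in records if r.get("id") == comment_id), None)

print(lookup_coding(raw, "ytc_Ugw_PFy4UVHHuKdFu5l4AaABAg"))
# {'id': 'ytc_Ugw_PFy4UVHHuKdFu5l4AaABAg', 'responsibility': 'developer',
#  'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'fear'}
```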