Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
A friend of mine for some reason wants to know how durable that mouth is.…
ytc_UgyMNxhHU…
I've been an AI art defender for a long time now, and I fully believe it will ne…
ytc_UgxWkZbXH…
Why do you need human subscribers? Just let AI do it😂
Podcasters, influencers, c…
ytc_UgxwbNMB9…
This thesis argues that such assumptions are anthropocentric: they project human…
ytc_Ugw48YI08…
oh dont worry. All of us slaves will get fed after we pull 30 ton stones 15 mile…
ytr_UgwdYSYBh…
Alrighty, then let's ask the super-intelligence how we solve all the big problem…
ytc_Ugws3JrTu…
As a full disclaimer, I am not a visual artist but I am a writer so I'm also aff…
ytc_UgwygDxqh…
Black ops 2 is going to be a prophetic game decades from now. Most of the techno…
ytc_UgyGqH4Wu…
Comment
They did a fully researched study, and it was slated for 3 years from now, along with the fact that AI is willing to kill to survive. I used to say this but was always met with scepticism. If you think back, planes were completely sci-fi just a few years ago and even smartphones. So, I feel it will be the case where we have a massive leap in technology and very soon, since AI was so dumb just 3 years ago. Now it can already read and have context for a novel and 50 hour long YT videos with little issue, and cross-reference multiple different media to apply reasoning.
youtube
AI Governance
2025-12-04T08:4…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx54ES73pDjvV5uVvd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzqNV4o7MhLDWvQDMx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwy97P98Iq-qIePeX94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwxufZ9ZlVL45HFVn14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxziBVXBaiUno8LVgx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyMij204E8GzcVJ_sd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugya5XB0VmT-Br_-KN14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwY03rLgvjuSBfzbrV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwPTUY4UvsXggnOwaR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwBWDzgNapo2yLTutd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
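The raw response above is a plain JSON array, one object per coded comment, with four fixed dimensions. A minimal sketch of how such output could be parsed and validated before use, assuming the allowed values per dimension are exactly those seen in the responses shown here (the real codebook may define more):

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above (assumption: the actual codebook may be larger).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"outrage", "indifference", "fear", "resignation", "approval"},
}

# One record copied from the raw response above.
raw = """[
 {"id":"ytc_Ugwy97P98Iq-qIePeX94AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

def valid_ids(records):
    """Return IDs of records whose values all fall within ALLOWED."""
    return [
        rec["id"]
        for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

codes = json.loads(raw)
print(valid_ids(codes))  # → ['ytc_Ugwy97P98Iq-qIePeX94AaABAg']
```

Records that fail validation (an unknown emotion label, a missing dimension) can then be flagged for re-coding rather than silently stored.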