Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples — click to inspect

- "He ruined it the time he started saying: we might live in a simulation!!! So, so…" (`ytc_Ugwu0oONC…`)
- "That's really my go-to video for this discussion, it's aged extremely well and c…" (`rdc_glix6jv`)
- "Crypto is the only answer
  Money is printed and at anytime the government can gi…" (`ytc_UgzkMW5BM…`)
- "@41-Haiku You mean the people who have a vested interest in AI say laudatory thi…" (`ytr_UgztqE8so…`)
- "Ok I'm not an expert, just a regular guy. But am I the only one here thinking o…" (`ytc_UgxD2mnTB…`)
- "Yeah, its the driverless trucks that are displacing American truck drivers and n…" (`ytc_Ugy_IR3cv…`)
- "AI should take over from humans. Humans working 9-5 should be the past.
  Use the …" (`ytc_UgyJs42BQ…`)
- "What app are you using? I have the official ChatGPT app on Android and it does n…" (`ytc_Ugyvnjp2F…`)
Comment
Hollywood has only warned us about out of control AI that takes over the world hundreds of times. But, here we are out of our greed, the architects of our own extinction. The kind of things people are saying now about AI. Is very similar to what people said about the atomic bomb and nuclear energy. Which is basically this, it could make life better for us. Why wouldn't you want it?
Source: youtube · Topic: AI Governance · Posted: 2024-02-18T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyAh9FaV6xTGYqHtwV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzscWgx0DH0Xy-NUTp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw2Da1a0oSmHli4jYN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy3JiCMrSg83ASF_194AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy9Cux6UaCOig6rnBd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCtnKNzEB2n15bf-x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyN8UPVHoZtmRMiydt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyAK70iDb2NwIrn7kx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyrCrKf6Wf9J32hSLl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwckGDFTI9MPe3jyXV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
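The look-up-by-ID view above implies that each raw batch response is parsed and indexed by comment ID, with each record checked against the codebook. A minimal Python sketch of that step, assuming the allowed label sets are exactly the values visible in this dump (the real codebook may define more categories; `validate_batch` is a hypothetical helper, not the tool's actual code):

```python
import json

# Label sets inferred from the values seen in this dump (assumptions;
# the real codebook may allow additional categories).
RESPONSIBILITY = {"government", "developer", "ai_itself", "distributed", "unclear"}
REASONING = {"consequentialist", "deontological", "mixed"}
POLICY = {"none", "ban", "unclear"}
EMOTION = {"outrage", "fear", "resignation", "indifference"}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID,
    rejecting any record with a missing field or out-of-codebook label."""
    coded = {}
    for rec in json.loads(raw):
        assert rec["responsibility"] in RESPONSIBILITY, rec
        assert rec["reasoning"] in REASONING, rec
        assert rec["policy"] in POLICY, rec
        assert rec["emotion"] in EMOTION, rec
        coded[rec["id"]] = rec
    return coded

# Two records copied from the raw response above.
raw = '''[
{"id":"ytc_UgyAh9FaV6xTGYqHtwV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzCtnKNzEB2n15bf-x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''
coded = validate_batch(raw)
print(coded["ytc_UgzCtnKNzEB2n15bf-x4AaABAg"]["emotion"])  # fear
```

Indexing by ID also makes duplicate detection cheap: if the model emits the same comment ID twice, the later record silently wins here, so a production version might want to flag that case instead.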