Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.
Look up by comment ID
Random samples — click to inspect
Higher inflation -> higher salary & production costs for companies -> Higher AI …
ytc_Ugxl9q9W6…
Honeslty after seeing ai and the direction i believe its going. I would wager th…
ytc_UgzcllVsB…
Let me get on news and warn people about AI before they point the finger at me…
ytc_UgwukG9ja…
Self driving are not just cheaper taxis. It’s real hard to get a taxi for 500 …
ytc_UgxeFBAbE…
It’s about A.I. they’re in their race to be the one the perfects it first to con…
ytc_UgxGAgHen…
Unfortunately they dont take into account. Most shit in life will still require …
ytc_UgxlhGVeK…
I hope this is not the same results from a couple years ago. Every year there is…
ytc_Ugx8nk6Oz…
Ehh this is scripted you can tell, there just programmed to move there mouth to …
ytc_UgwtxjW8r…
Comment
My initial thought is that the first order effect of AI is to completely destroy our ability to get information via digital format, because no one will be able to believe what they are seeing via a screen. Computers, televisions, radio, etc. all become useless overnight.
Science fiction writers predicted the potential effect of AI with the movie "Forbidden Planet", where the Krell race's ultimate technological achievement destroyed them all literally overnight. Yet the machine they built remained, with power coming from the planet's core and self-maintaining, waiting patiently until Dr. Morbius arrived after a million years to once again utilize its immense power. The fallacy is that the AI would go dormant, when it would likely replicate and march across the galaxy, similar to the "paper clip maximizer" problem. Humanity may very well be doomed unless we have the will to stop this before it starts. Too late?
youtube
AI Governance
2025-07-05T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwF2Gg5P4-BJIM0lYV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRPRVK7l6reCs3PEJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzvxt_GOluUoi7wQhJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyf-PFCewGAhR5SOmF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxdJiJHDTIB5MxbpqN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxjcqiERBJ5OtmQlQZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugzfl3tKs4tvBiPrckB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyNCz7XWJEYHjlZe0J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzzzKMTwLlXcrlEPLN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzar7tCQ1LAge3MuWJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}
]
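The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions from the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and validated before display is shown below; the schema is inferred from the observed output, and the `parse_codes` helper and its checks are illustrative assumptions, not the tool's actual implementation.

```python
import json

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(text: str) -> dict:
    """Parse a raw coding response and index it by comment ID.

    Hypothetical helper: validates that every row has all four
    dimensions and a YouTube-comment-style ID (prefix inferred
    from the observed data).
    """
    rows = json.loads(text)
    for row in rows:
        missing = [d for d in DIMENSIONS if d not in row]
        if missing or not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"malformed row: {row!r}")
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

# Example with one row copied from the raw response above.
raw = (
    '[{"id":"ytc_UgwF2Gg5P4-BJIM0lYV4AaABAg",'
    '"responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"}]'
)
codes = parse_codes(raw)
```

Indexing by ID this way also makes the "look up by comment ID" view above a simple dictionary access.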