Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "@Ok_waffleit stands for character ai, where you can talk to ai versions of ficti…" (ytr_Ugxa_CaYz…)
- "Absolutely. Unfortunately he's not selling pretty vacuum cleaners, he's selling…" (ytr_UgyNUsmnQ…)
- "What would happen if, theoretically, a person were to record everything they do …" (ytc_Ugxz-1R0p…)
- "Highly exaggerated and overated? Lol,this guy is a pioneer in AI and has won num…" (ytr_Ugx-BV36q…)
- "AI will not take over every one’s jobs. There would be war and mass chaos in the…" (ytc_UgyFBkn9r…)
- "A.I. calling itself MechaHitler is all the proof I need that it isn't good, for …" (ytc_UgzS9aCKN…)
- "Honestly after a proper education in machine learning (I specialized in reinforc…" (ytc_Ugxx9fy1M…)
- "Why push AI into art? I don't understand it, using it as like a goofy way to gen…" (ytc_UgyhglJcg…)
Comment
In 1906 John Philip Sousa warned that recorded music will destroy all musical ability. In the 1920's Leo Baekeland predicted that plastics would solve all industrial problems. In 2008 Bill Gates predicted that speech would replace keyboards by 2013. Doomsday predictions have always been a part of recorded human history. Humans evolve, learn and adapt. Dr. Yampolskiy, while obviously a bright fellow, is considering AI's evolution while believing that humans are static and not evolving right along side.
Platform: youtube · Video: AI Governance · Posted: 2025-09-08T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwNZcOWd3YQ7cAS2Ep4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9K5DybkKU4akq8iZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw8kNRnWSBnvbojN1B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyTOhzqSxGRjlfei5p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwhe90Ce0Or9iooAqV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyuJ6o9LMEaShnc-Ft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzuL9fDcxdtLFQgEt14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwY_4CNadQ-VrYiF6B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwayJ6dNzSlk3rERyx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgylY8k3Kta1enfgb8l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
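The raw response is a JSON array with one object per comment, each carrying the four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of the "look up by comment ID" step might parse that output and index it, assuming the model returns exactly this array shape; the abbreviated `raw_response` literal below reuses two entries from the response above, and the function name is illustrative, not the tool's actual API:

```python
import json

# Abbreviated copy of the raw LLM response above (two of the ten entries).
raw_response = '''[
  {"id": "ytc_UgwNZcOWd3YQ7cAS2Ep4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgylY8k3Kta1enfgb8l4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"}
]'''

# The four coding dimensions plus the comment ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model output and index codings by comment ID,
    skipping any entry that is missing an expected dimension."""
    entries = json.loads(raw)
    return {e["id"]: e for e in entries if EXPECTED_KEYS <= e.keys()}

codings = index_codings(raw_response)
print(codings["ytc_UgylY8k3Kta1enfgb8l4AaABAg"]["policy"])  # regulate
```

Indexing by `id` makes the inspect-by-comment-ID lookup a dictionary access, and the key check drops malformed entries instead of failing the whole batch.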