Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The abysmal quality of their writing, to the constant blaming and accusing peopl…" (ytc_UgzcpHCNf…)
- "AI is getting smart in such a way that humans will not be able to compete.” That…" (ytc_UgwvUIIsA…)
- "I didn't even know such a thing could confuse ai. Not sure why people would put …" (ytc_UgyuTrXSV…)
- "It's not just math or an algorithm is it though? It's questions written by biase…" (ytc_UgwEeWXib…)
- "I really hate the media and its fear-mongering. Imagine saying things like this…" (ytc_Ugz9l7Za-…)
- "This was great. I’ve always hated that the only reason people seemed to dislike …" (ytc_UgyN_V4_j…)
- "worst advice given is to pick up AI skills / whatever you pick up will be automate…" (ytc_Ugx6QFlLZ…)
- "The two major issues not being addressed: 1) Huge numbers of people will be vyin…" (ytc_Ugxot0OQi…)
Comment
> The "Chernobyl of AI" concept is scary, but the "Subscription Fatigue of AI" is already here. I respect the safety debate, but practically speaking, I just need tools that work without costing $100 a month. Found omnely recently and it’s been a lifesaver for accessing the major LLMs and image gens under one cap. At least I can save money while we wait for the robots to take over.

youtube · AI Governance · 2025-12-07T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgylOcMtmfYPRLyA_uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz0s_5F0fL7Yc6h9pB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwCTFAC3tuaqQyd4rJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmHU68lQswaDEmhOd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9c5M8aiFACFvwDkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzslRkuK_KSVVjq6CV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxqiq20CC4lLEtT6Oh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAcj9D3tb7hktFuIJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUI-XmKN6ijviTF6N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyW72yvXfzgauYvOVF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
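The raw model output is a JSON array with one coded record per comment, so the "Look up by comment ID" feature reduces to parsing the array and indexing it by the `id` field. A minimal sketch in Python (the two records and the field values are illustrative copies from the response above, not the full dataset):

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment.
raw = """[
  {"id": "ytc_UgwCTFAC3tuaqQyd4rJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzAcj9D3tb7hktFuIJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

records = json.loads(raw)

# Build a dict keyed by comment ID for O(1) lookup of any coded comment.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgzAcj9D3tb7hktFuIJ4AaABAg"]
print(rec["emotion"])  # fear
```

Indexing once up front is what makes repeated lookups cheap; scanning the array per query would be O(n) each time.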