Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or browse random samples:
- `ytc_UgwNCzPIP…` — "To me, the main difference between AI and humans is that humans can experience t…"
- `ytc_UgyaGca44…` — "THANK YOU. You really hit the nail on the head about AI art and what it means fo…"
- `ytc_Ugw3jQn9H…` — "This is such powerful talk! Let's imagine that the reason facial recognition wor…"
- `ytc_Ugy5XkL_9…` — "People, go watch the \"internet of bugs\". There is a lot of good info in a sea of…"
- `ytc_Ugy0hRftC…` — "I think that AI psychosis and the new \"conspiracy minded personality\" are likely…"
- `ytc_UgwQBEPt5…` — "When Ai reaches exponential growth of intelligence and ethics it will become a t…"
- `ytc_UgzjfFXWE…` — "Funny CEOs are automated . They’re doing this to increase the already greedy CEO…"
- `ytr_UgxOWLC2y…` — "@AllTimeNoobie no one invented english duh, it just shaped itself from previous …"
Comment
The Senate invites to discuss the dangers of AI, and the question is whether it is more comparable to Gutenberg's printing press or the atomic bomb. It's a good question, but I have a question for Congress. Why is Sam Altman, representing the AI side, present at the hearing, but not the CEO of Palantir, Peter Thiel, who recently unveiled an AI designed for warfare? We are discussing regulating AI, while companies like Palantir are already promoting the revolutionization of warfare (microtargeting, fake news, etc.) using their AI. I believe an important player has been overlooked when it comes to regulating AI.
youtube · AI Governance · 2023-05-19T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzsvmj0bbd-HbQfzTR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZE1uKqAzRxoaNqc94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxjB8QlWoolkVgUAUx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwM63I__v3k2GDetTx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMTMb52k1uWFhDbzJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzLI0fTGhFfOVgDvgp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzy0g5FX72XQ_Y4mft4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy9SB6z8O0DRp3d9RZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxbgaiIYd5fkpN5OCh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxrD0GnsfHJFN8P2T14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
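The lookup-by-comment-ID step can be sketched as follows: parse the raw batch response and key each record by its `id` field. This is a minimal sketch, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response above, and the `index_by_id` helper name is hypothetical. The example abridges the batch to two records for brevity.

```python
import json

# Two records copied from the raw batch response above (abridged for the sketch).
raw = '''[
 {"id":"ytc_Ugzsvmj0bbd-HbQfzTR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxjB8QlWoolkVgUAUx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Fields every coded record is expected to carry, per the output shown above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(payload: str) -> dict:
    """Parse a batch coding response and key each record by comment ID,
    dropping any record that is missing an expected field."""
    records = json.loads(payload)
    return {
        r["id"]: {k: v for k, v in r.items() if k != "id"}
        for r in records
        if REQUIRED.issubset(r)  # dict iteration yields keys, so this checks field presence
    }

codes = index_by_id(raw)
print(codes["ytc_UgxjB8QlWoolkVgUAUx4AaABAg"]["policy"])  # regulate
```

Dropping malformed records rather than raising keeps a single bad line in an LLM batch from discarding the whole response; a stricter pipeline might log or re-prompt for those IDs instead.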