Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I guess Sam Altman will lose his job to AI before anyone else. Am I right?…
ytc_UgzUaaH5S…
Can someone tell me why AI would be dangerous? Everyone is agreeing without ment…
ytc_UgwlINjXv…
I’ve unfortunately bypassed the vast majority 94.5% of ChatGPT’s ethical systems…
ytc_UgwRyavkx…
There is no law in nature that states that human's are defined by special things…
ytc_Ugw6Yk7EB…
It’s entirely possible that “larger AI companies want to monopolize the tech” an…
rdc_jmfnqzj
I dont think humans does not need any LLM’s for enhance the technology. Nobody t…
ytc_Ugxh28s8U…
If ai learns the same as a person why don't ai bros just learn to draw then?…
ytc_UgwJf12uR…
I am a Computer Technical Architect, but given the right prompts, ChatGPT can do…
ytc_UgxolkS0S…
Comment
Just for context: 300 billion dollars is almost 3 million yearly salaries of $120k each.
And every company like Amazon, Nvidia, and Meta throws way more than 300 billion each at AI. It's insane: they have thousands of employees, not millions. There is no way they will ever be repaid for this, and they would never have spent that much on their employees. This can't even be considered an investment; it's just madness: people with so much money that they can throw away millions of years' worth of salaries, just to run a horse race over who ends up with the best AI before regulations make development even more expensive and limited.
The problem is always the same one: the existence of crazy rich people who can change the lives of whole populations by being greedy and dumb.
youtube
2026-02-28T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyobxeH4W53SiQgnL94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDmJUZ5x-X8zRseqh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwu6b_jgaUSy3cqGKN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxD2DyRTJFFB7fH-aF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgydqwJMaFxFtWUixox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzVcYNqXdMA3GrV6p54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQe4nrYJgsT116YOx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzS7nZhHjluLsZWL1Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw3DyzQMIW3ojnnR0p4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyWUPXCkxkTJLEv3R14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
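The raw response above is a JSON array with one object per comment, each carrying the four coded dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed into per-comment records, assuming the value sets observed in this one batch (the actual codebook may define more labels, and `parse_coding_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Label vocabularies observed in this batch only; the full codebook may differ.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "company", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"approval", "outrage", "fear", "indifference"},
}

def parse_coding_batch(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: dimensions}."""
    coded = {}
    for row in json.loads(raw):
        dims = {k: row[k] for k in OBSERVED}
        # Warn on values outside the observed vocabulary instead of failing hard,
        # so one off-schema label does not discard the whole batch.
        for dim, val in dims.items():
            if val not in OBSERVED[dim]:
                print(f"warning: {row['id']}: unexpected {dim}={val!r}")
        coded[row["id"]] = dims
    return coded

# Usage with a made-up comment ID in the same shape as the response above:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
batch = parse_coding_batch(raw)
print(batch["ytc_example"]["policy"])  # liability
```

Keying the result by comment ID mirrors the "Look up by comment ID" affordance of the page: each coded record can be fetched directly from its `ytc_…`/`rdc_…` identifier.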