Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples

- "The second issue is that AI is expected to take away most of human jobs in the n…" (ytc_Ugwals-yy…)
- "yep...Hal 9000 series.. def the worst case scenerio for AI. but humanity seems h…" (ytc_UgxKNhKtP…)
- "I’ve unfortunately bypassed the vast majority 94.5% of ChatGPT’s ethical systems…" (ytc_UgwRyavkx…)
- "The irony is I think AI could be used to bring us to a better world. We just nee…" (ytc_UgyUXjjAu…)
- "Companies that use A,I. Will face many hurdles to jump!! This should be heavy ha…" (ytc_UgwDiO3k-…)
- "And all of the art she made still looks so much better than anything i have ever…" (ytc_UgyylpqcI…)
- "That is not how it works. If you create something, then it is automatically your…" (ytr_Ugyke_stc…)
- "Only way to stop this is through local involvement and local government support …" (ytc_UgwjUGpGz…)
Comment
The other problem is that if you fire your human employees and then leave it all to your AI agents, then what's to stop the company that owns your AI agents from suddenly jacking up the price per agent? Or worse, simply just stealing your business right out from under you? If a company that loans out AI has "agents" that do 100% of whatever it is that your company does... then... that company can simply print it's own AI agents somewhere else, form a new startup, and simply just out-compete your business. It's not just the the workers who end up losing, but also the owners and eventually even the investors. At the end of the line, only those who control the robots will have any power, which will be boiled down to nearly just a handful of humans on the entire Earth. Owning stocks and shares or legal protections and rule of law won't make anyone special.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | Viral AI Reaction |
| Posted | 2026-01-20T10:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxEujdwGSYCVUe7RiF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz7soFGPb1BWKa_nyN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwQTOlLNBIYi0ly_at4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5n9iRCEFEqj3NPjF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwzZyLbohjpiw7Byll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwirjx6o-w58vl4Hm54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxhyuXWxSPytP4uVDR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw-U99stBOK4AtKIgV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzW5J9zNTiDv0wUsSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwwYTLLp73b0dnzhhN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
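The "look up by comment ID" step above amounts to parsing the raw LLM response as JSON and finding the row whose `id` matches. A minimal sketch, assuming the raw response is a valid JSON array like the one shown; `lookup_coding` is a hypothetical helper, not part of the tool itself:

```python
import json

# Raw model output as returned by the coding LLM (two rows copied from the
# sample response above).
raw_response = """
[
 {"id":"ytc_Ugw-U99stBOK4AtKIgV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugz7soFGPb1BWKa_nyN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw LLM response and return the coding row for one comment ID,
    or None if the model did not code that comment."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

row = lookup_coding(raw_response, "ytc_Ugw-U99stBOK4AtKIgV4AaABAg")
print(row["responsibility"], row["emotion"])  # company fear
```

Keeping the raw string around (rather than only the parsed dimensions) is what makes this kind of exact-output inspection possible when a coding looks suspicious.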