Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- Works as a nice loophole for their customers to use too. Program a robot to do s… (rdc_dy5acof)
- I just tried this on chatbotAI abd it said it doesn't do role play, especially i… (ytc_UgwbSIXNp…)
- I never used AI in my life hahaha I wasn't allowed to do that in my time on writ… (ytc_UgwmWk_98…)
- There should be a disclosure at the beginning to say that the speaker has an org… (ytc_UgxV7slL8…)
- Ok, I haven’t watched this yet but the headline, only 5 jobs left? I just don’t … (ytc_UgxD6znPk…)
- Good, if everybody is getting fired because of AI, nobody will buy from Amazon a… (ytc_UgyYbC9jN…)
- Then do it. No one would bet an eye. Make your own and train an AI model on your… (ytr_Ugyb-rGFb…)
- This is NOT true. AI is a parlor trick, companies are learning quickly that AI i… (ytc_Ugxl3ubRB…)
Comment
> This scenario has a great problem: 1) The AI today is not good, it CONSTANTLY makes too many mistakes and constantly can only do 'punctual' parts that cannot match by being said "in parts" 2) Something no one is realizing, if you give the job to so many AI's you are only training the AI company you are hiring giving them more and more power, what if after that they put a company NEXT to you wil 10 times the amount of AI's you hired? how you can compete with that? in other words using AI is teaching your OWN enemy how to destroy you. because the problem is you do not "OWN" this agents you just hire them but the knowledge they gain is not yours. If the CEO or boss of any wor kdoes not realizes this , he really DESERVES to go extinct. Please prove me wrong.
youtube · Viral AI Reaction · 2026-01-05T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
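For context, a per-comment result like the table above can be rendered directly from one row of the model's JSON output. A minimal sketch, assuming the four dimension names shown in the table (the "Coded at" timestamp comes from the coding pipeline, not the model, so it is omitted here):

```python
# Render one coded JSON row as a markdown Dimension/Value table,
# mirroring the "Coding Result" display above.
DIMENSIONS = ["responsibility", "reasoning", "policy", "emotion"]

def to_markdown(row: dict) -> str:
    lines = ["| Dimension | Value |", "|---|---|"]
    for dim in DIMENSIONS:
        # Capitalize the dimension name for display, as in the table above.
        lines.append(f"| {dim.capitalize()} | {row.get(dim, '')} |")
    return "\n".join(lines)
```

Missing dimensions render as empty cells rather than raising, which keeps the display robust to partially coded rows.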
Raw LLM Response
```json
[
{"id":"ytc_Ugy23qcI_VKcSMNc_7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgybXAVtZV-AAW0mGcZ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugwckec4IPE-vzUMnEV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzETI22O969ThXH51l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwJj1Aym_INMkKjJg14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyv9MbZmsPgBwU_IXx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzsfquw0nFjLuk4m-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyOL4q-0zcifc1L9W14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhVvhAOzQlAcfjtHh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyV7GxwTPMF8M6xG8J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
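A raw response like the one above can be parsed and sanity-checked before its codings are stored. A minimal sketch, assuming the allowed value sets inferred from this sample output (the real codebook may define additional categories):

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# an authoritative codebook would replace these sets.
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "contractualist", "deontological", "unclear"},
    "policy": {"none", "industry_self", "liability", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def validate(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded
```

Indexing by comment ID is what makes the per-comment lookup shown on this page possible: `coded["ytc_…"]` returns that comment's four coded dimensions.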