# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples
- "Confused ".. When AI starts take over, no more mistakes would take place .. ro… (`ytc_UgxM7B5YM…`)
- My true fear is that AI is already passed where the public believes it to be and… (`ytc_UgxkEqlMm…`)
- I don't understand what she's complaining about; if you don't like the job, then quit, what's the problem? … (translated from Russian) (`ytc_UgyIqmL-t…`)
- Remotasks & Outlier are bringing data slaves to Scale Ai, which in turn feed Ope… (`ytc_UgxTXMCVB…`)
- ChatGPT is like a high school kid writing an essay. It’s an amalgam of other peo… (`rdc_j8axmgg`)
- charlie you own the copyright to this video and yet there is no skill to what yo… (`ytc_Ugw_B5PmY…`)
- Hey, is I may add my two cents. Having AI make stuff for you is bad. But making … (`ytc_UgwY9Ror3…`)
- +1 for this; AI is a tool; it either boost you or kills you. If someday AI manag… (`ytr_UgzavelR7…`)
## Comment
There's no robot in the video, but at the end of the day, AI wouldn't need to be a physical entity to destroy humanity.
It would just need to destroy our infrastructure through being embedded in our technology, and humans would turn on each other when the chaos ensues.
I find this more concerning than the "killer robots" you see in most sci-fi movies because it's an enemy you can't see, but it is all around you.
Source: youtube · 2025-11-01T17:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
  {"id": "ytc_Ugy76Gnp8Y62vtSqxVd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx8Ey4haO1i2hHhDVF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxoR0Qe7qaaGPauBt14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzRizkmtIp2AZ5RkQl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyCOF1BRVPY8e-1Iud4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgylK_f-xoEicfQwBd94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyJA7Fmd070-Ma2Ar54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwFfJmAh7dz2wf-EEV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxN1Y98hEBMWQ6FE5F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwxVg1T6e2_a1DOkMd4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
```
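The raw response is a JSON array with one object per coded comment, carrying the same four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of turning such a payload into an ID-indexed lookup, assuming only the structure visible above (the `parse_codes` helper and the two-record sample payload are illustrative, not part of the pipeline):

```python
import json

# Illustrative payload in the same shape as the raw LLM response above:
# a JSON array of per-comment coding objects.
raw = '''[
  {"id": "ytc_Ugy76Gnp8Y62vtSqxVd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzRizkmtIp2AZ5RkQl4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]'''

# The four coding dimensions, as named in the response objects.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(payload: str) -> dict:
    """Index coded comments by ID, keeping only the coding dimensions."""
    records = json.loads(payload)
    return {r["id"]: {dim: r[dim] for dim in DIMENSIONS} for r in records}

codes = parse_codes(raw)
print(codes["ytc_UgzRizkmtIp2AZ5RkQl4AaABAg"]["policy"])  # ban
```

Indexing by comment ID mirrors how the dashboard resolves a lookup: the ID shown next to each sample is the key into the model's coded output.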