Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "Personally, I think early AI art had the biggest amount of charm. Sure, it was w…" (`ytc_Ugz_ZIDmR…`)
- "This is what happens when you let your ai use Twitter as part of the dataset.…" (`ytc_UgyFZzmkI…`)
- "They aren't creators if AI is doing all the work for them. They are just manager…" (`rdc_oh1coon`)
- "Wouldn't ai slop prompters be more replaceable? Especially since anyone can put …" (`ytr_UgxtA1ptG…`)
- "@kenbob1071 - Pilot inattention caused more crashes than automating airplane fli…" (`ytr_UgxylhOnk…`)
- "Even if you drew just a stick figure on a paper, that would be more talented tha…" (`ytr_Ugx-uyt8L…`)
- "I really hope folks kno it was 2 men fighting and they put the robot in the oth…" (`ytc_UgxrgAyJd…`)
- "There is a misconception (largely the fault of Tesla and Musk himself) that Auto…" (`ytc_UgwmJa2Kg…`)
Comment
Why does no one mention that a key reason they need all of those centers is to conduct surveillance of the entire population in real time, constantly reporting on their movements, and matching them with a massive database of personal information and personal histories? This push for data centers comes EXACTLY as the US is adopting surveillance systems used by Israel to do exactly that to Palestinians. Moreover, the integration with military resources would enable government to target people for assassination using drones or isolation from all kinds of support.
Source: youtube · Dataset: Cross-Cultural · Posted: 2026-01-27T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx7DPX_jN4i2ZI9HYp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxOqjjj_dEO9ti4HDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwB9XmKf3EJ_dMJMHp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzJnf-Nf0Rq3a6VJN54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx4gCGtifddfrz4RMp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzZHnPt-G4XaJGyz1p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxU8bPryd-yeq29fr94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx2GjZy_-o787RSOoV4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx1J8CFEq-NWzIBElF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyCvIUzdsR--UcQ6SJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
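The raw response is a JSON array with one object per comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a batch could be parsed and indexed by ID is below; the allowed category sets are inferred only from the values visible on this page (the actual codebook may define more), and `parse_batch` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the examples above.
# ASSUMPTION: the real codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"company", "government", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"outrage", "mixed", "resignation", "fear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID,
    rejecting any value outside the (assumed) schema."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Example lookup by comment ID (hypothetical ID, same shape as the real ones):
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Validating against a fixed category set at parse time catches the common LLM-coding failure mode where the model invents a label outside the codebook, rather than letting it silently enter the coded dataset.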