Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> What really freaks me out is that the people in the background look like deepfakes. This oddly even lighting, the telephoto-lens effect making them seem an inch too big compared to the congressman. The whole video resides in the middle of uncanny valley.

Source: youtube · Topic: AI Governance · Posted: 2023-05-24T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_UgyFAdbHbs0MVQAjds94AaABAg", "responsibility": "government",  "reasoning": "deontological",    "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgwTsRy-bB49HxDpvdh4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyHVRYtvZZpdK0b2Dt4AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgwBrVAa4Spdg9oq07B4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "approval"},
  {"id": "ytc_Ugw3HU4tdR1fggThEMt4AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgwWwEOoUr8kOszriIV4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_UgxMSI8vrFnbtMCXuDJ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_Ugz1lszndqdFBQ4w8rF4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgwsoSmggmLH5wH1j9B4AaABAg", "responsibility": "distributed", "reasoning": "deontological",    "policy": "regulate",      "emotion": "mixed"},
  {"id": "ytc_UgyE9Q2PxIeJqtevMk54AaABAg", "responsibility": "distributed", "reasoning": "deontological",    "policy": "regulate",      "emotion": "outrage"}
]
```
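A raw response like the one above can be parsed and sanity-checked before the per-comment coding results are stored. The sketch below is a minimal, hypothetical consumer of this format: the four dimension names come from the coding table above, but the sets of allowed values are inferred only from the responses shown here — the actual codebook may define more categories. Out-of-codebook values fall back to `"unclear"`, mirroring the fallback visible in the coded result.

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"government", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference", "outrage", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records).

    Each record keeps its comment id; any dimension whose value is
    missing or outside the known codebook is coerced to "unclear".
    """
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                rec[dim] = "unclear"
    return records

# Example: a record with an out-of-codebook responsibility value.
raw = ('[{"id": "ytc_example", "responsibility": "gov", '
       '"reasoning": "deontological", "policy": "regulate", "emotion": "fear"}]')
print(parse_coding_response(raw))
```

Coercing to `"unclear"` rather than dropping the record keeps the coded dataset aligned one-to-one with the input comments, which matters if downstream counts assume every comment was coded.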