Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Yeah let's create something that when a robot learns something BAD it is stored …
ytc_UgwfSMV16…
At the end you will be slave to the ruling party if they are communist you will …
ytc_Ugy_OXe83…
AI does not really exist. What we have are complex deep artificial neural networ…
ytc_UgznJqZOv…
Imagine not calling Waymo’s support to let them know the issue and informing the…
ytc_Ugzs0HfN-…
Although i'm not good at art, i would much rather look at the poorly shaded, mes…
ytc_UgwSvkzhO…
honestly as an artist i dont care about ai 🤷‍♂️
like im a worldbuilder aswell and …
ytc_UgwQ5PSqw…
Like ya dude adapt...get over yourself your not fucking special...get it. No one…
ytr_Ugye2bZCq…
A.i isn't smarter then Us it just going off data we created but it is faster th…
ytc_UgySZhUpf…
Comment
I'm sorry Joss, but how did the only two people in this video that actively work in the tech industry, that are building these automated systems, only have a combined 5 minutes on screen? You don't talk to the computer scientists about solutions or even the future of this tech, but yet you talk to Dr. Benjamin and Dr. Noble (who don't code) about "implications" and examples in which tech was biased. Very frustrating as a rising minority data scientist myself, to see this video focused on opinion instead of actually finding out how to fix these algorithms (like the description says.)
Missed an excellent opportunity at highlighting minority data scientists and how they feel building these algorithms.
youtube
2021-03-31T14:5…
♥ 313
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzCvUw9N2Qf6TaJ8wN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyD8G2P0OCI17Quaah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwobBul9LqBuGR2ti14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgycKzStdbE_ICzUDzl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw2gETSMTTllbGe6st4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfjJTOd0bckfPlg3l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzawiy5L9WG1U5wXWl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw7Jf4-yborCOl4-EZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwTB6G1NbnsquasAxh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx7fSytD8SkBg2tRe54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
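The batch response above is plain JSON: an array with one object per comment, carrying the same four dimensions shown in the result table. That makes the "look up by comment ID" step a simple indexing pass. A minimal sketch in Python, assuming the response text has been captured in a string (the `raw` variable and `index_by_id` helper are illustrative names, not part of the tool):

```python
import json

# Two records copied verbatim from the raw response above.
raw = """
[
 {"id":"ytc_UgwfjJTOd0bckfPlg3l4AaABAg","responsibility":"company",
  "reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugw7Jf4-yborCOl4-EZ4AaABAg","responsibility":"government",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
rec = codes["ytc_UgwfjJTOd0bckfPlg3l4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # company outrage
```

The same index can then back both views of the page: the random-sample list iterates over `codes.values()`, while the ID lookup is a single dictionary access.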