Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
0:05 calling them “AI tech Bros” is a stretch. These are mediocre people who nev…
ytc_Ugw9R5-IK…
0:35 Poor anchor asking live what jobs will be the first to go? Not realizing he…
ytc_UgxrNR_ty…
As an artist I don't really care about people using AI as long as they are not t…
ytc_Ugzg4TXAq…
people dont know how dangerous this tech is!!!!!! we need ethical oversight of a…
ytc_UgwgA_GkQ…
It will be far less than the obsession of Peter Kent Navarro. I am not saying th…
ytc_Ugz6ksiyI…
These massive data centers and AI operations shouldn’t be able to grow unchecked…
ytc_UgwqT0_6f…
Tesla just won. Tesla doesn’t invest in Lidar technology they invest in big dat…
ytr_UgyRyjQ9k…
Turn out real issue is they can file a lawsuit and win but other job holders can…
ytc_Ugx2MpX1P…
Comment
Even if A.I. Is not smarter than the smartest humans, a world with average smart a.i. Who communicated nearly perfectly ( without misunderstandings), is faster more advanced and more powerful than a world with humans ranging from mentally incapacitated to absolute genius, who constantly misunderstand eachother. Or in some cases do not even speak the same language
youtube
AI Governance
2025-06-16T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgypYWZZ9kb12gDFn754AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyc0n4OUNAsyyb9S8d4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyeT5Z5aZz_vEoDHdR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwyd6KqjT4JCiaN-Md4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwtwr63qsE5Bcx5XcN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyUuRsWZXIX479vExF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxsN73NdV5NEDuYVmZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXZvX28nLmjP4B0tt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJWRTmdZTS7lB4D_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyx9bB3vAE9EEKlk1F4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
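The raw response above is a JSON array of per-comment records, each carrying an `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of turning such a batch into a lookup keyed by comment ID, assuming the response parses as valid JSON of exactly this shape (the two sample records are abbreviated copies from the array above):

```python
import json

# Two records copied from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgypYWZZ9kb12gDFn754AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwtwr63qsE5Bcx5XcN4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]"""

# Dimension names as they appear in the response records.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_batch(text):
    """Return {comment_id: {dimension: value}} from a raw batch response.

    Missing dimensions fall back to "unclear", mirroring the coding
    scheme's own fallback value.
    """
    records = json.loads(text)
    coded = {}
    for rec in records:
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

coded = parse_coding_batch(raw_response)
print(coded["ytc_UgypYWZZ9kb12gDFn754AaABAg"]["responsibility"])  # developer
```

This is only a parsing sketch; `parse_coding_batch` is a hypothetical helper, not part of the tool shown here. In practice a real pipeline would also validate that each dimension value falls within the coding scheme's allowed labels before accepting the batch.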