Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgxERKJPP…: "It's already known it's just some pompous fool smiling about it on TV. Human and…"
- ytc_Ugyjrie1T…: "Yeah we will destroy capitalism. Maybe AI will do it for us. If it destroys us s…"
- ytc_UgxXsrwfk…: "This is human error. And this human error can be the reason we die. The companie…"
- ytc_UgwR70NR2…: "I am far from a socialist, but this is the second video of Mr Sanders I see and …"
- ytc_Ugyo2hSnz…: "Maybe the self driving car could notify the driver that the vehicle in front is …"
- ytr_UgwLc-kRI…: "A bit late grandad / The cats already out of the box / The horse has bolted / and t…"
- ytc_UgyDPVdHG…: "I always see people talking about how AI is just taking in data and optimizing b…"
- ytc_Ugzsg8sUU…: "Is AI just a machine with known limitations like the halting problem and incompl…"
Comment
> when the greatest minds in the world can't agree where AI will lead, it quickly exposes the weakness of human intelligence. You merely need to look at the history of mankind, nothing but greed fueled wars over land, resources, and control. Now that the genie is out of the bottle it's only a matter of time until we reach yet another inevitable conclusion. If I were a betting man I would put my money on blackrock. seems mr. fink will soon own AI like he does everything else in the world. I think I just called the next bond film!!!!

youtube · AI Governance · 2024-01-04T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyJnXmJnUjI4BZw5cd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwpYg_nreZwNpknQop4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzjYp7Oaf7u7i6xvml4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzm3XB7nXqrSR1ypFZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTPRPTwe106_KghyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyNfYpULfh_Ir9Ix1R4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwO_VzN-pF3q4Py3c54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwkgvHDi4UVDVYQ8Ml4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxm3y0DqVd6ftgejit4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxD2Yiu9gI7Z8nwxY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
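Since the raw response above is a JSON array in which each row carries a comment `id`, the lookup-by-comment-ID behavior this page offers can be sketched in a few lines of Python. This is a minimal sketch, not the page's actual implementation; the two rows used here are copied from the response above.

```python
import json

# Two rows from the raw LLM response shown above (truncated for brevity).
raw_response = """
[
  {"id":"ytc_UgwpYg_nreZwNpknQop4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzjYp7Oaf7u7i6xvml4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
"""

# Index the codings by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a specific comment ID.
coded = codings["ytc_UgwpYg_nreZwNpknQop4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # distributed resignation
```

Note that the dictionary keys must be the full comment IDs as emitted by the model; the truncated IDs shown in the sample list (e.g. `ytc_UgwpYg_nr…`) would need to be matched by prefix instead.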