Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- “This shit, is fucking scary. I mean, really. I don't get scared about a great ma…” (ytc_UgxkDWkBA…)
- “ʙᴀsɪᴄᴀʟʟʏ people think it's ai who did my work.. I was known for my skills in dr…” (ytc_UgyZEjVWK…)
- “19:07 doesn't pretty much all dictation software use like neural networks and sh…” (ytc_UgzTa0iwh…)
- “Just had a conversation with Manus re. AI alignment. In the video you state an 8…” (ytc_UgzdBqMB8…)
- “I think that in todays world a degree is not enough. I mean sure it helps, and I…” (ytc_UgzghpU9T…)
- “Ahaahaa!! This is great. So now the list of things that are racist are: math, ut…” (ytc_UgwbF_Exc…)
- “People keep bringing up “inspiration” in relation to AI… but as far as I’m conce…” (ytc_Ugzny27x1…)
- “depression, suicidal thoughts and easy access to a gun is what killed this man. …” (ytc_UgwOd2nID…)
Comment
Hi Ameer, in contrast to most of the commenters I think the topic of UBI is a very important one. Most people have not yet acknowledged the impact general purpose AI and the accelerating automation will have on the employment rates. There will be new jobs created, but vastly more will be destroyed and millions of people will be left behind. In my opinion it is still way to early to implement UBI but in a few decades this is a valid option. Bill Gates and, if remember correctly Eric Schmitt also, talked about the concept as well. People have vision for tech but not for society and idealism is a swear word. Thank you and Floyd for the Video!
youtube · 2017-07-09T20:1… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgjHcZWa6yC9k3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugj5qulzb1Gb3HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjsAJB-7F21tngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgibXYnMDpQjs3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugh2oFcuduKUvXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghRq5GaXdYkx3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghDjQJUIOV6h3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjsFjz1XgEz43gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxuQRddpMtPo9sxjdl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxgjh7wXunf8NOv9tR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```