Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytc_UgxDlAZ0l…: "If it's not possible to find the source of the art used for training AI, then do…"
- ytc_UgwWcEhcc…: "i genuinely want to die why are there so many people defending this deepfake shi…"
- ytc_Ugxyr4DmS…: "I just assumed they added AI specifically to make it more difficult to get custo…"
- ytc_UgyDnDjz6…: "where are all you senior devs who confidently told juniors and college students …"
- ytc_Ugzyub3C3…: "The first thing you should understand about AI is that there is no awareness or …"
- ytc_UgxE6u5Zo…: "The reality is these types of deep fakes have been around for a very long time w…"
- ytc_Ugw1Mq4wj…: "Of course we can turn AI off. We can pull the plug to all data centers. We simpl…"
- ytc_UgyvkZDbn…: "The ai disturbance actually kinda looks good in my art ngl🤌 it adds more detail…"
Comment
The problem with this bullcrap is that car companies have pushed there "we're already 80% of the way there". Well it turns out that it's those last 20% that are really difficult. Badly designed software has already killed pedestrians.
The other bullcrap is the idea that shared self-driving cars will take a bunch of cars off the road. No, they won't. Most people have a bunch of stuff in their cars, stuff they'd prefer not to have to remove and put back in every time. This can be tools, ties, blankets, baby and child seats, CDs, loose change etc. They also want to be able to take their car wherever they want whenever they want. Sure, most of your travel might be to nearby places, but with your own car you can suddenly decide "hey, let's go to grandmas place" or "let's head into the mountains" or wherever it is you want to go.
So no, we are very far away from driverless cars and very very far away from people getting rid of personal cars to get essentially cheaper taxis available.
Source: youtube · 2023-12-30T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxtKDtv3Jw3lM99MgJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDTzAgHFb4vzF1O4x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxl1IHTFwB8tP-or0h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx7I9C4GzgSv253Wkl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwAJpCp_daMyj29yHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlHzkm1N7D2epXx_p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzROjb7y0nnit_DfQp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWMuA2NvJf02Q1LXF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzPe31ytx3IGuZ7JQF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlROqbXiwPRqYBv4h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
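A batch response like the one above can be validated before it is ingested into the coding table. The sketch below is a minimal, hypothetical example: the allowed values per dimension are inferred only from the labels visible in this sample and the real codebook may contain more categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "user", "company"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "outrage", "resignation", "fear", "indifference"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset all carry the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# One record taken verbatim from the response above.
raw = ('[{"id":"ytc_UgyWMuA2NvJf02Q1LXF4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"}]')
print(len(parse_batch(raw)))  # 1
```

Records that fail validation are dropped rather than repaired, so a malformed batch surfaces as a shorter result list that can be re-queued for coding.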