Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I'm not siding with anyone here, just want to make sure the video has the "lates… (ytc_UgzjYbxLT…)
- Yo @LavenderTowne, I have genuine question You see, I have many ideas written do… (ytc_Ugy53Vg5K…)
- I mean you know how easy it could be for AI. To create Google phone numbers and … (ytc_UgxknT5Km…)
- I don't think a non conscious AI has any motivations. So yes, if they can become… (ytc_UgzlFdKnJ…)
- Actually there's a ton of positive news about Africa lately just don't read west… (rdc_et7h9mm)
- 5:22 "AI can generate codes but it can't navigate ambiguity" This line might h… (ytc_Ugyfx6js0…)
- I've been a graphic designer for 25 years and saw the writing on the wall with A… (ytc_UgzGUv7eW…)
- @Dannybd112 Ragebait. Otherwise give me an example of prompter spending months … (ytr_Ugzv95-SR…)
Comment
It's stuff like this that makes me think UBI will be an inevitability. Sure, we need things like maintenance and public works. We need emergency services and researchers and defense personnel. We'll need people in mechatronics and healthcare. Mining and a ton of other essential jobs. AI is coming for a ton of white and blue collar jobs though. Just like you didn't stop the robots in the factory, you're not going to stop the robots taking over the driving jobs as soon as it's deemed safe enough. We need specific laws protecting human life above property damage in regards to AI though. We need to hold those programming AI and directing it's creation to be held just as accountable as if they were driving the truck itself if the AI was found to prioritize it's cargo over human life.
Source: youtube · Video: AI Jobs · Posted: 2025-05-28T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyAElhukJ2v-nT92rd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxFvnyb34gzZNWeyoV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9dKaTinQKv8tzjyt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzY8Xo_6WsrMwpkH9V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzqAbyI5jyDd_X3yih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyFPefiyHrl6yfvsaV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9s1gXyZkVTEUTl0Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwI8jrO50IyfoQa6Bl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxYJIWMunV3YDJwTd54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxB91QaKRkZ-EZgxBh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]