Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I think someone should create the leaders of AI, like Zuck, Thiel, Elon and all …" (ytc_Ugz1t1Qak…)
- "10 to 20 years away is funny, bro super intelligence is here lol we are cooked w…" (ytc_UgxcOYy1W…)
- "we're toast, we're cooked. is their subconscious mind envisioning ai making huma…" (ytc_Ugz510j8p…)
- "The more we talk about AI the more it feels like we are building Jurassic Park a…" (ytc_UgwhN7AlD…)
- "Why would we want human drivers if autonomous trucks result in less accidents, r…" (ytc_UgzAyW0Xp…)
- "Nothing new; all this was already said and written two thousand years ago. Book of the Apocalyps…" (translated from Italian) (ytc_UgzQEpa4H…)
- "I myself love art am i a greeat artist ? No but i still like to draw an i use us…" (ytc_UgzmcusJ_…)
- "I'm just so thankful and grateful for how much chatGPT and Claude have improved …" (ytc_UgyyfRmvv…)
Comment
You know this is a lie because Humans can do one thing AI can't and that's having independent thought.
It seems like it because people are willing to accept the bare minimum but that bubble will burst and with the looming threat of a great depression, I don't think companies are going to want to make every job a human can do automated.
Mining? Automate.
Intense surgeries? Automate. (Human doctors standing by but no human needs to do a 24 hour surgery)
Art and Literature? Bffr.
Therapy? Teaching? Acting? Bffr.
Source: youtube · AI Jobs · 2025-10-08T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugwnv7RTPhcfmOoMS9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgywIfnJyBqFOeVwJ7t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwBc6FeTtZiyJPmAnl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwHnlqdm7b7JqKizWh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQxtcVxUkDN4PxWN54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw3AwQSBstyoBmY_bJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy4zliJpfemgCr3FIt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwYr_09pPZHR6eapvt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxE6vQ7LWVI6FPTp2R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxwxHlItPHB8Gzwdoh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"liability","emotion":"mixed"}]
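The raw response is a JSON array of coded rows, one per comment ID, which is what makes the lookup-by-ID view above possible. A minimal sketch of that lookup (not the tool's actual code; it assumes Python and uses just two of the rows shown above) might be:

```python
import json

# Two rows copied from the raw LLM response above, as a JSON array.
raw = (
    '[{"id":"ytc_Ugwnv7RTPhcfmOoMS9p4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"mixed"},'
    '{"id":"ytc_UgywIfnJyBqFOeVwJ7t4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)

# Index the coded rows by comment ID so any dimension can be looked up.
codes = {row["id"]: row for row in json.loads(raw)}

print(codes["ytc_UgywIfnJyBqFOeVwJ7t4AaABAg"]["emotion"])  # prints "fear"
```

In practice a row may be missing or malformed (e.g. the stray `)` the model can emit in place of `]`), so a real pipeline would validate the parse before indexing.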