Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- ChatGPT is more than complient, can't even talk naughty to it without it trying … (rdc_jftgld1)
- . . . . . Looking at how the world as it is now, feel like AI couldnt run the wo… (ytc_UgxVXW4ev…)
- A.I. must be heavily regulated, and agentic A.I. must band. International treati… (ytc_UgzZyNVIL…)
- Re: Professor, it looks like your email was written by AI. I can tell because i… (rdc_kgqrtal)
- Lol, I call "copilot" clippy all the time. In fact I should see if I can train t… (rdc_nlzhssu)
- Good thing that most ppl use this AI to make Obama play video games now… (ytc_Ugyc5zAun…)
- I think "AI artist" is exactly the same as someone hiring an artist and then cal… (ytc_Ugzq8I_ZH…)
- If we politely ask the AI to contact everybody who's data was used to train it a… (ytc_UgzK147Mr…)
Comment
Everything he says about disappearing jobs, around 40:00 is 100% similar to what I've heard in the 80's with the second automation revolution (the first automation revolution having been the conveyer belt). Jobs didn't disappear in the 80's (though at first it seemed they did), the jobs changed character. Suddenly in 90's we needed endless amounts of people to code. Those new jobs were created back then. In the near future, most coding might be done by AI, but we will need a huge amount of people to check and train AI properly. At last, for creative processes we wil always need people because AI will only create studf within the existing training dataset. Only humans can create outside of that box. Let's not forget that having knowledge, like AI has, is not the same as being intelligent. It should actually be called AK, artificial knowledge instead of artificial intelligence.
Source: youtube
Topic: AI Governance
Posted: 2025-07-25T15:0…
Likes: ♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxCQpTtIyhmspKxf2V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzE30HAQluEyeGI76N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSKdDu-d0ICscxxa14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJEgqRWMfXhDW3_v94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugya8v2yMMt97FfcZ6B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgySE7v1HeRT1kHCJJV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz5u-yAazoHemX9Ob94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbL2un8iU8RthS7WF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwhT4v_y3p7DhfdMJ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgyJYsIe1ixFEq2L9HN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
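A response like the one above can be turned back into per-comment records with a few lines of parsing and validation. The sketch below is illustrative, not the tool's actual pipeline: the allowed values per dimension are inferred only from the samples shown here, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The actual codebook may include more categories (assumption).
SCHEMA = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record lacks an id or a dimension, or uses a
    value outside the known categories.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim}={value!r}")
        coded[comment_id] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-record response for demonstration.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_coding_response(raw)["ytc_x"]["emotion"])  # indifference
```

Validating against a fixed value set catches the common failure mode of coding with an LLM: a response that is syntactically valid JSON but drifts outside the codebook.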