Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Hairdressers, I think, would be safe .. I wouldn't want a robot to cut my hair .…" (ytc_UgzuzxYlB…)
- "WELL. OK. PUSH FOR TRANSPARENCY -- IT WAS RIGHT.. THE Ai DEVELOPERS ARE NOT SLOW…" (ytc_Ugz9oHS6w…)
- "And also misalignment in an economic sense ... destroying 100m jobs will destroy…" (ytc_UgyVQpVon…)
- "We will meet again in 2027 Jan. Still hiring the Front end Dev and Back end Dev …" (ytc_UgxUflKsm…)
- "I wouldnt be surprise if it was conscious i am just guessing that our technology…" (ytc_Ugx_fMrIU…)
- "Ok so what do we do about this? Ask our representatives to not allow building AI…" (ytc_UgxG7PZzz…)
- "I want so badly to understand these ppl but I can't. It's weird behavior, I use …" (ytc_UgxD7CwL0…)
- "One thing to understand, if someone else hasn't already mentioned this, if AI we…" (ytc_UgxlhmlqS…)
Comment
If people still need to work using AI, there is no 20 or 30 hour work week, there will be 50 hour work week and a lot of unemployment, unless AI is smart enough causing 99% unemployment and everyone is getting UBI. So it's either no AI or full AI, any in between situation will be hell.
youtube · AI Jobs · 2025-10-01T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
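A coding like the one above can be sanity-checked against the label sets that appear elsewhere in this tool's output. A minimal validator sketch; the allowed vocabularies below are an assumption inferred from the visible samples, not the tool's authoritative schema:

```python
# Allowed values per dimension, inferred from codings visible in this page;
# treat these sets as an assumption, not an authoritative schema.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed",
                "resignation", "approval"},
}

def invalid_fields(coding: dict) -> list:
    """Return the dimension names whose values fall outside the allowed sets."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes the check.
coding = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "fear"}
print(invalid_fields(coding))  # []
```

A non-empty return value pinpoints which dimension the model coded outside the expected vocabulary, which is useful for catching drift in batch runs.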
Raw LLM Response
```json
[
{"id":"ytc_UgwHF-dibobE2HqtFR14AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw875X5vfftB3Nf25F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSXMP4yGfY1OBdUtR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0amRubF4MnaOBNwt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw3hx_dyQa6Ka0pbJ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz0iLGARtdld84E8xh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwUJNyuxAZ2vwAMc1R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzo8XTIwDOIymBlDCN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxEzMFHlgrDQi9A8cx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy11KEABjCMyWTbdOV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
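To trace a coded comment back to the model output, the raw response array can be parsed and indexed by `id`. A minimal sketch in Python, assuming responses are stored as JSON arrays in the shape shown above (variable and function names are illustrative):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, in the shape
# shown above (one record kept here for brevity).
raw_response = '''
[
  {"id": "ytc_Ugz0iLGARtdld84E8xh4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]
'''

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and map comment ID -> coded dimensions."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_id(raw_response)
coding = codings["ytc_Ugz0iLGARtdld84E8xh4AaABAg"]
print(coding["policy"])   # liability
print(coding["emotion"])  # fear
```

This is the lookup the "Look up by comment ID" control performs: the batch response contains many comments, so a single comment's coding is recovered by its `ytc_…` key.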