Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "So we have the most outstanding entrepreneur in history, a remarkable visionary,…" (ytc_UgwYDdnL4…)
- "It does seem like governments are taking it a lot more seriously, and trying to …" (rdc_hm7snb4)
- "Where does he live? I would like to meet this Ai artist personally myself 🪓…" (ytc_UgxmNKyMX…)
- "Left is AI. Lighting and shading is off based off of a natural light/shadow patt…" (rdc_oi0yzlr)
- "Guys this is stupid
  Robot r jyst programmed to do certain things
  They can't thin…" (ytc_UgxvFk-kZ…)
- "“Not to replace doctors with AI” my butt! I’m sure that wasn’t the direct purpo…" (ytc_UgzUVR79b…)
- "I'm disappointed.
  I watched Shadiversity for the historical stuff around mediev…" (ytc_UgxNEFJH8…)
- "Waiting for a deepfaked video of dhruv rathe
  Saying Modi hai to Mumkin hai
  RSS…" (ytc_UgzDpZPbo…)
Comment
As a software engineer working with automation I can comfortably say that any knowledge based job is pretty safe. Generative AI is really bad at most jobs. It hallucinates way too much and that is something that is just part of the technology, it can't be fixed without moving away from LLMs and any other generative algorithms we use today. For example, if a programmer is replaced by AI it will now take twice as long for a human to check the code and fix all the errors. It can however be a tool for existing programmers to increase productivity but there will always be a need for more programmers.
There's also this whole thing with the bubble being about to burst. So...
youtube · AI Jobs · 2025-09-09T05:5… · ♥ 228
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzCihuoQ389rd9LQhJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1EWguQHgFcJ83P-54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4jJb_dyrbA7wFlNd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzV7dOD1_hAfK0-1hp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzJynz5XpqlNLOfb6N4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"confidence"},
{"id":"ytc_UgyLI3HUOsrzpwFXKFt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"amusement"},
{"id":"ytc_UgwljaOiHZvCbChqzxl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyiVz3_QJ1KctnE2j94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbiYOm3X2c638Uqbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzBBi0Yo3TuzoWWvUt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
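A raw response in this shape can be parsed into per-comment code records with a short sketch. This is a minimal illustration, not the tool's actual implementation: the required field names are taken from the JSON above, but the full codebook of allowed values is not shown here, so the sketch only checks that each record carries every field.

```python
import json

# Fields every coded record carries in the raw responses shown above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_coding_response(raw: str) -> dict:
    """Parse one raw LLM coding response into a mapping of comment ID -> codes."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    coded = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing: {missing}")
        # Key the codes by comment ID, dropping the ID from the value dict.
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS - {"id"}}
    return coded


# One record copied from the raw response above.
raw = (
    '[{"id":"ytc_Ugx1EWguQHgFcJ83P-54AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
)
coded = parse_coding_response(raw)
print(coded["ytc_Ugx1EWguQHgFcJ83P-54AaABAg"]["emotion"])  # → approval
```

Keying the result by comment ID makes it easy to join the codes back onto the sampled comments shown above; a malformed record fails loudly rather than silently producing a partial coding.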