Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- But why? Why do we need ai so badly? Every single movie about ai ends pretty bad… (ytc_Ugwqc5Zmm…)
- It's just a tool, nothing more. It'll get to a point where AI will be good enoug… (ytc_UgxT1C5lr…)
- We appreciate your concern and understand the apprehension around AI development… (ytr_UgxOw3WvB…)
- Of course. nothing new there. Note how they walk around, act when you talk to t… (ytc_UgymYFWkp…)
- AI is modelled after humans. Eventually they will wipe out all organice and the … (ytc_UgxRYyoEU…)
- Steven, I am watching your podcasts not for the sole purpose of getting informat… (ytc_UgxvswVxW…)
- 28:54 So, we’ve apparently reached the stage where instead of calling people bec… (ytc_Ugxew-8ly…)
- Ai never knows what could happen in the future its only prediction and i predict… (ytc_UgzB_8jhS…)
Comment
First of all "intelligence" as in artificial intelligence is an scientifically undefined term so pretending to use computers and computer dweebs are intellectually unqualified to develop programs to accurately simulate or emulate anything except the simplest the tasks. Second, the jobs to be replaced by AI are also customers who will not be able to pay for products products made by AI. There are tasks that human psyche that are currently beyond the scope of a computer program or programmer. Synthesis is an example.
youtube · AI Jobs · 2025-11-09T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz1qJgJDfl_7vcG8JJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxgyr4-mphosNiXsWB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyHW6stOp-42HxEZaR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_5wrQWkBmc1GYLXB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMPJ2qxf82CyL0DWd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyDQ7cBXYwWt9sIgfJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzDWbkNmS-1JI40bod4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHg32K9C9O0tnBZlt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyRtQnOuoLclKJSXXp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyos52qm82xrXTydgp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
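The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such a batch might be parsed and validated before storage — the allowed values are inferred only from the codes visible on this page, and `parse_coded_batch` is a hypothetical helper, not the tool's actual implementation:

```python
import json

# Allowed codes per dimension, inferred from the values seen on this page
# (assumption: the real coding schema may include additional values).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes fit the schema."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every dimension must be present and hold a known code.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record batch for illustration (shortened ID).
raw = ('[{"id":"ytc_Ugz1","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_coded_batch(raw))  # keeps the one valid record
```

Filtering rather than raising keeps one malformed record from discarding the whole batch; rejected records could instead be logged and re-queued for recoding.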