Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "If autonomous drones becomes used in war qhich is extremely likely, then it has …" (`ytc_UgyJJkN7l…`)
- "If so many people are loosing their jobs, than who is going to pay their income?…" (`ytc_UgwIeb3Z6…`)
- "I think you should look into the workflows that hobbyist AI prompters perform in…" (`ytc_UgxZQbQT7…`)
- "I saw a thing were AI was asked to generate pictures of the evolution of mankind…" (`ytc_UgyxvRXnT…`)
- "@gulsum6084 Bro, just pick up a pencil and watch a tutorial it is not that hard…" (`ytr_UgzaiN6L3…`)
- "I HATE AI FART, BOOOOOO seeing old paintings from the masters ages ago, makes …" (`ytc_UgxMdxWkg…`)
- "@johnlemon-t4cThere is a big difference between autonomous driving and programmi…" (`ytr_UgyQzN-nC…`)
- "I beta read someone's work for nothing. In fact I've offered it to several peopl…" (`ytc_Ugyh5x01k…`)
Comment
While humans will find something to do (survival is, after all, the core drive of humans), the problem, and the power, of AI is that it shrinks the workforce needed for a given task. If a team of 10 was needed before, now 1 person with hands-on AI can deliver the work of all 10.
That imbalance has already started shrinking workforces: companies are laying people off after adopting AI.
Not only tech workers but the general workforce, anyone whose work can be automated and supervised by AI, will be left jobless. Amazon has already started building warehouse facilities with no significant human presence: robots doing the work, with AI supervising.
Source: youtube · Topic: AI Governance · Posted: 2024-05-18T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
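A coded record carries the four dimensions shown in the table. As a minimal sketch, a record could be sanity-checked against the value sets observed in this page's output; note the `ALLOWED` sets below are assumptions inferred from the sample responses, not an exhaustive codebook.

```python
# Hypothetical validation sketch. The allowed values are inferred from the
# raw responses shown on this page and are NOT a complete codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "approval", "resignation", "outrage",
                "mixed", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return the names of dimensions whose value is missing or out of range."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(validate(record))  # [] — all four dimensions pass
```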
Raw LLM Response
```json
[
{"id":"ytc_UgxTa8jwx18X7klDLDh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXFgUQYWBN9uZ_MyF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4NwKD7122qGFGSRR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy_-W0NwgEKBDcuP7Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwBJeqC8TbX09fg5vZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyygVufHqfn5DCUY694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNhnHaslWbmMGYbOp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw1kMBh2D37a7bjLJt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKpO8UZcvaTzO2rlB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxkUwQLmTHgBtR1t8h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```