Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below:

- I use Copilot, both the personal version and the M365 version. I can tell you no… (ytc_UgyKhDGzs…)
- @28:21 ERIC SCHMIDT: "By the way, I can assure you that shareholder value is des… (ytc_Ugx4inJEc…)
- The real kicker is that AI isn't really AI. The creators of AI want us to believ… (ytc_UgxMvdlwy…)
- Like any lawyer, I have had to answer the question ‘What are the chances of winn… (ytc_Ugwp_GByr…)
- These companies are just using the excuse of AI to get rid of American workers🙄🙄… (ytc_UgwuZN6Oo…)
- The annoying thing about all of this is like all things, it could be used for go… (ytc_UgxeJpDTs…)
- Saw a good meme for this, "For AI to replace software engineers, project manager… (ytc_UgyDSFOfO…)
- Anyone seen the movie "don't look up"? You think it's fiction? We're governed by… (ytc_UgxRFaZPh…)
Comment
> I wish I was born in a time when all I had to worry about was feeding my family, raising and growing enough food and surviving winter. The lunacy of technology scares me more than anything because 1. I dont understand amy of it. And 2 I dont have any ability to stop it. There is really no need for it either. I lived without a cell phone for half my life. The world can deal without AI for the other half of it. Its like jumping on a train that only ever increases speed with tracks that eventually end. When do you jump off? Will jumping off kill you or only maim you? Or is it better to never get on it in the first place?

youtube · AI Governance · 2024-01-17T03:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
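A coded record like the one above can be checked against the coding scheme before aggregation. Below is a minimal validation sketch; the category sets are inferred from the values observed on this page, not from an authoritative codebook, so the real scheme may allow additional values.

```python
# Hypothetical validator. The allowed values per dimension are assumptions
# inferred from this page's sample output, not an official codebook.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "outrage", "mixed", "approval", "indifference"},
}

def validate(code: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in SCHEMA.items():
        if code.get(dim) not in allowed:
            problems.append(f"{dim}={code.get(dim)!r} not in {sorted(allowed)}")
    return problems

# The record coded above passes cleanly:
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "ban", "emotion": "fear"}
print(validate(record))  # []
```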
Raw LLM Response
```json
[
  {"id":"ytc_Ugy-eA-2hzkwCIqtoS94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy2cVmVzza9htFKnVJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwS9F8aXJbng6TVtIl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzk1wDxbMFWZpMJZMt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw9ohCKs8KaLSO7fmx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyBKWBhr2qTMpLG9SB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwHz4LvRXJDzIpBvHp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxInfEyZmxncrlJ4wV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx-I1Vlirs81LxOXBh4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUqvnHVCbbMdV_U-t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
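Since the raw response is a JSON array keyed by comment ID, looking up the code for a given comment is a parse-and-index operation. A minimal sketch, assuming the model output is well-formed JSON as shown above (the abridged two-record array here is for illustration only):

```python
import json

# Abridged copy of the raw model output above: a JSON array of
# per-comment codes, one object per comment ID.
raw_response = '''
[
  {"id": "ytc_Ugy2cVmVzza9htFKnVJ4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyBKWBhr2qTMpLG9SB4AaABAg",
   "responsibility": "government", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "approval"}
]
'''

# Index the parsed records by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugy2cVmVzza9htFKnVJ4AaABAg"]
print(code["policy"], code["emotion"])  # ban fear
```

In a real pipeline the same index would be built once over the full response and reused for every lookup on this page.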