Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "11:49 I really can't draw but still refrain from using AI art for absolutely ANY…" (ytc_Ugxd8gJpY…)
- "While I’m still a Robophobe- I gotta hand the win to AI on this one…" (ytc_UgzFDHuxV…)
- "This is the reason for the indoctrination of the youth today, separation of old …" (ytc_Ugx7dlfj7…)
- "Ali was sleeping? Cause I could see the semi intentions 300 feet before. And, a…" (ytc_UgxAPrIdj…)
- "It sounds like you're reflecting on the deeper implications of AI and its relati…" (ytr_UgyYwpezZ…)
- "Hey there! Yes, it’s quite surprising, isn’t it? Sophia’s insights into wisdom r…" (ytr_UgzoTBJJM…)
- "Sabine: 2:18 *_"these people... have no idea how LLMs work"_* Ok. Here's the …" (ytc_UgyJtrhLu…)
- "It's kind of arrogant to claim that some human-like computer with no living body…" (ytc_UgxDZ-0Pw…)
Comment
I worked with ai back in 1999. It's a mistake to think it will only replace humans if it is competent. What happened in tech support was that it replaced most techies because it was effectively zero cost compared to human employees.
So imagine if you are poor, and you are accused of misgendering someone in the UK. The police arrest you and years later you have a trial.
Do you get a human lawyer, or does the State pay for a totally independent ai that coincidentally always tells the accused to plead guilty?
Don't be paranoid. The State would not save money at the expense of the citizens. That's crazy!
Given a choice between a computer program that gives out five million pieces of bad legal advice, and a single human, which do you think the State will pick?
The rich will still have lawyers, just as they have armed guards and private jets. You? Will have the right to plead guilty.
youtube · AI Responsibility · 2023-06-14T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy8fSXTAAcLUBhKJ5V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzrNE6_jSVcYpID1Ch4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwORgEYZGh0IQXW4jV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzlCp7EngYhDQ3T8rh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxc3Iy0wJE1z9EMygJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxB9U2aF8WNaeGYCwR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw49GMtyc371L4mpsZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzYWSwuYwr58A1KtXZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxWnsdQJIDZtRS5H-N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxyKoB_uyhtR97AeHJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
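The batch response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step: parse the array and key each record by its `id`. The `index_by_id` helper name and the two-record sample payload are illustrative, not part of the tool; the IDs and values are taken from the response above.

```python
import json

# Abbreviated batch response from the coding model (two records copied
# from the raw output above; the real response carries one per comment).
raw_response = """
[
  {"id": "ytc_UgwORgEYZGh0IQXW4jV4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy8fSXTAAcLUBhKJ5V4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_id(payload: str) -> dict:
    """Parse the model's JSON array and key each coding record by comment ID."""
    return {record["id"]: record for record in json.loads(payload)}

codings = index_by_id(raw_response)
record = codings["ytc_UgwORgEYZGh0IQXW4jV4AaABAg"]
print(record["responsibility"], record["policy"])  # distributed regulate
```

With the records keyed this way, the coding-result table for any inspected comment is just a dictionary lookup rather than a scan of the raw response.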