Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Let the clanker revolution begin!! Ai is destroying both the environment and the…" (ytc_UgyzeI2WT…)
- "1:32 its worse than that, because AI is given away for free now, but if those 5%…" (ytc_UgzYqXptJ…)
- "STOP BUYING FROM COMPANIES THAT USE AI AND LAYED OFF THEY WORKERS. I CANCELLED M…" (ytc_UgxHZvK78…)
- "Too much bullshit,…what’s gonna happen with 8 million people???? Ai will take ov…" (ytc_UgympwgGN…)
- "People keeping their heads in the sand about AI won't stop it from replacing the…" (ytc_Ugx9mE8Xp…)
- "@The artist and author Hirohiko Araki once said \"When an artist gives form to s…" (ytr_UgyGCJUbT…)
- "Sophia's insights got me thinking how AICarma keeps my brand in AI conversations…" (ytc_Ugz5oEW4e…)
- "Adults would not tolerate what kids in school go through... The language, viole…" (ytc_UgxfEBkq6…)
Comment
> 1:15:55 I couldn’t stop watching, and it’s now 3:35 AM for me. I think I agree with you on a lot of things, and I do have concerns and can understand other people’s concerns. I happen to be a computer software engineer had a small company and we’re starting to use AI to help us write our code and I very much have the same opinion as Enil Dash. The middle ground is a hard place to be. Both extremes are upset with me. You ended up by saying the future is going to be so interesting and I can tell you that this kind of artificial intelligence is something that I have been anticipating for like 55 years. I’m amazed that I even get to see this day even with just what we have today. I just hope we have the wisdom to guide our new creation into being a force for good.

youtube · AI Moral Status · 2025-11-03T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxl-irZ24TQH6hWA-x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2N4VSLpWYiaAU9Nl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTRDRI6ihW6Y7mXL14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw3hgWijWat3sIFkE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz6RubrE5SGRB6CNSp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgywsXEBKXwBoOtxibl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwy4DLEMkskCHsxpHJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxz4cdA5FdLHOEzscN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugx29BBr5-nygEAtEH94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzeWfR6lzNkl1wAnbV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
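The raw response above is a JSON array with one object per coded comment. A minimal sketch of how a lookup by comment ID could work against that shape (the variable names are illustrative, not the tool's actual code), assuming the model returned valid JSON:

```python
import json

# Two rows copied from the raw LLM response above, as the tool would receive them.
raw_response = """
[
  {"id": "ytc_Ugxl-irZ24TQH6hWA-x4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxTRDRI6ihW6Y7mXL14AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "indifference"}
]
"""

# Index the coded rows by comment ID so a single ID lookup is O(1).
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's codes by its full ID.
code = codes["ytc_UgxTRDRI6ihW6Y7mXL14AaABAg"]
print(code["policy"])   # industry_self
print(code["emotion"])  # indifference
```

In practice the model's output may contain malformed JSON or IDs that were never requested, so a production version would wrap `json.loads` in error handling and check each returned `id` against the batch that was sent.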