Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Amazon has invested billions in this ai technology but can't give drivers a live…" (ytc_Ugz_0jKCs…)
- ""For many, it [deepfakes] has raised concerns about public trust in politicians.…" (ytc_Ugziv6P4B…)
- "The irony of us heading into a AI-controlled universal basic income society with…" (ytc_UgxtYp6qd…)
- "can u make a video of image to video using an Ai model already created…" (ytc_UgxTBE7c4…)
- "Andrew Yang has been talking about this. Not only will these people have their j…" (ytc_Ugz6E-Yel…)
- "AI trains no different than a human artist 'trains' on other artists' methods, r…" (ytc_UgwibzRgJ…)
- "Ai art takes publicly available or popular things and uses that for what it make…" (ytc_UgyeNSzqI…)
- "Why not make all the customers AI too? That way they can serve imaginary AI foo…" (ytc_UgwPUGkuI…)
Comment
Hi there 👋 if you find the impacts from AI overwhelming, disorienting and you want to learn more, I've been organizing a Google Doc with links to research.
https://docs.google.com/document/d/1BC3YSzxx5Lzvm52TSKe7VDRlyt3yEqoKZXa7E9gyo2Q/edit?usp=drivesdk
The document is structured in parts, overviewing the various concerns, opinions from policy makers and AI researchers, and the status of any relevant U.S. legislation. I hope it helps you. This should be a huge 2026 voting issue.
youtube
AI Jobs
2025-12-31T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwzbDtPgLze7Q2pJkR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxssQJWkklC6iufQ914AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxbfvaj9W07EbOjhpN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyL3EEfAZADzhpCL3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzuvjIN6Wchfj9PSRl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzMU7jUTh95RU7PtpR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxKrayjrSF61izlJq54AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxaNTvh9OcEiVK1dit4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw1a6Q71eVL_3vqnrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwW8YNI-iYn03aju2t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]