Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

| Comment ID | Preview |
|---|---|
| ytc_Ugysf4WiM… | Saying you're an artist when you use Ai is like saying you're a chef when you ma… |
| ytc_UgwBKcWCt… | You want to stop deepfake porn, simple allow people to copyright their likeness,… |
| ytr_UgzzpZB44… | There will be plenty of job for humans in the coal mines and power plants. AI is… |
| rdc_n7pgvuk | No, it means nothing other than OpenAI had a deadline to release something and m… |
| ytr_UggI9deWo… | Well, I agree with you as far as true AI, I think that it would be extremely dan… |
| rdc_oi1nxwh | Technically yeah.. If I remember correctly he filled it in the District of Colum… |
| ytc_Ugxuf5wgP… | Some people already make a book but he use A.I to make it. He just type the them… |
| ytc_Ugx3KUMw1… | I think every person must just control the agent for its position and have human… |
Comment

> Then get the hell rid of it lol really if this guys saying AI is way way more dangerous then nukes then why the fuck keep it take your dumb ass computers that the devil can manipulate even again the inventer then why keep it. Let's just build a device that will destroy even there own family. What are we doing. Let's vote for this instead of doing as we wish

| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2024-03-19T00:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxCCFvVdRHUJ2fyWNd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0Nu63oYhT0ZTGljh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxXxVYV5oZPV4MKSKd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw7_PxVowplc2pe6ft4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0lxiiiUzMFLZpJ7B4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy4wbjOl4KAQAaOtrR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwqOL7SEhwWBcNvREp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyqWBUubKS2VWz5TPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwUc76oDWbnLdgXUr94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgweuRWqF__Xew3IoxF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
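The raw response above is a JSON array with one record per coded comment. A minimal sketch of how such a batch might be parsed and validated before loading — note that the allowed category values below are inferred only from the samples shown on this page, not from the full codebook, and `parse_coding_batch` is a hypothetical helper, not part of the tool:

```python
import json

# Category values observed in the sample response above; the actual
# codebook may define more (this set is an assumption for illustration).
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def parse_coding_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array) and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records missing a comment ID
        # Keep the record only if every dimension has a recognized value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}]'
print(parse_coding_batch(raw))
```

Dropping malformed records rather than raising keeps a single bad row (a common failure mode in batched LLM output) from discarding the rest of the batch; rejected IDs could instead be queued for re-coding.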