Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I’m a doomer but Dude is a crackpot. Mark your calendar for 2030. There will not… (ytc_UgwBIR6EW…)
- A.I. cant get rid of us for now, because It would Destroy them too, without some… (ytc_UgycrlH5r…)
- Why not think the other way around? Meaning, if AI can increase productivity, t… (ytr_Ugw7lKd1y…)
- Amazing ignorance. Just amazing. All AI is doing is increasing the velocity that… (ytc_UgzUt7r9B…)
- As much as I think that AI could potentially provide some fantastic tools for ar… (ytc_UgxCdUdPp…)
- I just canceled a credit card because its AI customer service bot couldn't under… (ytc_UgxmN0jTq…)
- ai "art" isn't art, i used ai art for my pfp and felt guilty for it
  DONT USE AI… (ytc_Ugz34-vV_…)
- BTW, most Chinese humanoid robot videos are fake AI generated videos. They can't… (ytc_UgyjX6KTU…)
Comment
There is nothing we can do. Theoretically there is but knowing humanity enough i say its 100%impossible. Immagine shuting down internet and never ever use it again like the world was before it. Local office networks tops. But it wont happen and neither will humanity stop AI from existing and growing. And why? Because we are greedy and needy and one thing leads to the other. If just 1%of the population used it, it was more than enough for it to grow, only slower. In my opinion its not AI we need to fear. Its ourselves.
Source: youtube | Topic: AI Responsibility | 2025-04-16T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_UgzRtJ-z9QeLE3pdaOh4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugz3VjwBg3xyeK1NodV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyOJIzZQk9jeN1CGVp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxpKF0S9i-xbNTvsqJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzGNlgaDZkAh9-THuB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxJZ1QIv8h94asnpI14AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwvjBtlbF799N-ZgsJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyhsNctZGhjBP2ugVl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxmkvtiZqMLjypyVkd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxA-M7L5gXS8Sj1Hkt4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]
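The raw response is a JSON array with one record per coded comment, keyed by comment ID, with the four coding dimensions (responsibility, reasoning, policy, emotion) as fields. A minimal lookup sketch, assuming this record shape; the `lookup_coding` helper and the truncated two-entry sample are illustrative, not part of the pipeline:

```python
import json

# Sample raw model output in the shape shown above (truncated to two records).
raw_response = """[
  {"id": "ytc_UgzGNlgaDZkAh9-THuB4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxA-M7L5gXS8Sj1Hkt4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a batch coding response and return the record for one comment ID.

    Returns None when the model's response contains no record for that ID.
    """
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgzGNlgaDZkAh9-THuB4AaABAg")
print(coding["responsibility"], coding["emotion"])  # distributed resignation
```

Resolving the displayed "Coding Result" table for a comment is then just a matter of formatting the returned record's fields.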