Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "....and why does it cost twice as much as an Uber? I literally put in for an Ub…" (ytc_Ugx8Ss9wm…)
- "Imo I think we should just delete or restrain any super intelligent ai because i…" (ytc_Ugz-sskuK…)
- "In regards to the legal aspect, who will be legally responsible for the AI if th…" (ytc_UgxyFuRe6…)
- "A lot of this will hinge on how much energy the AI corps can get ahold of…" (ytr_Ugz-8wAQ-…)
- "I don't get it. Someone used AI as a tool to do something they couldn't do alone…" (ytc_Ugxzt4HpU…)
- "These tools are broken, and professors put way too much faith in them 😩. That’s…" (ytc_Ugw4KvtI3…)
- "@Mohd_Arsh786 Thank you! Most AI tools are relatively new and not known by most…" (ytr_UgyR_2KgS…)
- "Instead of shitting on AI perhaps understand the difference it can make. People…" (ytc_UgyyFooTn…)
Comment
Whether AI is dangerous by itself, creating unforeseen consequences, is far less concerning or probable than the certainty that AI's powers will be misused. Given the multitude of bad actors, with easy access and evil intentions, it is easy to see the destructive potential of this "tool", regardless of its cognitive state or potential hallucinations.
youtube · AI Governance · 2026-03-24T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwK1w_gnBmM6l7zPEx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwhNrVNBgQRvmYTrRx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwo1KEPla2iHpXaHCx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwpi6fJgid2WwTZrmp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwM22TwhUi3D7qd_oB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxnp09EoZmFliXHd1t4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzyg-hn1iH8xz9tnxF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwiMJ2hVYl-2AydnG14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyw7pTpKYqcGCBP7zZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyh05UE72bpm-0ugcB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
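A response like the one above can be parsed and indexed by comment ID, which is how a lookup such as the one shown for `ytc_UgwM22TwhUi3D7qd_oB4AaABAg` (responsibility: user, reasoning: consequentialist, policy: regulate, emotion: fear) could be served. The sketch below is a minimal illustration, not the app's actual code; the `SCHEMA` of allowed values is only inferred from the responses shown here and may be incomplete.

```python
import json

# Hypothetical allowed values per coding dimension, inferred from the raw
# responses shown above; the real coding schema may include more categories.
SCHEMA = {
    "responsibility": {"user", "developer", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

# One row from the batch response above, used as sample input.
raw = '''[
  {"id": "ytc_UgwM22TwhUi3D7qd_oB4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

def index_codings(raw_response: str) -> dict:
    """Parse a batch coding response and index rows by comment ID,
    dropping any row whose value falls outside the assumed schema."""
    coded = {}
    for row in json.loads(raw_response):
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded

coded = index_codings(raw)
print(coded["ytc_UgwM22TwhUi3D7qd_oB4AaABAg"]["policy"])  # regulate
```

Validating against a fixed value set before indexing matters here because LLM coders occasionally emit labels outside the codebook; silently storing those would corrupt downstream tallies.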