Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If the news stoped focusing on the bad and focused on the good of AI I think people would be less apprehensive of this tech. Yes it’s dangerous and so is almost all new technology but think about the good it can do. It can be trained to look at massive amounts of data with cancer patients increasing the likelihood of identifying it way earlier than a human doctor. It will be available to run billions of simulations to develop a viable cure. We should more cautious on how we regulate it so as not to staunch its development and potential.
Source: youtube · AI Governance · 2023-05-20T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwRr3etHRtaby2Er2l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxH40Ew9BwIEFJT5nF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOrY41FIUXnkW6BWZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxLTLfun6GNiz6fjPt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxWCR0-UlkiVoxy4QF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgydBKl03RGcqiMMsPB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy0sFPYMMhfPlxd6Xd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwjumGn4C3ez6q_tlV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwB8sUGs8J7obSSP3Z4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwjTFDRDLDxN8DLT0N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
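A batch response in this shape can be validated and indexed by comment ID before it is stored. Below is a minimal Python sketch; the allowed label vocabulary is inferred from the sample output above (the actual codebook may define additional values), and `parse_codings` is a hypothetical helper, not part of any existing tool:

```python
import json

# Label vocabulary inferred from the sample response above --
# an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError on malformed records or out-of-vocabulary labels,
    so a bad model response fails loudly instead of polluting the dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_abc","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(parse_codings(raw)["ytc_abc"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each coded comment is a single dictionary access rather than a scan of the raw response.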