# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Look up by comment ID

## Random samples — click to inspect
- `ytc_Ugyjnx9ND…`: "AI art / The A stands for Abnormal / Dont get me wrong / Ive been using ai to help me…"
- `ytc_UgzOlzNbr…`: "if AI does everything then we no longer have to work, earn to survive. they will…"
- `ytc_UgxwXQ8uW…`: "It would have been nice if the ai only scraped all the books. Unfortunately, it …"
- `ytc_Ugw2CVDpU…`: "AI truly creating a world where your can't believe your lying eyes and it will h…"
- `ytc_Ugxf5ZV2u…`: "This may sound weird but albeit I'm all for generative Ai, I also fully support …"
- `ytc_UgwKFEDfJ…`: "I learned a little watching this ! Great video !! / It's sad indeed that we will h…"
- `ytr_UgyH0le-Z…`: "@conit4125you'll eventually enjoy the learning process. It's actually addicting…"
- `ytc_UgwHFKsgN…`: "And the curve reaches 100 percent by 2100😢 , Humans needs to regulate AI usage a…"
## Comment

> If AI accomplishes in a nanosecond what a team of scientists would need a month or a year or more to work out, whos to say we won't wake up tomorrow morning in deep trouble? I dont see how anyone can predict this problem is 5 to 10 years away. AI learning is measured in nanoseconds....

youtube · AI Governance · 2023-07-07T06:3… · ♥ 1
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
{"id":"ytc_UgxGGEAKQNNdTYvXkoJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfDvnIVaJlNVXUnKh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzwgpbzA87Oo5YBoiZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylGkpxw_7bAB7X3Td4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxgqZObDnutqxgQ9DR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1GaTg6mnUUStSSyp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyfsSAZw6h3iGJNGOR4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxm23pc7rPGiIpuVyJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyooFoavHf-YYxj5DB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgytJM4rhafqo_LBeVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
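The look-up-by-comment-ID step above can be sketched in a few lines: parse the raw LLM response (a JSON array with one object per coded comment) and pick out the row whose `id` matches. This is a minimal illustration, not the tool's actual implementation; `lookup` is a hypothetical helper, and the two-row `raw_response` below is an excerpt of the batch shown above.

```python
import json

# Excerpt of the raw LLM response shown above (one object per coded comment).
raw_response = """[
{"id":"ytc_UgxgqZObDnutqxgQ9DR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1GaTg6mnUUStSSyp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

def lookup(comment_id: str, response_text: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    rows = json.loads(response_text)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup("ytc_UgxgqZObDnutqxgQ9DR4AaABAg", raw_response)
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

A missing ID simply returns `None`, which is why the dashboard can distinguish "not yet coded" from a coded result without raising an exception.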