Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Strange people in the world. I live in Chandler and see these vans everywhere, a…" — `rdc_ecypl08`
- "Not me watching this as a break from spending the whole day on AI 🤣…" — `ytc_UgztH6_Ac…`
- "Hey im a disabled artist! Im anti AI and people thinking we cant draw sucks!…" — `ytc_UgzARZTn9…`
- "I have used AI probably a handful of times, I try not to use it…" — `ytc_UgxSFi3-n…`
- "Ahh, that makes a lot of sense. I'll admit that from the article title and such,…" — `rdc_famcwsw`
- "We were promised the world and got a backyard shed's worth of surface area. Only…" — `ytc_Ugw82BkYF…`
- "sine and soul are two different words, however they're the same string length. S…" — `ytc_UgxwQPz03…`
- ""google engineer" neck beard basement dweller who falls in love with a.i. during…" — `ytc_UgwJnxrPT…`
Comment
The level at which AI is developing really is a bit scary. Like, 10 years ago I wouldn't even have dreamt of having a proper conversation with an AI. Yet now I can download ChatGPT and talk to it as if I'm talking to a real person.
Heck, just look at something like Neuro-sama, the AI vtuber. It's almost like a real person. It can sing, play games, properly respond to chat and it's creator (Vedal), and it even asked to not be shut down every now and then. Yes I know current AI isn't actually sentient. But it's scary how close it's already getting to sounding and acting like it is.
I honestly don't think humanity is ready for true, sentient AI yet. And with how fast AI is developing there's no telling how many years we are from such a thing becoming reality.
youtube · AI Governance · 2025-07-10T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzKmqcp8zaguJpaVY54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwo2rvJBpJcPo4J0j54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0uKxAzcywOlDs2UZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy92j_6QZ2YcUfLMxd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz7ACFl34UJ0hfH1Wx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwJOUn7pnCl76enpe94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw9qETBIgAOVsASh6x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwicOSSVyAy3PVw90l4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgylRBlWIRbkCTvsTJ14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy0AGUvAa2BO-IFjvx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
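The raw response above is a JSON array of per-comment records keyed by comment ID, one field per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a response into a lookup table might look like the following; the function name and the validation logic are illustrative assumptions, not the project's actual tooling.

```python
import json

# Dimensions taken from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_response(raw: str) -> dict:
    """Parse the model's JSON array into {comment_id: {dimension: value}}.

    Hypothetical helper: raises ValueError if a record is missing any
    expected dimension, so malformed model output is caught early.
    """
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{comment_id}: missing dimensions {missing}")
        coded[comment_id] = {d: rec[d] for d in DIMENSIONS}
    return coded


# Example using the first record from the response above.
raw = ('[{"id":"ytc_UgzKmqcp8zaguJpaVY54AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgzKmqcp8zaguJpaVY54AaABAg"]["emotion"])  # → fear
```

Keying by comment ID is what makes the "Look up by comment ID" view above possible: each coded record can be fetched in constant time and rendered next to the raw model output.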