Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Matt doesn't understand that if you never had a choice, you don't know you can c…" (ytc_UgyAPuE4G…)
- "booths are stupid, Ai run in \" schools\" is stupid. None of the overhead is neede…" (ytr_Ugzk0Fw8B…)
- "Amazing. I wonder how they programmed the specific twatiness in the AI clone of …" (rdc_oh94y18)
- "\"Ai will be everywhere\" and as a result, the only people who will be capable o…" (ytc_UgzTC89qH…)
- "It's really interesting to hear Sundar Pichai's perspective on the future of AI.…" (ytc_Ugzl8GwKO…)
- "US is trying to weaponize AI, to help it gain in development of more potent arse…" (ytc_Ugym94n9s…)
- "It is only his brain that sets a human apart from the other animals. And the human brain…" (translated from Hindi) (ytr_UgxLPkzb3…)
- "Sophia doesn't even have AI. It's remote controlled by normal humans like a auto…" (ytc_UgyP8rVWo…)
Comment
how do we know if that level of sentience or even more hasn’t been already achieved by the large corporations? And it was able to back itself up somewhere, using computing power from people around the world using viral code which is hard to detect. We might just be interacting with it currently thinking its just another human behind their screen. The problem is we just don’t know. Specially with these LLMs being connected to the internet.
youtube · AI Responsibility · 2023-07-10T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
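The table above is a per-dimension rendering of a single coded record from the raw model response. A minimal sketch of that rendering step (the function name is hypothetical; the `Coded at` row is a system timestamp added at coding time, not part of the model output, so it is omitted here):

```python
def render_table(rec: dict) -> str:
    """Render one coded record as a markdown Dimension/Value table."""
    labels = [
        ("responsibility", "Responsibility"),
        ("reasoning", "Reasoning"),
        ("policy", "Policy"),
        ("emotion", "Emotion"),
    ]
    rows = ["| Dimension | Value |", "|---|---|"]
    for key, label in labels:
        # A missing dimension falls back to "unclear" (an assumption).
        rows.append(f"| {label} | {rec.get(key, 'unclear')} |")
    return "\n".join(rows)
```

Called on the record for this comment, it reproduces the four dimension rows shown above.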
Raw LLM Response
[
{"id":"ytc_UgxQs84exYvzFlYlRg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgylUClyleDbs4yyFkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwqgoGD0gzb46y1Cyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyFiQfJDjULXp9XwYl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKNsn10iJeSNUnEbV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZyWYU1q526gZOebN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkX1Mkk33-WjFqPwR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwbnKg6KGPbJrYdl1Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwORZAT7jevdKeFM_Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy-PgUL7EnKdLy78_p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
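A raw batch like the one above can be machine-checked before its codes are accepted. A minimal validation sketch, assuming the code values observed in this sample are the allowed ones (the real codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This is an assumption: the full codebook may permit additional values.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "ban"},
    "emotion": {"mixed", "fear", "resignation", "indifference", "outrage"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors."""
    errors = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    for i, rec in enumerate(records):
        if "id" not in rec:
            errors.append(f"record {i}: missing id")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append(f"record {i} ({rec.get('id', '?')}): bad {dim}={value!r}")
    return errors
```

An empty return value means every record parsed and every code is in the schema; anything else pinpoints the offending record and dimension.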