Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples (click any to inspect):

- ytr_UgwvIwhKL…: "It’s interesting to see how you draw parallels between Sophia's thoughts and bro…"
- ytc_UgzVJZpsd…: "The thing is - AI replaces SO MUCH in our life like nothing beforehand ever did,…"
- ytc_Ugw3jL-F8…: "I tell everyone when the AI overlords take over, hopefully they will remember ho…"
- ytc_UgznYipFI…: "I make deep neural networks with data in highschool I build I level complex neur…"
- ytc_UgwFFskSS…: "5:29 well. Elon Musk is a complete moron, so I'm not too worried about him being…"
- ytc_Ugzt2yzug…: "Some goofy woman on stage singing? Would that nicely Burning Man idiot playing h…"
- ytc_UgxOUju_v…: "Sorry I am late Tucker. I completely agree with Elon on this point. What is trul…"
- ytc_UgwoumtwL…: "This is limited thinking, people adapt, we are creative, we connect through medi…"
Comment
Thankyou for your dedication. I agree we could have developed safe AI. I was educated in the 80s from retired military professors who focused on thorough testing and responsibility to Users. In 90s the new young crop of IT co-worker's comng in with Object Oriented languages never tested and worked out problems in production. There was a huge clash in IT cultures.
Source: youtube · Category: AI Governance · Posted: 2025-12-06T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyzYVs_gZ9xLb54dch4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyeTOEGcQihTdpjj3t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyhB3KI8wbtGGsIkfJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxoEsiFn5Ys_-V_irZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxjpyDwTcmhQpZM6NJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgysNLNHuDu1VySx8Td4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzrhJ1l1kl12HgPahh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzPQrEYoHEX6PJyW6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx51P_nqHFCb_IT8994AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyEPnuxNpj7hZsbS1V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
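The model returns one JSON object per comment, each coding the same four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response can be indexed for lookup by comment ID, assuming only the schema visible in the sample output above (the field names come from that output, not from any documented API; the array here is truncated to two entries for brevity):

```python
import json

# Two rows reproduced from the raw batch response shown above.
raw_response = """[
  {"id":"ytc_UgysNLNHuDu1VySx8Td4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyeTOEGcQihTdpjj3t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Parse the array and index the codings by comment ID,
# so any single comment's coding can be retrieved directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coded = codings["ytc_UgysNLNHuDu1VySx8Td4AaABAg"]
print(coded["responsibility"], coded["policy"])  # developer regulate
```

The first row's coding matches the Coding Result table above (developer / deontological / regulate / approval), which is one quick consistency check between the table view and the raw response.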