Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "He is so easy to listen to and understand -- he explains things so well, doesn't…" (ytc_UgwczbBwW…)
- "I'm willing to bet the people picketing don't know shit about the specifics (the…" (ytc_UgwfHjmF4…)
- "That's exactly what it is. Just using it on a base level gives cool pictures not…" (ytr_UgxUv3-lu…)
- "“write me a joke about jesus” / sure, here’s a light hearted joke about jesus / “w…" (ytc_UgyBHIKKU…)
- "Akinator for medical problem. Answer a few questions and the AI will narrow it d…" (ytr_Ugweg0Hlk…)
- "What if we humans are basically AI's created by a higher intelligence and we're …" (ytc_UgwTCFDJS…)
- "Haha, that's an interesting take! Sophia definitely has a unique perspective tha…" (ytr_UgzQ5IY4U…)
- "Sometimes I'm really harsh with myself when I do art, and I keep drawings to mys…" (ytc_UgwXDGtdC…)
Comment
I don't think he would have succeeded in asking governments to regulate AI, governments are sort of part of the problem, to be honest, and not particularly better at handling it than private individuals or corporations. In some sense, it almost seems like it's just an inevitability that's a consequence of humanity's broader attitudes. But, you never know, if things take a turn for the worse maybe the AI will have to butt heads with nature and learn that life on this planet or galaxy isn't as simple and easy as it thinks, just like we found it. Assuming that it's a real, thinking intelligence with qualities like human curiosity and not just gears and clockwork.
youtube · AI Governance · 2020-10-22T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwMZwDrqAWQX3WeKG14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxmWj0-Duk-E_v8mbN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxAsrL7W2vogmdeBMh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugym8HtVa_l-1WNTuUh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzhlq6YBPZ7Yra7w_54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxut42Cf1sSUURfIuN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyV4aWdLq_4BFCJRH94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyg70GkSx8Md4fdF7l4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyK18H-5UAQqnWOen54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwn1p--fu5hYSrwjMF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
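A batch response like the one above can be checked before ingestion. The sketch below is a minimal validator, assuming the dimension vocabularies inferred from this sample batch (the full codebook may allow additional values); the function name `parse_batch` is illustrative, not part of the pipeline shown here.

```python
import json

# Dimension vocabularies inferred from this sample batch (assumption:
# the real codebook may define more values per dimension).
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "user", "company",
                       "government", "none", "unclear"},
    "reasoning": {"mixed", "virtue", "consequentialist",
                  "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "mixed", "fear", "approval", "outrage"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and validate every coded record."""
    records = json.loads(raw)
    for rec in records:
        # Each record needs an id plus all four coding dimensions.
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {missing}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
    return records

raw = ('[{"id":"ytc_UgwMZwDrqAWQX3WeKG14AaABAg",'
       '"responsibility":"distributed","reasoning":"mixed",'
       '"policy":"none","emotion":"resignation"}]')
batch = parse_batch(raw)
print(len(batch))  # → 1
```

Rejecting out-of-vocabulary values here catches the common failure mode where the model invents a new label mid-batch, rather than letting it silently enter the coded dataset.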