Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “Pretty soon we’re going to live in a world where you can just tell AI you want t…” (ytc_Ugycp9hGj…)
- “it seems to me many people hear it, and accept it is not good, and are not able …” (ytc_Ugz-SyrYB…)
- “@Christopher_Gibbons you think humans learning is the same as ai? Like are you a…” (ytr_Ugz0VV4bW…)
- “It’s nice to hear how one-sided this conversation is about pollutants especially…” (ytc_Ugz3Hc8tp…)
- “@gabemissouri Yes, and? Like I said, AI is going to enhance those jobs, not repl…” (ytr_UgxfBxo_J…)
- “@samdoesart how do you feel about using gen AI as a reference or for inspiration…” (ytc_Ugx-I827w…)
- “Easy to pick on Tesla but how many non Tesla cars are driven into emergency vehi…” (ytc_Ugxh50gLs…)
- “If you want to get into computer science and programming, DO NOT be scared of ge…” (ytc_Ugz1FTRhD…)
Comment
50:53 how did humans come about then? What's our "outside scaffolding"? We were just atoms, billions of years ago. And now here we are. And AI does have the scaffolding, that's what us training is. It has a growing speed we didn't have as evolving biological beings. It likely won't need billions of years. And it can be dangerous to us long before achieving that kind of intelligence or self-awareness, if ever.
youtube · AI Governance · 2026-03-23T14:1… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
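A coded row like the one above can be checked against the coding scheme before it is stored. This is a minimal sketch; the allowed values per dimension are assumptions inferred from the codings visible on this page, not an authoritative codebook:

```python
# Allowed values per dimension — inferred from observed codings on this page
# (an assumption, not the project's official codebook).
SCHEMA = {
    "responsibility": {"none", "company", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate(row: dict) -> list:
    """Return (dimension, bad_value) pairs for any out-of-schema values.

    An empty list means the row conforms to the assumed scheme.
    """
    return [
        (dim, row.get(dim))
        for dim, allowed in SCHEMA.items()
        if row.get(dim) not in allowed
    ]

row = {"responsibility": "ai_itself", "reasoning": "consequentialist",
       "policy": "none", "emotion": "fear"}
print(validate(row))  # [] — this row matches the assumed scheme
```

Rows that fail validation can then be flagged for re-coding rather than silently written to the results table.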
Raw LLM Response
[
{"id":"ytc_UgyFaPo5YEWO9sgresx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzRLgvTqt9ey242tCR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy8d0Js_dXIzGicbSZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzRRVCgkkUw2nFLUgF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzA7PO0_1cxTs4EzvV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8X2JzGTkDZ77pPkB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyJI14QOZPtdpR5sgh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxpzCnzAEDI-ghOi4p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugycg-hd13Bv_7P3CFN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxR-dZBHirmy3z52eB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
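The lookup-by-ID behavior described above can be sketched against a raw response like this one. The field names come from the JSON itself; the two-row `RAW_RESPONSE` is an abbreviated stand-in for the full array:

```python
import json

# Abbreviated stand-in for the full model response shown above.
RAW_RESPONSE = """[
 {"id":"ytc_UgyJI14QOZPtdpR5sgh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxpzCnzAEDI-ghOi4p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse the model's JSON array and return the coding row for one comment ID."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    return by_id[comment_id]

coding = lookup_coding(RAW_RESPONSE, "ytc_UgyJI14QOZPtdpR5sgh4AaABAg")
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

A missing ID raises `KeyError`, which is a reasonable signal that the model dropped a comment from its batch and the batch should be re-run.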