Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples:

- `ytc_UgyeDXD9x…`: "Niel doesn't have a problem with human level reasoning ai because he's already r…"
- `ytc_Ugwkix3jc…`: "Good thing that finally, someone for this you found on your show. AI is dangerou…"
- `ytc_Ugy5ONqZs…`: "Two things cannot be true at the same time: AI is overrated and AI is going to r…"
- `ytr_UgwV0BSzo…`: "@melissablanchard9473 So you would rather have it alienated by humans? You don't…"
- `ytc_Ugz_DTmdq…`: "this is why I will NEVER accept driverless cars. Safer than human drivers? NOT…"
- `ytc_UgxSioDZF…`: "Enjoy paying doubled utility bills to fund these billion dollar companies and th…"
- `ytc_UgwUhvcHN…`: "Musk has a very clear moral compass. He wants to give humanity a future through …"
- `ytc_Ugyvxaihi…`: "Uhm... You need to explain what you want look at it in your mind and artists use…"
Comment

> the next step is simple - take the same transformer idea they use to build the current crop of AI and work out how to double it, to make the equivalent of the dual hemisphere found in natural brains. train one side to be fuzzy and emotive, and the other to be as precise as possible, and then work out how to link them into a co-operative whole. i know these are layman's terms, i also know it should work, because it is how we work.

Source: youtube · Topic: AI Responsibility · Posted: 2025-12-17T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzAyB4UkzQDPD1dT5Z4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyJdQc7VpWLV7Obpax4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyWyASXvzut5TnPQrp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy0gvvEv-uNvolHgHZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwd7PHqixPGiU76k9t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "frustration"},
  {"id": "ytc_UgyA-KhnhLpiJ1ZyC1h4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz_D2Nnqh5G0siPE5B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwFJbtsw0d_mVbzX5x4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwGvMj1O2A0X7sefx54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxJo3hRkgUEbNPDovl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
```
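A response in this shape can be parsed and sanity-checked before the codings are stored. Below is a minimal sketch in Python; the allowed value sets are an assumption inferred from the batch shown here (the project's real codebook may define more categories), and `parse_llm_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample output above; the actual codebook may contain more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"unclear", "none", "liability", "regulate"},
    "emotion": {"indifference", "mixed", "outrage", "frustration", "resignation"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of codings")
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (top-level) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} = {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzAyB4UkzQDPD1dT5Z4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
codings = parse_llm_response(raw)
print(codings[0]["emotion"])  # indifference
```

Validating against a closed value set catches the most common failure mode of LLM coders: a plausible-looking but off-codebook label slipping into the dataset.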