Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugw7PLH6r…: "What are you talking about ? ... Trump will beat AI by 1 year ... the collapse i…"
- ytc_UgzTNiF3I…: "This moment in time is our opportunity to break free from our enslavement and th…"
- ytc_UgzpfHqIH…: "I've been stealing art my entire life and I'm not about to stop now. Also, AI …"
- ytc_UgzY4IEwO…: "You lost me @ ' he didn't knew the risico 😂😂😂😂 ' the godfather of ai ....…"
- ytc_UgyMosHih…: "Ok, if he has a fear of AI’s future not being good for society (think Terminator…"
- ytc_UgyDmx4rU…: "Ahh yes, you're SO arrogant for... not wanting your creations to be stolen so ot…"
- ytc_Ugz63f5eE…: "Ai can only be as powerful as to the point to which it is programmed...find ano…"
- ytr_Ugx77gra1…: "Thank you for sharing your thoughts! If you're interested in exploring more abou…"
Comment
@robertk908 This could exist by 2030. Robotics is advancing very quickly by using similar architectures to LLMs. As AI becomes more capable, it will be able to help more with robotics research. (It already helps some, and is superhuman at creating reward models for robotics systems.) Even if it takes longer, getting there at all is a death sentence. We have to act now to ensure good governance of this technology, and prevent it from reaching a point at which it could permanently escape human control.
If a moon-sized asteroid was expected to potentially impact Earth in 50 years, it would be everyone's top priority to ensure that we can deflect it. We could have 5.
youtube · AI Governance · 2025-08-28T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugxv_UEXyhKZ7R9Xii94AaABAg.AMKsrtpfL1uAMOKkSTfPE6","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugxv_UEXyhKZ7R9Xii94AaABAg.AMKsrtpfL1uAMOTaSOzzuY","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyyTU_-ZLDYNN5NIIJ4AaABAg.AMKs3PMvMIOAMOz2scifbH","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugz_Q9TMlGfz7tOvZXt4AaABAg.AMKoSxUbyD9AMMkMIbVd0g","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugz_Q9TMlGfz7tOvZXt4AaABAg.AMKoSxUbyD9AMNi3PAx9lx","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyCYeW-0dcc1esbUXp4AaABAg.AMKmVzfKNtXAMPlr2fr_tr","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwQ2Cu0lTRVOesepF94AaABAg.AMKcaXEYhl0AMO6ApBQKrr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwMZ3q4QX4DVeRWs2B4AaABAg.AMKVx5Hq5TiAMKqNkEMtjV","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwMZ3q4QX4DVeRWs2B4AaABAg.AMKVx5Hq5TiAMKtzCj0wfH","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwMZ3q4QX4DVeRWs2B4AaABAg.AMKVx5Hq5TiAMNvCNIRrne","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
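The lookup-by-comment-ID flow above can be sketched in a few lines: parse the raw LLM response as a JSON array and index the codings by `id`. This is a minimal sketch, not the tool's actual implementation; the allowed label sets are inferred only from the sample output shown here (the real codebook may define more categories), and the comment ID `ytr_abc123` is a made-up placeholder.

```python
import json

# Label sets per coding dimension, inferred from the sample response
# above (assumption: the real codebook may include more values).
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "indifference", "outrage"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding},
    warning on any value outside the known label sets."""
    codings = {}
    for item in json.loads(raw):
        coding = {dim: item.get(dim, "unclear") for dim in DIMENSIONS}
        for dim, value in coding.items():
            if value not in DIMENSIONS[dim]:
                print(f"warning: {item['id']}: unknown {dim}={value!r}")
        codings[item["id"]] = coding
    return codings

# Hypothetical single-item response for illustration.
raw = '''[
  {"id": "ytr_abc123", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate",
   "emotion": "fear"}
]'''
codings = parse_raw_response(raw)
print(codings["ytr_abc123"]["policy"])  # regulate
```

Indexing by the full comment ID also makes prefix search (as in the truncated `ytc_…`/`ytr_…` sample listing) a simple scan over the keys.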