Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgwqEXxAG…: "so, if there's a decently high chance that ai kills all of humanity.... whyyyy a…"
- rdc_moyjs1f: "I think they are absolutely correct that AI will replace Software Engineers. LLM…"
- ytc_UgzcndRmR…: "Thank you Geoffrey & Steven, couple of my thoughts on this: 10:20 If govts are …"
- ytc_UgyuC_mml…: "Where is your video on why A.I. gets upset when someone is over being it's chump…"
- ytr_UgwQ9OSVK…: "@KingZNINI'm 32 😂😂😂😂How about log off and go chase your dreams and quit circle …"
- ytc_UgybmASHa…: "Look at the plethora of YT videos made by AI, with thousands of subscribers. Vid…"
- ytc_UgwqVXhT_…: "Run the AI the way Spotify works. If a certain work is referenced, the AI compa…"
- ytc_UgyA2bRbm…: "The world you know now that you enjoy will not be the same with AI dont let it t…"
Comment
I get what you’re saying about soldiers and robots, but I don’t think the main risk is ever going to look like a robot army. Robots are slow and expensive to build, and humans are still way better at physical tasks right now.
The scarier part is that AI doesn’t actually need bodies. Look at Stuxnet: a piece of code that silently wrecked physical equipment. Or all the homegrown threats in the USA made through persuasion online. It could cause way more impact through software, money, and persuasion. If it can hack financial systems, disrupt infrastructure, or manipulate people into carrying out its goals, all while keeping itself hidden, then it doesn’t need a fleet of machines at all.
youtube · AI Harm Incident · 2025-09-11T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgyrKajmk8Cs1ucBUuR4AaABAg.AMvJm77MXa0AMx2lbR4RRP","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxUCC5uuwDJ6L7_gul4AaABAg.AMv0snWxcqFAMvlgusGHLB","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxUCC5uuwDJ6L7_gul4AaABAg.AMv0snWxcqFAMwA1nCPlab","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwV5VNsp7Qyg1cTkiZ4AaABAg.AMuVfDtlMx3AMxFtMl48Oy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzBkPsOYILTpLLM7Xd4AaABAg.AMuOqRIpSnlAMySkUXfdhw","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzBkPsOYILTpLLM7Xd4AaABAg.AMuOqRIpSnlAMye1reINsv","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgyLAad1baZrywknuQp4AaABAg.AMuG8qUAmn3AMvgHNYhMKU","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgyLAad1baZrywknuQp4AaABAg.AMuG8qUAmn3AMw_d5qiiv9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyLAad1baZrywknuQp4AaABAg.AMuG8qUAmn3AMxBCFwa0aW","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugwyl1BD0Xwf4dM2hn94AaABAg.AMu9iCIW1DUAMuAsu-wdNk","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
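The raw response above is a JSON array of per-comment codes, each record carrying an `id` plus the four coding dimensions shown in the result table. A minimal sketch of parsing and sanity-checking such output (the field names come from the response itself; the validation logic and the shortened sample `id` are illustrative assumptions, not the pipeline's actual code):

```python
import json

# The four coding dimensions present in each record of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and verify that every record
    has an id and all four coding dimensions."""
    records = json.loads(raw)
    for rec in records:
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")
    return records

# Hypothetical single-record response in the same shape as the dump above.
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # fear
```

A check like this catches truncated or malformed model output before the codes are written back to the coding-result table.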