Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples — click to inspect:

- "@nelyrions1838 No, what's logical is to address the very real threat that's righ…" (ytr_UgwqVWL1_…)
- "This was about 7 years ago and thus before AI. But a friend of mine went in for…" (ytc_UgxmlsQAe…)
- "Ai is a danger for SpaceX because it can achieve in 1day what SpaceX had achieve…" (ytc_UgwCBEV2A…)
- "I've been using Hosa AI companion to chat and practice social skills. It feels m…" (rdc_ndmolkj)
- "My guess is some form of sentience has been made with different AI engines at so…" (ytc_Ugy4_d-kJ…)
- "If anything Ai has thought it's a tool like anything else we use. Programming is…" (rdc_mt7rzep)
- "I'm actually going to side with the AI consciousness side of things. I made a vi…" (ytc_UgyEuNQ7F…)
- "If enough people believe in an outcome it will manifest as our future . So if Ev…" (ytc_UgyCMAnun…)
Comment

> Am a Cyber Security professional with a comfortable amount of experience with AI models. If you have anxiety about the future like I do, this video may be a little triggering (as it is for me). While AI is powerful and the capabilities are not fully understood, there is no current danger regarding AI. The threat is purely theoretical.

Source: youtube · AI Governance · 2023-07-07T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
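A coded row like the one above can be sanity-checked against the label sets that appear in this dump. A minimal sketch in Python — note the allowed values below are only those observed in this sample, not necessarily the full codebook:

```python
# Label sets observed in this dump; the real codebook may be larger.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "company",
                       "distributed", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval"},
}

def check_row(row: dict) -> list:
    """Return (dimension, value) pairs whose value is outside the observed sets."""
    return [(dim, row.get(dim)) for dim, allowed in OBSERVED.items()
            if row.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "resignation"}
print(check_row(row))  # [] — every value is in an observed set
```

Unexpected labels (or missing dimensions) come back as a flat list, which is convenient for logging coder drift across a batch.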
Raw LLM Response
```json
[
  {"id":"ytc_Ugz4lUFURAGaZqrHd_B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz6-5EXoXIe_VcdKul4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzRvOqBv7hJD_1jzp94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw_Ri-_VcQRkE_jVP14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwp5EsFfQaq-fRsFe94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxgKgxy-0EoRz9iRHV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2sAuM7xcD1bhdry14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwFi8zXC6vHeea4NFl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxA-ouDTDqNRMZBp0h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwwKDY5a34Rqblzazh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
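The raw response above is a JSON array keyed by comment ID, so looking up the codes for one comment is a parse-and-index step. A minimal sketch, assuming the raw model output is available as a JSON string (the two rows here are copied from the sample response above):

```python
import json

# Raw model output as returned by the coder (two sample rows from above).
raw_response = '''
[
  {"id": "ytc_Ugz4lUFURAGaZqrHd_B4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzRvOqBv7hJD_1jzp94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgzRvOqBv7hJD_1jzp94AaABAg"]["emotion"])  # resignation
```

In practice the model output may carry stray text around the JSON array; `json.loads` will raise `JSONDecodeError` in that case, which is a useful signal to re-prompt or trim the response before indexing.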