Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment

> @roycampbell586not if you put your own local model large language model AI on your own computer. Then it’s free, no one can delete it but you, and you can even use it offline, not connected to the internet. If you want to text with it or call it on the phone, you can connect it from your computer to your phone.

Platform: youtube
Category: AI Harm Incident
Posted: 2025-08-03T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugxsu1BlJRRxPCydPwB4AaABAg.ALLl_oCSIcoALLvcLf0d_d","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugxsu1BlJRRxPCydPwB4AaABAg.ALLl_oCSIcoALM57N3P_aG","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxg20T0lmqrOajf4RJ4AaABAg.ALL6AzsDk1dALLSWPB0_ww","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugzt4ODNqMeKXUatvzB4AaABAg.ALKoUp0Qg86ALQCO91G7Jl","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugzt4ODNqMeKXUatvzB4AaABAg.ALKoUp0Qg86ALSWJC4Unat","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugzt4ODNqMeKXUatvzB4AaABAg.ALKoUp0Qg86ALTbRClH9T-","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugzt4ODNqMeKXUatvzB4AaABAg.ALKoUp0Qg86ALXTcIpXaO3","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzwKorKamWWinF6BZF4AaABAg.ALKef_FaMsvALLRHU1jLI5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzwKorKamWWinF6BZF4AaABAg.ALKef_FaMsvALN8EulT1Tt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxXl1haNQG9hiElcgp4AaABAg.ALKS3lw9mBzALLuJGnMDMu","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
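The raw response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how a lookup-by-ID could work against that format (the variable and function names here are illustrative, not part of the tool):

```python
import json

# Example raw LLM response: a JSON array of per-comment codes,
# shaped like the "Raw LLM Response" shown above.
raw_response = """
[
  {"id": "ytr_Ugxsu1BlJRRxPCydPwB4AaABAg.ALLl_oCSIcoALM57N3P_aG",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"}
]
"""

# Index the array by comment ID so a single comment's codes
# can be retrieved directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytr_Ugxsu1BlJRRxPCydPwB4AaABAg.ALLl_oCSIcoALM57N3P_aG"]
print(code["emotion"])  # approval
```

The single entry shown matches the coding result in the table above (responsibility: none, reasoning: unclear, policy: none, emotion: approval).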