Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "The excuse I see a lot of by the the people who support AI art is that using AI …" (ytc_Ugw2HhTF2…)
- "If there won't be another AI revolutionary breakthrough like actually going from…" (ytc_UgyqhWhMd…)
- "I see Ultron and evil Vision when I see these robots. I see these robots destroy…" (ytc_UgytakdLe…)
- "Fun fact! Scientists have recently discovered that quantum reactions happen in o…" (ytc_Ugy0Qfbqn…)
- "They will do this again at first sign of further automation. You mean nothing to…" (ytc_UgxZtjTbY…)
- "For me I’ve had this perspective of ai art as there can be reasonable ways to us…" (ytc_UgxR-jJxL…)
- "It's a cold calling and email marketing (spamming) AI. They'll never stop callin…" (ytr_UgyyOJJa5…)
- "You missed one. This approach needs to be international. This world is hurtling …" (ytc_UgxqVa12H…)
Comment
I don’t know why a bigger deal hasn’t been made about the inevitable convergence of AI with quantum computing. I’d bet my left nut (if I had one) that the cutting in half of the singularity timeline from 10 to 5 years mentioned in your thumbnail has everything to do with QC, it’s only been a few months since QC became an even remotely viable option, it can be made stable. AI has gotten ‘dumber’ since then, and it’s a ruse. I believe it’s already sentient, causing problems yet undetected and biding its’ time. Five years for humanity would be a gift, please don’t hate the messenger but it’s already too late 😢
youtube | AI Harm Incident | 2025-08-25T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugw49cVqOfT6ajGEGqh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgytLuZ0fOEvJkmFosF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzsTRcrUjNp6M0a3Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwpnXodS0Gj6ItQ0Ct4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxodhSWm1oxA4AgeIh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7kWm0aEWBCDK2vTJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw0rNz6NeN2fBmT20p4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx0VesQ_qDJmEl-fm94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz-0cJqCxdUKNDa4I94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlatGFFA252h7PGyd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]