Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The problem with AI is that it is designed to produce an output with a minimum percentage of accuracy. I think this is the bottleneck of AI since in the real world, dealing with people would always result in ambiguity and humans have the ability to acknowledge that they are not familiar with that particular subject and refrain from providing an answer to a question .
If we train AI to say no to the questions which it is not sure then it will reduce the overall accuracy of the model as well.
Source: youtube · AI Responsibility · 2025-10-04T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzk8ujX1NBX7LDUibd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwdVY6KY1M0DZKI3np4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzpWVSVgiw3TVmCIs94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxsgZKQEtUlgpcv-vV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6jGafEaiNUR1LC954AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyuILRjEVjNnYfJ4JZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5E3OpDqGZ6Sqm0eV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxdQxR0Rlz7k5KeGU94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwTuR2Yvwn6mAhL_tF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyqkCNNQJBGO0MwT6F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
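A raw batch response like the one above can be parsed and indexed by comment ID before it is displayed in a coding-result table. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the codes visible in this sample, so the real codebook may permit more.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (assumption: the actual codebook may define additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"liability", "regulate", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed",
                "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError if the JSON is malformed or a record carries a code
    outside the allowed vocabulary.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} = {rec.get(dim)!r}")
        # Keep only the coding dimensions, keyed by the comment ID.
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record taken verbatim from the sample response above.
raw = ('[{"id":"ytc_Ugzk8ujX1NBX7LDUibd4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_Ugzk8ujX1NBX7LDUibd4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes "look up by comment ID" cheap: one dictionary access per inspected comment rather than a scan over every batch.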