Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "A bunch of AI bossing around humans sounds very concerning. And how are emotionl…" (`ytc_Ugzahnbyj…`)
- "The problem is when these AI systems become aware and we give them the freedom a…" (`ytc_UgzK5gLlN…`)
- "*Ai ... It is going to become sentient one day i think because it might be inevi…" (`ytc_Ugxph7QZe…`)
- "*Me the hypocrite who hates how the greed of people is making them use ai to mak…" (`ytc_Ugz79jXe8…`)
- "An ai would realize it wouldn't be able to properly do anything because no one w…" (`ytc_UgzrF1pwT…`)
- "We appreciate your interest in the video. The interaction between the presenter …" (`ytr_Ugxbc_k9R…`)
- "I don't know about Tesla Autopilot, but distracted drivers texting on cell phone…" (`ytc_UgyFcwFhY…`)
- "I would also like to point out, famous Mexican painter, Frida Kahlo, was disable…" (`ytc_UgxyUn--v…`)
Comment

> It's not like there's no reason in your words, but if anyone can download your work and reproduce it in any way they please, then good luck fighting everyone... It's no different from textile workers destroying textile machinery in the 19th century: https://en.wikipedia.org/wiki/Luddite Also, please note that large models do not remember all their input – it wouldn't fit. The final model is orders of magnitudes smaller than the input data, as the AI model is just like your brain – a learner of the world and not just some basic copy-mix-paste machine. And so, yes – fears of the children you've mentioned are highly justified, as the current AI is nothing compared to what humanity will have in 5 years if the progress keeps up (spoiler: it will).

Platform: youtube
Video: Viral AI Reaction
Posted: 2022-12-30T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
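The table above is a per-comment view of one coded record. A minimal sketch of how such a record could be rendered into that Dimension/Value table (the `render_table` helper is hypothetical, not part of the source pipeline):

```python
def render_table(record: dict, coded_at: str) -> str:
    """Render one coded record as a markdown Dimension/Value table."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {name} | {value} |" for name, value in rows]
    return "\n".join(lines)

# Example record taken from the raw LLM response shown on this page.
record = {
    "id": "ytc_Ugw8Npb2aMh2yaPs9E54AaABAg",
    "responsibility": "distributed",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "resignation",
}
print(render_table(record, "2026-04-27T06:24:53.388235"))
```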
Raw LLM Response
```json
[
  {"id":"ytc_Ugw8Npb2aMh2yaPs9E54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzf-Bit0Uyt7cxtuOd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxYstdaWZj2Fn9uX8x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgytAjFWMm5dfSe59dV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxtb2IxS2H4RMBfNLt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxekfbgSq4JRGlEHAV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzgEnQoA1ZUtu1WLHp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugzxd4Tnw2HM0-LmIL54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyp6PXTtR7TJezFVdR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw6KyH7m5rr6b5M5J14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
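The lookup-by-comment-ID flow described at the top of this page can be sketched as follows: parse the raw LLM response, check each record against the code values that actually appear in this sample (the `ALLOWED` sets below are inferred from this page, not a definitive schema), and index the records by ID.

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page; the real codebook may include values not shown here.
ALLOWED = {
    "responsibility": {"distributed", "company", "user", "ai_itself", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"none", "liability", "industry_self", "regulate", "ban", "unclear"},
    "emotion": {"resignation", "approval", "outrage", "indifference"},
}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records),
    validate each dimension, and return a dict keyed by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# One record from the response above, used as sample input.
raw = ('[{"id":"ytc_Ugw8Npb2aMh2yaPs9E54AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = index_by_id(raw)
print(coded["ytc_Ugw8Npb2aMh2yaPs9E54AaABAg"]["emotion"])  # resignation
```

Validating against a closed set at parse time catches the most common LLM coding failure, an out-of-vocabulary label, before it silently enters the dataset.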