Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Artificial Intelligence as a term has been conflated with its subset of Machine Learning.
AI as a term was coined in 1956, and referred to any system that attempted to mimic some aspect of human Intelligence. This includes hard coded rule based systems and statistical systems.
LLMs - what we refer to as AI today - largely corresponds to deep learning which is a subset of Machine Learning which itself comes under the purview of AI. Machine Learning aims to mimic one specific aspect of human Intelligence, that being "the ability to learn and adapt". AGI refers to a much broader scope: mimicking nearly all aspects of human Intelligence such as reasoning, planning, creativity etc.
youtube · AI Responsibility · 2025-10-02T03:2… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytr_UgwgmLGGEIJDsRgXdVd4AaABAg.ANixtFZ1o39ANjdGPuKiK6","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwgmLGGEIJDsRgXdVd4AaABAg.ANixtFZ1o39ANkIDiwZCuk","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgyICMnW0R-UGE_VKsV4AaABAg.ANiBpJl2R0ZANmLSfxfh7g","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwVmJOlpvXcHjH4ukV4AaABAg.ANi4lAhwmSOANi8iMC8lJf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugw1ixDuhwBMajoNn9t4AaABAg.ANi-G6Jq6glANiqhfObbQF","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw1ixDuhwBMajoNn9t4AaABAg.ANi-G6Jq6glANjS5G5uutf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw1ixDuhwBMajoNn9t4AaABAg.ANi-G6Jq6glANlA6xgYssJ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw1ixDuhwBMajoNn9t4AaABAg.ANi-G6Jq6glANqKiSV4XRn","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw1ixDuhwBMajoNn9t4AaABAg.ANi-G6Jq6glAO-s8wt-3DM","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxAwyQxKmiWKaFINDl4AaABAg.ANhsHKupHmdANhyht3J9rj","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
```
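A raw response like the array above is a JSON list of per-comment records sharing the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of indexing such a payload by comment ID — the function name `index_codes` is illustrative, and the two embedded records are abbreviated copies of entries from the response above, not the full payload:

```python
import json

# Abbreviated sample of a coder response (two records copied from the
# raw output above), with the array properly closed by "]".
raw = '''[
  {"id": "ytr_UgwgmLGGEIJDsRgXdVd4AaABAg.ANixtFZ1o39ANjdGPuKiK6",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxAwyQxKmiWKaFINDl4AaABAg.ANhsHKupHmdANhyht3J9rj",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]'''

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(payload: str) -> dict:
    """Map comment ID -> its coded dimensions; KeyError if a record
    is missing one of the expected dimensions."""
    out = {}
    for rec in json.loads(payload):
        out[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return out

codes = index_codes(raw)
print(codes["ytr_UgwgmLGGEIJDsRgXdVd4AaABAg.ANixtFZ1o39ANjdGPuKiK6"])
```

This is also where a malformed array (e.g. a stray `)` in place of the closing `]`) surfaces immediately: `json.loads` raises `json.JSONDecodeError` rather than silently dropping records.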