Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Sort of, LLMs need to be fed hundreds (or thousands) of stories about a topic be…" (ytr_Ugx35nPul…)
- "I legit only use AI for personal use and fun, like profile pictures for Xbox and p…" (ytc_UgybhqplX…)
- "AI still has no feelings; I think it will probably be possible in 10 to 20 years…" (ytc_UgyewVAWe…)
- "Musk is a complete moron when it comes to AI and has no idea how it actually wor…" (ytc_UgyJlzl5Z…)
- "Ten years from now, this won't look realistic at all in comparison to what robot…" (ytc_UgwyHITxf…)
- "Torontonian here. I was at Fan Expo Toronto in 2024 after having not been there …" (ytc_Ugxk5p5C3…)
- "In 18 months, when AI fails to be anything more than an occasional help in some …" (ytc_Ugxf0t6zS…)
- "See, this is a cool use of AI that feels useful and is made to help people…" (ytc_UgwbR1rPZ…)
Comment

> Ok, here is a single saving grace. Remember the old fax machines? What happens when you fax a fax too many times, like 100 to 1,000 times? Small errors creep in with every iteration, and eventually you have nothing that resembles the original fax. But let's say you stop just before the fax accumulates too many errors... well, you get what AI spits out now: a blurry resemblance of reality, but with slight errors. AI will always have this issue, and we should pay attention to this fact. Letting AI iterate on itself is the wrong path. A better path is what we have now: users report issues and we update the software ourselves, with no capability for the program to do that itself. It just runs...

Source: youtube · Topic: AI Governance · Posted: 2024-03-31T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyW5t9aiqD3qVBNGV14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzTrh36IHregjJiQkR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxxC-c1fCLRXLrmoBh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgznRwRNVD70Qps00594AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzMdILj8KsK67H4UvR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwIENoN5Qh4T9hLJ1N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYnqRriy_rPFzqwHV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxVUwCzmMYK9Gfuzdh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxejlR28ozyyJazTGV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHjbMVoP8m2uWkS5N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]