Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “This just shows they’d do anything to pay more money, I can see why anime charac…” (ytc_Ugx8q6w5T…)
- “it doesn't matter ... in the long run, they will get the same facial recognition…” (rdc_ektdbez)
- “There is not a standard scientific term for soul but there are religions ( e.g. …” (ytc_UgypM-pDN…)
- “it's nothing more than an amaglation of noise data to fit into an image, to mimi…” (ytr_Ugz-FIcrN…)
- “The main argument I see for AI as reference is "But what if what I want is too n…” (ytc_UgxOGow72…)
- “So-called “artificial intelligence” is not intelligent in the slightest. Nor is …” (ytc_UgzmY33U6…)
- “I cannot remember when I saw a good show that had any real substance. Perhaps it…” (ytc_UgyHZKzHd…)
- “They have a long way to go. With all the muscles in the human face it will take …” (ytc_Ugz4Z5ETA…)
Comment
I have been saying this time and time again.... in any organisations or systems, there is an effort to reduce human error and not to eliminate them. It is impossible as, as long as humans are in the chain, mistakes will be made. The best we can do is to learn, overcome and move on to minimize human error. Now with AI.... Since AI is not even the replica of human brain, but makes human brain mistakes, we are literally undoing decades of efforts of human error reduction. With AI, companies are literally introducing human error into the system. Worse, most of it are rookie mistakes. This is the future that we all saw but was ignored by greedy companies. Well..... this is it..... here we are now....
youtube · AI Jobs · 2026-02-04T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwzb99Wdb-iPxdUlcV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugze2Ka13wuH9o9tEPF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-WNSoKKeQJAfyrxt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7mDKOZIXQaomOJmF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxK4w9J74TYr7TRLUF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzq9OgXHEg2SPtUmLN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxjEjbtnOGyavJ8dh14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxORXAAPyKgZ0ZaDLh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1pSwCIWOoXU7dZ0N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxh6wjsR0UDd_R440t4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
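The raw response is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of how such a payload could be parsed and sanity-checked before the dimensions are stored — note the allowed value sets below are inferred only from the values observed in this batch, not from a documented codebook, and `parse_codings` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, inferred from this batch's output;
# the real coding scheme may permit more categories.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none"},
    "emotion": {"resignation", "approval", "indifference", "outrage", "fear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, skipping bad rows."""
    rows = json.loads(raw)
    out = {}
    for row in rows:
        cid = row.get("id")
        if not cid or cid in out:
            continue  # drop rows with missing or duplicate IDs
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[cid] = {dim: row[dim] for dim in ALLOWED}
    return out

# Two rows copied verbatim from the response above.
sample = '''[
 {"id":"ytc_Ugwzb99Wdb-iPxdUlcV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxK4w9J74TYr7TRLUF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]'''
codings = parse_codings(sample)
print(len(codings))  # 2
```

Validating against a closed value set like this catches the common failure mode where the model invents an off-schema label, rather than silently storing it.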