Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I didn't want to have to reach 50 to see science-fiction films like … come true" (ytc_UgxWju5tz…)
- "I asked ChatGPT what it thought of you after I put in all the information about …" (ytc_UgwGHqOEO…)
- "@1:31 short answer observation to the rhyme of time in history… Human beings on …" (ytc_Ugy9vEq1N…)
- "I don't see why the China point is wrong? The majority of AI Researchers are Ch…" (ytc_Ugx4z3qF8…)
- "Yup, everyone is focused on deepfake nudes being mistaken as real when the bigge…" (rdc_o5pktqu)
- "I used this video for my position paper ❤ unlike AI, I DID credit you…" (ytc_UgxsAcdeg…)
- "Lol this dude is such a joke. This is just another agenda to push for conspires …" (ytc_UgxjUMib5…)
- "I can't get AI to correctly tell me if new MacBooks have an OLED screen. How is …" (ytc_Ugzd-3u-J…)
Comment
You have not mentioned hallucinations, and the cost of running those models is very high; currently the companies are dumping prices, so prices will have to go up. Also, what about responsibility for the potential issues and hallucinations? If an AI agent caused huge losses, do you think the corporation could sue the AI company? Surely not, because contract clauses would prevent that, given the hallucinatory nature of those models. Another thing is training data: they have already scraped everything, so now they use synthetic data, or worse, data made by other AIs, which could cause even more hallucinations. AI is not improving as fast as it did between GPT-3 and GPT-4; throwing more GPUs or training data at it will not help, because some other breakthrough is needed, like a new training algorithm. You could do an interview with Ed Zitron, who is very good on this topic.
youtube
AI Jobs
2026-02-24T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxSoB56cRySfXw4umx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"humiliation"},
{"id":"ytc_UgzdI_9jUT2Rjya3kGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjCER6hDLfUK7O8U94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2qo59wSnZT6vZs8V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzar8CeKJ17W8fi7iV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxzWflUnjx20y_-KYp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwCWe4jciiR-XyjlIh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyKQ-lppRC6y2PDy0t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzfqTwPptdqcJ6w5_Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxcokgkn3B93zOwhGl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
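The raw response above is a JSON array of per-comment codes, one object per comment ID, with the same four dimensions shown in the Coding Result table. A minimal sketch of the "look up by comment ID" step, parsing the array and indexing it by `id` (the `RAW` literal below is abbreviated to two entries from the full response; the lookup logic is an illustration, not the tool's actual implementation):

```python
import json

# Raw LLM response: a JSON array of per-comment codes
# (abbreviated here to two entries from the full response above).
RAW = """
[
 {"id":"ytc_UgwCWe4jciiR-XyjlIh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzdI_9jUT2Rjya3kGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
"""

# Index the coded rows by comment ID for direct lookup.
codes = {row["id"]: row for row in json.loads(RAW)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID."""
    return codes[comment_id]

code = lookup("ytc_UgwCWe4jciiR-XyjlIh4AaABAg")
print(code["responsibility"], code["emotion"])  # company outrage
```

The dict index makes each lookup constant-time, which matters only if the response holds many coded comments; for a ten-row batch a linear scan would do just as well.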