Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Jedi answer is the perfect example of AI BIAS. Not smart, not witty. Plain bias …
ytc_UgzRhEsIi…
I rode a Waymo in LA with no problem, got me from Venice Beach to Inglewood in l…
ytc_UgzNlf5VS…
Using AI on a Roblox appeal that would probably get sent to a bot is really funn…
ytr_UgxFG6odE…
i had a line i used all the time saying “anything can be turned to art” i didnt …
ytc_UgwxOrhFo…
Wth is AI and tech people doing with this world.
These freaking people give me…
ytc_UgzuORkfE…
I just want to point out why chatgpt responded the way it did in saying "i didn'…
ytc_UgzQsy85r…
Dude, you are drinking the Kool-aid, believing the hype. I've unsubscribed. A se…
ytc_UgxBYeWCC…
But but it INSPIRED like the REAL LIFE! GRRRR RUFF RUFF RAUR AI GOOD YOU BAD STO…
ytc_Ugwpj4C01…
Comment
This understanding of gpt-5 being way more powerful than gpt-3 and 4 is not correct tho. There's nothing to back this up, GPT-4 is already spending millions of dollars per day in computer power and it's able to deal with 32k tokens, this still very low amount and even if this is bumped to the house of 100k/1000k tokens in gpt5 this doesn't mean that gpt5 will be the next AGI or anything like that. The computer power and cost to be able to handle big contexts using transformers is already very high and rely on the state of art in hardware.
All that means that gpt4 still very limited from high perspective. And even if we double/triple their parameters and supported tokens in gpt5 it doesn't mean it will be that much of capacity increase. And the main factor is the cost with infrastructure, that means new technologies will need to rise in order create a proper gpt5/gpt6 without spending billions in infra per day.
youtube
AI Governance
2023-06-23T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugz6UszeRESlUJjRqTN4AaABAg.99qstjFN27499sr0KgKwJQ","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgyUEuzseGHnHLmzyV14AaABAg.99qmTNCSJ1G99t-ZpRuPCt","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgwI_BBhI50IfodzfkF4AaABAg.99qj9yMcdXQ99rDOgkwCNu","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwR1RDYnKQ-dmgjenJ4AaABAg.99qiFkuLfe899qiNjmr2Yp","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugy9JnZEG2KiNfWvtUN4AaABAg.9rJ4JlkxeoP9rJS8hh66Oi","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugy9JnZEG2KiNfWvtUN4AaABAg.9rJ4JlkxeoP9rJXSQz2txO","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_Ugy9JnZEG2KiNfWvtUN4AaABAg.9rJ4JlkxeoP9rJXeLWU8xH","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy9JnZEG2KiNfWvtUN4AaABAg.9rJ4JlkxeoP9rJnCsqkui4","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwExSE3E3vBiIPp3jF4AaABAg.9r8kexNAFA39rV6m78Z5kt","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwosS99uFjb2tVMwfh4AaABAg.9r3AznaUHRA9rawQZL8UDk","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
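The lookup-by-ID workflow described above can be sketched in a few lines. This is a minimal illustration, assuming the raw LLM response is a JSON array where each element codes one comment on the four dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`); the `index_by_id` helper is hypothetical, not part of the tool.

```python
import json

# Assumed data shape: the "Raw LLM Response" block is a JSON array in which
# each element codes one comment. Indexing it by comment ID gives the
# per-comment lookup the page describes. The two records below are copied
# from the response above.
raw_response = """[
  {"id": "ytr_Ugz6UszeRESlUJjRqTN4AaABAg.99qstjFN27499sr0KgKwJQ",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "indifference"},
  {"id": "ytr_UgyUEuzseGHnHLmzyV14AaABAg.99qmTNCSJ1G99t-ZpRuPCt",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "ban", "emotion": "fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Map each comment ID to its full coding record."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(raw_response)
record = codings["ytr_Ugz6UszeRESlUJjRqTN4AaABAg.99qstjFN27499sr0KgKwJQ"]
print(record["emotion"])  # prints "indifference"
```

In practice the same index could be built once over all response batches, so any coded comment resolves in constant time.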