Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
Id argue there's a fourth major problem. When you give a prompt it is essentially finding what answer is likely based on what it's seen before in training. It's a lot more complicated and mathy than that but that's the gist. Problem is all that training comes from us, if you try and train the Ai off it's own results you start to run into problems.
Soo, how exactly is Ai using current models gonna replace people? Say Ai replaces us in some field. It won't be getting new data, new ideas, new concepts. At best it'll be stagnant.
Current Ai is a reflection of ourselves. It can only do what we currently do, and can't advance on its own.
Source: youtube · 2026-01-23T02:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwgiDHcpIe7EOE44Fd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4YV7zvuN8S0qlMv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyYSu47-_ZvBUHwOJp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5pRBBIVYKd-qbv_t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3pY50R0a7Vd7-O4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYTkx7pv7ic0ulhOl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOWE7-D7rNSpckpD94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgvL2b6LE0_0j5mdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9VQ0TzbYBcAFodot4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgySY6I0fV96WWo88tp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]