Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "No AI will not catch up with human intelligence, at least not in our lifetime 😂😂…" (`ytc_UgwAf64yf…`)
- "Ai will only get as bad as we allow it to. Meaning if the robots take over it’s …" (`ytc_UgxMJJY_B…`)
- "Also, it feels nice to be nice. I don't feel comfortable NOT being kind. I can i…" (`ytc_Ugw-28Pgu…`)
- "Because the first person or entity to possess AGI will have near infinite power …" (`rdc_m95xgzg`)
- "he says 1 bit per second and 86400 seconds per day makes 86400 symbols per day? …" (`ytc_UgxiaD3Bq…`)
- "Sorry, the correct answer is Option C) To program an intelligent machine. The go…" (`ytr_UgySOeuc2…`)
- "Also, AI art wouldn't even exist without stuff made by humans. They're learning …" (`ytc_UgyIz4lC2…`)
- "It doesn’t matter because it doesn’t matter if there’s a soul doesn’t matter if …" (`ytc_Ugz7ed-8T…`)
Comment
> in Phoenix they are everywhere now, yeah sometimes they are cheaper than uber. Lyft is generally the cheapest. The self driving car doesn't have to park where they drop you off, if could park at an off site depot. It certainly can be done more efficiently if the parking location is flexible. Santa Monica has plenty of abandoned business store fronts too. Making LA into a very walk-able city is a fantasy. In Seattle, the bus system works pretty well, a good number in downtown don't have a car. The bus isn't always the best way to get around (homeless, drug addicts, etc), many prefer to take Uber instead

Source: youtube · 2025-06-14T19:4… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxZR951aY7DHeNKuwB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyBZYqwD8Y-IXp7lD14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzFTcSWb776R2sKD0t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzRPak_gfOOjIDHnmd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzaMPMzLsJyUBQR4614AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugx64LBJgrWkcFTrPnN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgypLD915q7-0znXtgt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxnBJD46eGn7BqvRZN4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugx0NIbtMPTxJep_quh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxPc4GhSYbDaBilLJ94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}]
```
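A raw response like the one above can be parsed and indexed by comment ID to support lookups like the ones on this page. The minimal sketch below assumes the response is a JSON array with the four dimension fields shown in the coding table (`responsibility`, `reasoning`, `policy`, `emotion`); the helper name `index_by_id` and the two abbreviated sample rows are illustrative, not part of the actual pipeline.

```python
import json

# Two rows copied from the raw response above (abbreviated to keep the
# example short); the real response contains one object per coded comment.
RAW = """[
 {"id": "ytc_UgxZR951aY7DHeNKuwB4AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
 {"id": "ytc_UgxPc4GhSYbDaBilLJ94AaABAg", "responsibility": "ai_itself",
  "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coding by its comment ID.

    Hypothetical helper: raises json.JSONDecodeError if the model emitted
    malformed JSON (e.g. a stray ')' instead of ']' closing the array).
    """
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(RAW)
print(codings["ytc_UgxPc4GhSYbDaBilLJ94AaABAg"]["emotion"])  # prints "fear"
```

Keying on the `id` field makes the "Look up by comment ID" view a single dictionary access instead of a scan over the raw text.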