Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- ytc_Ugx6FehiA…: "As an artist I don't feel is different from what we do (taking inspiration from …"
- ytc_UgzsTBAPd…: "Why would you want to make something that is smarter than humans? We humans are …"
- ytc_UgyonEana…: "It is astonishing that Isaac Asimov wrote a series of books about the robot who …"
- ytc_UgzhPP1cW…: "I read an article the other day about AI and therapy. The article said the follo…"
- ytc_Ugz2guRGs…: "I spent many years in very demanding art classes honing extreme drawing and pain…"
- ytc_Ugy6rpYz_…: "Get rid of all the planets big nukes and the planet wont get destroyed by the ef…"
- ytc_UgzbuCd_M…: "Keep in mind that ai takes from other resources to make whatever it needs to mak…"
- ytc_UgwcWaTmj…: "If an artist "trained" himself using other people works, should they be all payi…"
Comment

> I’m doubtful
>
> His premise was the advancement of AI continuing along the current trajectory
>
> That’s flawed logic. As AI begins to participate in creating AI, it will no longer be being developed at human (linear) speeds. I am not saying AI will be being developed at exponential speeds, but certainly many orders faster than what’s happening currently. “In ten years….” is a popular saying but who knows what so much as 10 days looks like once AI starts developing AI in earnest

Source: youtube · AI Jobs · 2024-05-26T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzVRCO-GQw5orq17LJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxsL17_yAmksXCn_0p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwY3Pi8K6vh4pKj4xp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyX2PrAikGO4RcVIk14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxu1Z2DeaXizjKgoKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyptcLSVpVkjHsEhvR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxh-b9_JdH295JHa_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxLjP3cKc4XvRbq0tB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOB8xCwbmLy8BRwpd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzc4HX1LWaI2c4Ez7V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
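Since the raw LLM response is a JSON array with one object per coded comment, looking up a coding by comment ID reduces to parsing the array and keying it on `id`. The sketch below assumes only the structure visible above (an `id` field plus the four coded dimensions); the function name and the two abbreviated sample rows are illustrative, not part of the actual pipeline.

```python
import json

# Illustrative batch response in the same shape as the array above:
# one object per comment, with "id" plus the four coded dimensions.
raw_response = """[
  {"id": "ytc_UgzVRCO-GQw5orq17LJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyptcLSVpVkjHsEhvR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]"""

def index_by_comment_id(response_text):
    """Parse a batch coding response and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgyptcLSVpVkjHsEhvR4AaABAg"]["reasoning"])  # deontological
```

A dict keyed on `id` makes "look up by comment ID" an O(1) operation, which is all the inspector view above needs.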