Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
It's a good tool to find people on camera feeds. Obviously at the moment face-re…
ytr_UgwkKet9v…
goodness gracious! I went into that AI stuff and it does know how to help me cod…
ytc_UgzJLjCK4…
Robots won’t be doing any of those tasks anytime soon—
Ffs we haven’t even perfe…
ytr_Ugw7kB6ZD…
It's not a stretch, it's a straight up misnomer. Use of gen-ai does not make the…
ytr_Ugwugici5…
I’ve never dealt with Ai before,
I had that Grok helper app and it wanted me to …
ytc_UgxjPW3x_…
If he did not use AI visuals, his idea would not reach you through the algorithm…
ytc_UgwrYd6PA…
It couldn't get more urgent to take action as when the robot itself gives the wa…
ytc_UgxibXdTm…
There clearly is no consciousness in AI. AI is a crude simulation of human thou…
ytc_Ugxy3yva1…
Comment
The problem with AGI is the definition keeps changing.
I spent 6 hours playing with Bolt.new today - a tool that writes websites.
You say 2030 something will happen. - AGI.
You do not need AGI for the labor industry to be completely reformed.
All you need is to produce a tool that allows people with related skills to do the things they normally could not do.
I am a database expert, but I do not know how to make websites.
But I wrote and deployed my first one today.
We aren't waiting until 2030 - hundreds of popular job types can be replaced today.
youtube
2024-11-10T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy3h1TsFdAMrvBf2kl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwEgZxHpQMvoPM9YU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxo_n0_BvtP9J8kmUh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJi9cX5Hp39fYwB_Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwTlbzpMsFP1SQpPQ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwamQgNuDXxJULOeax4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyv_IbSpuZ0GfA1d9F4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwplMuteW-rp_kC_jB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxLOqUydsyYbTEaBRV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzJn3kel_9wJjHMCwx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
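Assuming the raw LLM response is always a JSON array of per-comment codes with the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion), the "look up by comment ID" step can be sketched as below. The comment IDs and the `index_by_comment_id` helper are hypothetical, for illustration only.

```python
import json

# A raw LLM response: a JSON array of per-comment codes.
# The IDs here are made up; real IDs look like "ytc_..." / "ytr_...".
raw_response = """[
  {"id": "ytc_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and index the coded rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_comment_id(raw_response)
print(codes["ytc_example2"]["emotion"])  # -> outrage
```

Indexing by ID this way makes each coded comment retrievable in constant time, which is what a "look up by comment ID" feature needs when a single response carries many coded comments.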