Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific response by comment ID.
Random samples

- Agentic automation with proper investment in business related gields are needed.… (ytc_UgyQX1nVV…)
- I have autism and I've been doing art for about 10 years now and I feel like th… (ytc_Ugx5sXYzR…)
- This comment will disappear into the void but hopefully whoever needs to read it… (ytc_UgyENKaRq…)
- They are just letting people go to spend more on buying AI chips while protectin… (ytc_UgwC8qAkC…)
- Sidelining the potential dangers of AI is going to cost us. Everyone has been cu… (rdc_jfbajlu)
- I have never willingly interacted with an "Artificial Intelligence" gadget of an… (ytc_UgxS7OJOq…)
- all fun In games Until you Realize The Ai Is Actually going To be rejected In Ar… (ytc_Ugz4K8tSZ…)
- Asimov's three laws of robotics are more relevant now than ever before... AI can… (ytc_Ugz6wYlA8…)
Comment
Ai will absolutely make legal help much *cheaper* and probably *higher quality* at the same time (can eventually surpass any/most humans).
Near future we'll see a human/machine blend. A single lawyer will become 100x as powerful. Bottom 80% of lawyers won't be needed anymore.
Ultimately you *will not want* a human lawyer when your future freedom is on the line.
So far the problem are AI Hallucinations (it thinks something is true, but isn't).
Once those get < the average hallucinations of a decent lawyer (since humans face the exact same problem) - you get a future where lawyers can handle 100x what they used to.
Platform: youtube · Source: AI Responsibility · Posted: 2023-11-28T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgyFhjBfj6xlv9aKNtJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzh469zY85NylY9tjN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyESrONIdmm2YyOH9t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyz36DeeVn1X1cmw0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZcXzBQBwrgT97MOJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx9WDTQhBlJ9batp0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzACn8RXnHo1Biz_Mh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzHKDrypPu6pmkYHvh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyRNHIWMJ9FfVQ6NsV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwnA0ropx-jTcscqbZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}]
```
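The lookup-by-ID view above can be reproduced offline. A minimal sketch in Python, assuming the raw response is a JSON array of records with the fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the two inline records are copied from the response above, and a real pipeline would load the full batch from disk instead:

```python
import json

# Two records copied from the raw LLM response; stand-in for the full batch.
raw = """[
 {"id":"ytc_UgyFhjBfj6xlv9aKNtJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzACn8RXnHo1Biz_Mh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]"""

# Build the comment-ID index that powers "look up by comment ID".
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; KeyError if uncoded."""
    return codes[comment_id]

print(lookup("ytc_UgzACn8RXnHo1Biz_Mh4AaABAg")["emotion"])  # fear
```

Keying the records by `id` makes each inspection an O(1) dictionary hit rather than a scan of the response array.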