OASIS Litigant Portal (LP) TC


Fw: Could Gen AI Be a ‘Pivotal Moment’ for the Legal Aid Market? (US)


    Posted 04-11-2024 10:18
    LP TC:

    Margaret Hagan is suggesting there is a need for new standards related to legal services and AI.

    Interestingly enough, I was at Suffolk on Monday and just met Dean Perlman.

    Jim Cabral 

    Vice President, Court Relations 





     Could Gen AI Be a 'Pivotal Moment' for the Legal Aid Market?



    The legal aid community is actively exploring the use cases for generative AI, and these could differ from what Big Law has been experimenting with over the last year or so.


    In 2022, low-income Americans received no or insufficient legal help for 92% of their civil legal problems, according to a Legal Services Corp. report.


    Two years later, the LSC, a nonprofit that funds civil legal aid programs, celebrated its 50th anniversary in Washington, D.C., on April 9. There, a conference room packed with legal aid attorneys from all over the country examined the progress made and the long road ahead, and pondered exactly what role, if any, generative artificial intelligence could play in bridging the access-to-justice gaps in the U.S. legal system.


    Many of the generative AI-powered legal technology solutions that have hit the market over the last year or so are not accessible to most of the legal aid community, or even to many firms outside Big Law's realm. As a result, much of the community's exposure to the technology so far has been through free, or relatively affordable, consumer-facing tools such as OpenAI's ChatGPT or Google's Gemini.


    The discourse around these chatbots' hallucination, accuracy and bias risks has led many legal aid attorneys to refrain from using such tools. Still, some programs around the country are starting to use the technology not only to scale their services but also to make legal information more accessible within their jurisdictions.


    Can It Scale Legal Services?


    During his two decades litigating housing cases in New York City, Sateesh Nori estimated that he had represented about a thousand families. But the number of families he had to turn away, a product of the daily "triage" that legal aid lawyers have to perform, could easily be ten times that. The appeal of leveraging some form of AI to serve more clients quickly became obvious.


    "As someone who has always been interested in technology as a way to address the access-to-justice gap, I immediately started playing with AI tools and what I realized ... is that they are extremely effective at delivering information if you work with them properly," Nori said.


    Specifically, he experimented with building AI-powered tools that could take over hotline duties at legal aid organizations, answering some of the most recurring questions from callers, such as "What are the courthouse hours?", in any language, at any time of day, and relieving some of the staff currently handling these tasks.
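The hotline idea Nori describes can be illustrated without any AI model at all. A minimal sketch of the routing logic, with invented FAQ entries and translations (not from any real legal aid organization), assuming a simple fuzzy match against known questions:

```python
# Hypothetical hotline FAQ matcher: answer recurring caller
# questions in the caller's language, or hand off to staff.
from difflib import SequenceMatcher

# Illustrative entries only; a real deployment would maintain a
# much larger, jurisdiction-specific FAQ.
FAQ = {
    "what are the courthouse hours": {
        "en": "The courthouse is open 9 a.m. to 5 p.m., Monday through Friday.",
        "es": "El tribunal abre de 9 a.m. a 5 p.m., de lunes a viernes.",
    },
    "how do i apply for legal aid": {
        "en": "Call the intake line or apply through the online form.",
        "es": "Llame a la linea de admision o use el formulario en linea.",
    },
}

def answer(question: str, language: str = "en") -> str:
    """Return the best-matching FAQ answer, or a staff fallback."""
    normalized = question.lower().strip("?! .")
    best_key, best_score = None, 0.0
    for key in FAQ:
        score = SequenceMatcher(None, normalized, key).ratio()
        if score > best_score:
            best_key, best_score = key, score
    if best_key and best_score > 0.6:
        return FAQ[best_key].get(language, FAQ[best_key]["en"])
    # Novel or case-specific questions still go to a human.
    return "Please hold for a staff member."
```

In practice the match step is where a language model would replace the string comparison, but the triage structure (answer the routine, escalate the rest) stays the same.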


    "We must adopt, experiment and implement these tools in a careful way," Nori said. "In terms of resources and efficiency, I really see this is a pivotal moment where we can do 10 times, a hundred times more than what we're doing now."


    Nori himself has also been using ChatGPT to generate letters, specifically for cases where tenants face eviction for failing to assert a legal defense in a timely manner. With the "right parameters and prompts" and by "feeding the model the right information," he found that the chatbot was capable of generating solid first drafts in seconds.


    "Had somebody in this condition generated a letter and sent it, they would be able to have a chance to keep their home. And that's something that legal services organizations in New York City and elsewhere lack the resources to handle," he said. "We can't do these one-off services, we can't do them quickly enough."


    Democratizing Legal Information


    While lawyers' use of consumer-facing generative AI tools to deliver legal services is ultimately constrained by their ethical duties, it doesn't mean that they can't leverage these solutions to match people with the appropriate legal aid.


    Andrew Perlman, dean and professor of law at Suffolk University, for example, has built a "Legal Aid Navigator" in the ChatGPT store, using one of OpenAI's GPT models and prompting it to guide people to resources in their area.


    "The very first question that I forced the tool to ask in every case is what jurisdiction are you in? And then ask what language are you using? ... After a couple of pieces of information, it directs me to resources in my jurisdiction, including websites and phone numbers," Perlman explained.
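In a custom GPT, the behavior Perlman describes (forcing jurisdiction and language questions before any answer) is essentially a system-prompt configuration. A hedged sketch, with an illustrative prompt and model name, assuming the standard chat-completions message format:

```python
# Hypothetical system prompt reproducing the "ask jurisdiction first"
# pattern. The wording and model name are illustrative, not Perlman's
# actual configuration.
SYSTEM_PROMPT = (
    "You are a legal aid navigator. Before answering any question, "
    "first ask: 'What jurisdiction are you in?' Then ask: 'What "
    "language are you using?' Only after both answers, point the user "
    "to legal aid websites and phone numbers in that jurisdiction. "
    "Provide legal information and referrals only, never legal advice."
)

def build_request(user_message: str, model: str = "gpt-4o") -> dict:
    """Assemble a chat-completion payload with the forced-question
    system prompt pinned ahead of the user's message."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    }
```

The design point is that the routing constraint lives in the prompt, not in code: the same base model behaves like a navigator solely because every conversation starts with these instructions.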


    The chatbot can answer questions for users in a variety of legal binds: "I was fired from my job. How can I collect benefits?" or "I received a debt collection letter. What should I do?"


    Perlman noted that he is also working with courts around the country to automate legal documents, allowing residents to complete more forms on their own and file them directly in court, or what he referred to as the "TurboTax of legal documents."
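A "TurboTax of legal documents" workflow boils down to plain-language questions mapped onto a form template. A minimal sketch, with an invented form and field names (not any real court's document):

```python
# Guided form automation sketch: ask plain-language questions,
# validate the answers, render the filing. Form text is invented.
from string import Template

FEE_WAIVER_TEMPLATE = Template(
    "AFFIDAVIT OF INDIGENCY\n"
    "Name: $name\n"
    "Case number: $case_number\n"
    "Monthly income: $$${monthly_income}\n"  # $$ renders a literal $
    "I request a waiver of filing fees.\n"
)

# Each template field paired with the question a resident would see.
QUESTIONS = {
    "name": "What is your full legal name?",
    "case_number": "What is your case number (see your court notice)?",
    "monthly_income": "What is your total monthly income?",
}

def fill_form(answers: dict) -> str:
    """Require an answer for every question, then render the form."""
    missing = [field for field in QUESTIONS if field not in answers]
    if missing:
        raise ValueError(f"Still need answers for: {missing}")
    return FEE_WAIVER_TEMPLATE.substitute(answers)
```

Real court-form automation adds conditional logic and e-filing integration, but the core loop (interview, validate, merge into a template) is the same.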


    "Is this a good substitute for a lawyer? Not necessarily. But can it help when people don't otherwise have access to legal services? I think it is and it's better than what many people currently receive," he said.


    Auditing Big Tech: The 'Leaderboard' Approach


    Regulators and legislators in the U.S. and abroad are already trying to work with the Silicon Valley companies behind some of the most prominent generative AI models to learn more about how they were trained and anticipate how they may end up being used.


    But the broader justice community could have a bigger role to play in guiding how these companies are releasing their models, noted Margaret Hagan, executive director of the Legal Design Lab and a lecturer at Stanford Law School and the Stanford Institute of Design.


    "This means going from a reactive stance, where we as the justice community and lawyers are kind of waiting to see what tech companies like Google, Microsoft, OpenAI and Anthropic [are] going to do, what kind of models are they going to release. Instead of being reactive to what the technologists are doing, we need to be proactive," she said. "We need to be in there with these tech companies right now helping them set the standards and making the models better."


    One practical way of doing so is what Hagan called a "leaderboard" approach: evaluating the available models granularly and rating them on their ability to answer legal prompts.
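The leaderboard idea reduces to ranking models over per-prompt grades. A sketch with made-up model names and placeholder scores; a real audit would grade live model output against expert-written rubrics:

```python
# Toy legal-prompt leaderboard: rank models by mean score across a
# shared question set. All scores below are invented placeholders.
from statistics import mean

# model -> per-prompt scores (0.0-1.0) on the same legal prompts
AUDIT_SCORES = {
    "model-a": [0.9, 0.7, 0.8],
    "model-b": [0.6, 0.5, 0.7],
    "model-c": [0.8, 0.9, 0.6],
}

def leaderboard(scores: dict) -> list:
    """Rank models by mean score across the prompt set, best first."""
    ranked = [(name, round(mean(vals), 3)) for name, vals in scores.items()]
    return sorted(ranked, key=lambda item: item[1], reverse=True)
```

Publishing such a ranking on a fixed prompt set is what creates the public pressure Hagan describes: each re-run makes a model's improvement, or regression, visible.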


    "We believe that by constantly benchmarking and auditing these technology companies, we can put pressure on them to improve their performance," she said.