One of the key ways digital is disrupting the legal sector is by improving access to justice.
Digital Nation Australia spoke to Erin Kanygin, co-founder of the not-for-profit legaltech app, Bumpd, which automates the process of generating a legal claim for individuals experiencing parenting related discrimination. Kanygin is also a legal transformation lawyer at law firm Gilbert + Tobin.
According to Kanygin, before Bumpd, the only similar apps in the space were automated decision-tree legal offerings that triaged information and directed users to speak to lawyers. Bumpd, by contrast, eliminates the need for a lawyer: it generates a document that individuals can lodge directly with the Fair Work Commission.
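The distinction can be sketched in code. The functions, field names and wording below are hypothetical illustrations of the two approaches the article describes, not Bumpd's or any triage app's actual implementation: a decision tree stops at a referral, while document generation turns the same answers into a draft claim.

```python
# Hypothetical contrast between decision-tree triage and document generation.
# All names and wording are illustrative assumptions, not Bumpd's real code.

def triage(answers: dict) -> str:
    """Decision-tree triage: classify the situation, then refer the user out."""
    if not answers.get("dismissed"):
        return "No dismissal recorded - no claim to make."
    if answers.get("related_to_parental_leave"):
        return "Possible unlawful dismissal - please contact a lawyer."
    return "Please contact a lawyer to assess your options."

def generate_claim(answers: dict) -> str:
    """Document generation: turn the same answers into a lodgeable draft claim."""
    return (
        f"Applicant: {answers['name']}\n"
        f"Dismissal date: {answers['dismissal_date']}\n"
        "Ground: dismissal connected to pregnancy or parental leave.\n"
        "Lodge with: Fair Work Commission (within 21 days of dismissal)."
    )

answers = {
    "name": "Jane Citizen",
    "dismissed": True,
    "related_to_parental_leave": True,
    "dismissal_date": "2024-03-01",
}
print(triage(answers))          # triage ends in a referral
print(generate_claim(answers))  # generation produces a document to lodge
```

The design difference is the output: the first function's end state is advice to seek a lawyer, while the second produces the artefact the user actually needs.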
“It really gives people the opportunity no matter where they are, no matter what time of day to log on, get the information that they need and if they feel like they're ready to make a claim that they can just make it,” says Kanygin.
The main users of Bumpd are people who have lost their jobs due to pregnancy. According to Kanygin, 18 per cent of Australian women have lost their job after requesting or taking parental leave, or on returning to work. A common way this happens is through sham redundancies.
“Basically you go on mat leave and then someone is covering you while you're on mat leave. And they're there doing your job. And while this is all happening, all of a sudden your employer calls you and says, ‘Sorry, we actually no longer require this position,’ but the person who was covering you during mat leave stays on, their job title changes, but they're doing all of the same work that you were doing.”
The other key barrier for those facing these unfair dismissals is the 21-day timeframe they have to make a claim.
“You are either a new parent or you're pregnant, you lose your job and then in the time that you've lost your job, you then have 21 days to make a claim. And if you don't make this claim within 21 days, then you miss out,” says Kanygin.
Bumpd was created using Checkbox’s no-code software, which Kanygin says was chosen for its ease of use as well as its nuance and complexity.
“[Checkbox] sits at a really, really good intersection of being easy enough to use, but you can still sort of add the complexity that's required for an app like this.”
Checkbox is an automation software company that works with legal teams to assist them in building workflow automation and document apps without any coding experience.
According to Evan Wong, co-founder and CEO at Checkbox, “No-code software is extremely accessible, allowing those with even the most basic, day-to-day digital skills to automate processes and build applications on their own from scratch. There’s no need to know how code works or how to write it.”
“As a result, legal professionals end up increasing the value they can deliver to their businesses and clients by using tech to improve alignment and respond faster and more effectively to client needs and the changing environment. In addition, no-code removes an over-reliance on busy IT teams, bringing control and agility back to legal teams when it comes to digitalisation.”
Adoption of digital tools within the legal sector still lags due to the profession's conservatism, says Kanygin.
“Especially in legal, you really have the old guard who they did things a very certain way and a very specific way,” she says.
“There are a number of partners, not just at the firm that I work with, in fact, my firm is quite good, but other firms that I've worked at where partners actually see technology as hindering the learning of lawyers because they want those lawyers to, for example, do the due diligence themselves.”
While Kanygin sees the benefits in digital improving efficiencies, increasing access to justice and adding value to the future of law, she is clear that the human element is critical.
“I work in corporate, but also with the work that I'm doing with Bumpd, a lot of this is very human rights and it's got a real impact on people's human rights. So do we really want to be training robots to make decisions that are hugely nuanced? I would say we don't.”