J. B. Ruhl and Daniel Martin Katz have posted Measuring, Monitoring, and Managing Legal Complexity, forthcoming in the Iowa Law Review.
Here is the abstract:
The American legal system is often accused of being “too complex.” For example, most Americans believe the Tax Code is too complex. But what does that mean, and how would one prove the Tax Code is too complex? The descriptive claim that an element of law is complex, and the normative claim that it is too complex, should be empirically testable hypotheses, yet in fact very little is known about how to measure legal complexity, much less to monitor and manage it.
Legal scholars have begun to employ the science of complex adaptive systems, also known as complexity science, to probe these kinds of descriptive and normative questions about the legal system. This body of work has focused primarily on developing theories of legal complexity and positing reasons for, and ways of, managing it. Legal scholars thus have skipped the hard part — developing quantitative metrics and methods for measuring and monitoring law’s complexity. But the theory of legal complexity will remain stuck in theory until it moves to the empirical phase of study, and thinking about ways of managing legal complexity is pointless if there is no yardstick for deciding how complex the law should be. In short, the theory of legal complexity cannot be put to work without more robust empirical tools for identifying and keeping track of complexity in legal systems.
This Article explores legal complexity at a depth not previously undertaken in legal scholarship. Part I orients the discussion by briefly reviewing the scholarship using complexity science to develop descriptive, prescriptive, and ethical theories of legal complexity. Parts II through IV then shift to the empirical front, identifying potentially useful metrics and methods for studying legal complexity. Part II draws from complexity science to develop methods that have or might be applied to measure different features of legal complexity, including metrics for agents, trees, networks, computation, feedback, and emergence. Part III proposes methods for monitoring legal complexity over time, in particular by conceptualizing what we call Legal Maps — a multi-layered, active representation of the legal system network at work. Part IV concludes with a preliminary examination of how the measurement and monitoring techniques could inform interventions designed to manage legal complexity through use of currently available machine learning and user interface design technologies.
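The abstract does not spell out how the network metrics mentioned in Part II would be computed, but as a rough illustration of the kind of measurement it alludes to, the sketch below builds a toy citation network of hypothetical legal provisions and reports a few standard structural statistics with the networkx library. The provisions, edges, and choice of metrics are assumptions for illustration only, not taken from the article.

```python
# Illustrative sketch only: a made-up "legal citation network" of provisions,
# not code or data prescribed by the Ruhl & Katz article.
import networkx as nx

# Nodes are hypothetical provisions; a directed edge A -> B means that
# provision A cross-references provision B.
citations = [
    ("IRC s.61", "IRC s.63"),
    ("IRC s.63", "IRC s.151"),
    ("IRC s.151", "IRC s.152"),
    ("IRC s.1", "IRC s.63"),
    ("IRC s.1", "IRC s.61"),
    ("Reg 1.61-1", "IRC s.61"),
]

G = nx.DiGraph()
G.add_edges_from(citations)

# A few structural measures sometimes used as crude proxies for network complexity.
print("provisions:", G.number_of_nodes())
print("cross-references:", G.number_of_edges())
print("density:", round(nx.density(G), 3))
print("average clustering:", round(nx.average_clustering(G.to_undirected()), 3))
print("most-cited provision:", max(G.nodes, key=lambda n: G.in_degree(n)))
```

On a real corpus the node and edge lists would be extracted from statutory or case-law text, and these simple statistics would be only a starting point for the richer metrics (feedback, computation, emergence) the article surveys.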
Filed under: Applications, Articles and papers, Methodology, Statistics, Technology developments Tagged: Daniel Martin Katz, Iowa Law Review, J. B. Ruhl, Legal complexity, Legal maps, Legal network analysis, Legal networks, Legal system network, Measuring legal complexity, Monitoring legal complexity, Statistical methods in legal informatics
via Legal Informatics Blog http://ift.tt/1DDcvY1