ASU Law to host mock NBA trade competition in November

Student-run event is the first of its kind at ASU

August 25, 2023

Students with a head for business and a passion for sports will be happy to learn of a first-of-its-kind event being hosted by the Sandra Day O’Connor College of Law at Arizona State University this November.

The Mock NBA Trade Deadline Competition will take place Nov. 2–4 at the Beus Center for Law and Society in downtown Phoenix. The national competition will bring together undergraduate and graduate students from all disciplines and across the country, allowing them to learn what it’s like to work in the front office of an NBA team as they negotiate trades. It’s the first program of its kind to encourage collaboration across analytics, legal, business and scouting in sports.

[Photo caption: ESPN NBA front office insider Bobby Marks (left) with Kyle Goodier, a 3L student at the Sandra Day O'Connor College of Law at Arizona State University. Marks will be a judge in ASU Law's NBA Mock Trade Deadline Competition this November. Photo courtesy Kyle Goodier]

The ASU Law-hosted competition is the brainchild of Kyle Goodier, a rising 3L in the Juris Doctor (JD) and Master of Sports Law and Business (MSLB) programs. Goodier said that students interested in the business of sports have very few opportunities to learn the trade from experts in the field, and this event hopes to bridge that gap.

“There is so much demand for working in a front office,” he said. “Students need to build their network and demonstrate that they can provide value to a team or an agency to get a job. This event is intended to allow students to build that network and show industry executives that they can provide this value.”

A previous, similar competition held at Tulane University grew into one of the largest networking events in the NBA world, drawing coverage from ESPN and The Athletic.

Big names in the NBA and sports worlds have already signed on to be judges for the ASU Law-hosted event, including Bobby Marks from ESPN; Doug Collins, a 50-year NBA veteran and senior advisor for the Chicago Bulls; Eric Pincus from Bleacher Report; and Seth Partnow of StatsBomb.

“I decided to start this competition to provide students with an opportunity to learn front-office operations in a way that provides them feedback from real NBA professionals,” Goodier said. “The goal is for passionate and driven students to leave the event with a better understanding of the NBA, tangible results to share, and a new network of NBA and student contacts.”

Multiple ASU departments have already gotten in on the action, with the Global Sport Institute at ASU signing on as a sponsor.

The student-run event has already surpassed expectations for Aaron Hernandez, assistant dean of the Allan "Bud" Selig Sports Law and Business Program.

“Each year, a student will approach me about their ideas for a competition, clinic or initiative. Each year, I ask them about their plan of execution,” he said. “That’s usually where the conversation stops. Kyle researched the cost, leveraged resources, pitched a major proposal and assembled peers to help execute his vision. I am very proud of Kyle and how he has gotten this off the ground.”

Participants in this event will have to make trading decisions, defend their positions in front of NBA executives and ensure that their trades comply with the NBA’s regulatory framework. They can also expect to hold their own in a room full of sports professionals, with plenty of opportunities to network with other students and experts.

“In my opinion, this is one of the more valuable competitions you can participate in if you want to be a general manager or involved in basketball operations,” Hernandez said. 

Registration for the event is now open. 

Lindsay Walker

Communications Manager, Sandra Day O’Connor College of Law

Opening the black box

ASU researchers innovate decision-making in design space exploration

August 25, 2023

Every second of the day presents us with choices, from deciding what to wear in the morning to picking from a menu at dinner. Whether a decision is trivial or life-altering, decision-making is a fundamental element of the human experience.

It’s always easy to question whether a person made the right choice. Sometimes it’s impossible to tell until the consequences are revealed later.

[Photo caption: Professor Aviral Shrivastava (left) and doctoral candidate Shail Dave (right) are working on research to improve design space exploration, a crucial component in designing deep-learning accelerators that optimize how efficiently computers run artificial intelligence algorithms. Photo by Erika Gronek/ASU]

In hardware and software architecture domains, engineers use a process called design space exploration to evaluate choices during computer architecture design and identify the best-performing design among the available options.

Design space exploration technology can choose a preferred option based on desired outcomes like speed, power consumption and accuracy. The technology can be applied to a variety of applications, from object or human recognition software to high-level microelectronics.
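Picking a preferred option against multiple desired outcomes can be sketched in a few lines. This is a toy illustration only, not ASU’s tooling: the candidate designs, their scores and the weights are all invented, and real exploration tools use far more sophisticated objective functions.

```python
# Toy sketch: rank a handful of hypothetical design candidates by a
# weighted score over the outcomes mentioned in the text
# (speed, power consumption, accuracy). All numbers are invented.
candidates = [
    {"name": "A", "speed": 0.9, "power": 0.4, "accuracy": 0.92},
    {"name": "B", "speed": 0.6, "power": 0.2, "accuracy": 0.95},
    {"name": "C", "speed": 0.8, "power": 0.7, "accuracy": 0.97},
]

# Negative weight on power: higher power draw lowers the score.
weights = {"speed": 0.5, "power": -0.3, "accuracy": 0.2}

def score(candidate):
    return sum(weights[k] * candidate[k] for k in weights)

best = max(candidates, key=score)
print(best["name"])  # -> "A": fast and moderately power-hungry wins here
```

Changing the weights (say, penalizing power more heavily) would change which design wins, which is exactly the trade-off a designer expresses to an exploration tool.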

Deep learning, a method in artificial intelligence loosely inspired by the human brain, teaches computers to recognize patterns in data. Designs of deep-learning accelerators, which are computers that specialize in efficiently running deep-learning algorithms for artificial intelligence, rely on design space exploration to choose among an extensive set of design options. Because some of these design spaces contain billions upon billions of candidate configurations, existing optimization processes can take days or even weeks to complete, even when evaluating only a small fraction of the choices.
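The combinatorial explosion behind those numbers is easy to demonstrate. The parameters and option counts below are hypothetical, chosen only to show how quickly a design space multiplies out; real accelerator spaces with dozens of parameters reach the billions the text describes.

```python
# Illustrative only: a made-up accelerator design space. Each parameter
# has just a few options, yet the total number of configurations is the
# product of all the option counts.
import math

design_space = {
    "pe_rows":       [4, 8, 16, 32],         # processing-element array rows
    "pe_cols":       [4, 8, 16, 32],         # processing-element array columns
    "onchip_buf_kb": [64, 128, 256, 512],    # on-chip buffer size
    "dataflow":      ["weight", "output", "row"],  # loop-ordering strategy
    "clock_mhz":     [500, 800, 1000],
}

total = math.prod(len(options) for options in design_space.values())
print(total)  # 4 * 4 * 4 * 3 * 3 = 576 configurations from five parameters
```

With 10 parameters of 30 options each, the same product already exceeds 10^14 configurations, which is why exhaustively evaluating a design space is infeasible.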

The process is further complicated by black box exploration methods, which existing design tools rely on to make decisions. These methods select candidate designs without revealing any details about their reasoning.

Shail Dave, a computer engineering doctoral candidate in the School of Computing and Augmented Intelligence, part of the Ira A. Fulton Schools of Engineering at Arizona State University, is working to fix this problem with explainable design space exploration, a framework of algorithms and systems that will enable researchers and processor designers to understand the reasoning behind deep-learning accelerator designs by analyzing and mitigating the bottlenecks slowing down the process.

Delving into decision-making 

“Typically, hardware and software designs are explored and optimized through black box mechanisms like evolutionary algorithms or AI-based approaches like reinforcement learning and Bayesian optimization,” Dave says. “These black box mechanisms require excessive amounts of trial runs because of their lack of explainability and reasoning involved in how selecting a design configuration affects the design’s overall quality.”

By streamlining the design search, Dave’s research allows design methods to make choices much faster, taking only minutes compared with the days or weeks it can take existing models to process this information. As a result, design optimization models are smaller, more systematic and use less energy.

Dave’s research offers an alternative that not only improves search efficiency, but also helps engineers reach optimal outcomes and get insights into design decisions. By understanding the reasoning behind design choices and related bottlenecks, the method can analyze available design points at every step of the process and determine the good and bad options before offering its decision, which is made deliberately by the technology after evaluating the most promising options available.
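The contrast between blind and bottleneck-guided search can be sketched with a toy model. This is an illustration under invented assumptions, not the actual Explainable-DSE algorithm: the latency model, parameter options and step counts are all made up, but the structure shows the idea of inspecting *why* a design is slow and upgrading only the limiting resource.

```python
# Toy sketch: black-box random search vs. bottleneck-guided search on a
# made-up latency model where latency is the slower of a compute term
# and a memory term. The bottleneck is whichever term dominates.
import random

OPS, BYTES = 1e9, 4e8  # fixed hypothetical workload: operations, data moved

def latency(num_pes, bandwidth):
    compute_t = OPS / (num_pes * 1e8)       # time spent computing
    memory_t = BYTES / (bandwidth * 1e8)    # time spent moving data
    bottleneck = "compute" if compute_t >= memory_t else "memory"
    return max(compute_t, memory_t), bottleneck

PES_OPTIONS = [1, 2, 4, 8, 16, 32]
BW_OPTIONS = [1, 2, 4, 8, 16, 32]

def random_search(trials, seed=0):
    """Black-box style: sample configurations blindly, keep the best."""
    rng = random.Random(seed)
    return min(latency(rng.choice(PES_OPTIONS), rng.choice(BW_OPTIONS))[0]
               for _ in range(trials))

def bottleneck_guided(steps):
    """Explainable style: upgrade only the resource that is the bottleneck."""
    pes_i = bw_i = 0  # start from the cheapest design
    for _ in range(steps):
        _, bottleneck = latency(PES_OPTIONS[pes_i], BW_OPTIONS[bw_i])
        if bottleneck == "compute" and pes_i < len(PES_OPTIONS) - 1:
            pes_i += 1    # compute-bound: add processing elements
        elif bw_i < len(BW_OPTIONS) - 1:
            bw_i += 1     # memory-bound: add bandwidth
    return latency(PES_OPTIONS[pes_i], BW_OPTIONS[bw_i])[0]

print(bottleneck_guided(6))  # each step targets the current bottleneck
print(random_search(6))      # same budget, spent blindly
```

Both searches get the same budget of six evaluations, but the guided search can explain every step ("this design was memory-bound, so bandwidth was increased"), which is the kind of reasoning black box methods cannot surface.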

Dave notes that his algorithms can also fine-tune design parameters in various use cases. For example, the algorithms can budget battery power for maximum efficiency in devices ranging from smartphones to powerful supercomputer processors, while still finishing the application’s processing in the desired time.

Most notably, Dave’s algorithm can explore design solutions that serve multiple applications at once, including applications that differ in functionality or processing characteristics, while also addressing inefficiencies in how they execute. This is important because many of today’s artificial intelligence-based applications require multimodal processing, meaning one accelerator design may need to handle multiple tasks.

“The great thing about this work is that it formalizes how to leverage a system’s information for making informed decisions and can be applied across a wide variety of uses and different industries where these target systems need improvements and explainability in the process,” Dave says. “We’re analyzing how to get the best performance for any given architecture configuration with different constraints, depending on what the user prioritizes as the most important need.”

Presenting algorithm findings

Dave’s paper on this research, titled “Explainable-DSE: An Agile and Explainable Exploration of Efficient Hardware/Software Codesigns of Deep Learning Accelerators Using Bottleneck Analysis,” has been accepted into the Association for Computing Machinery’s 2024 International Conference on Architectural Support for Programming Languages and Operating Systems, or ASPLOS, a premier academic forum for multidisciplinary computer systems research spanning hardware, software and the interaction between the two fields.

In addition, Dave’s research was awarded second place in the graduate category of the 2022 Association for Computing Machinery Student Research Competition, recognizing his work as being among the most important student research in computer science and engineering in the world.

Aviral Shrivastava, a professor of computer science and engineering in the Fulton Schools and Dave’s mentor and collaborator on the research, says he is proud to see Dave celebrated for his hard work.

“It takes a very motivated student to achieve this kind of success,” Shrivastava says. “There is so much thought, intricacy and detail that goes into research like this. You really need to work with strong researchers, and Shail has been one of them on my team.”

Real-world applications

Dave and Shrivastava are also working together to apply this research to the semiconductor industry with the Artificial Intelligence Hardware program for Semiconductor Research Corporation, which addresses existing and emerging challenges in information and communication technologies and supports research to overcome the challenge of transforming the designing and manufacturing of future microchips.

Shrivastava notes that one of the beneficial real-world impacts of this research is that it drastically reduces the energy requirements and carbon footprint of the design process compared with existing models and processes.

“Training neural networks can have the same carbon footprint as a trans-Atlantic flight,” Shrivastava says. “Being able to reduce the overall carbon footprint by making this technology more efficient will have a great global impact.”

With its significant environmental benefits and its ability to guide decision-making more effectively, Shrivastava sees the research having implications across almost all fields and industries.

“By designing these accelerators in a more formal, systematic and automatic way, we are saving people from endlessly searching through space without any guidance,” he says. “This technology offers an introspective, intelligent and insightful way to make sense of all the information available.”

Annelise Krafft

Communications Specialist, Ira A. Fulton Schools of Engineering