Colloquium Archive

Unifying principles of higher order networks

Mustafa Hajij
Assistant Professor
University of San Francisco

10/11/2022

Over the past decade, deep learning has been remarkably successful at solving a massive set of problems on data types such as images and sequential data. This success drove the extension of deep learning to other discrete domains such as sets, point clouds, graphs, 3D shapes, and discrete manifolds. While many of the extended schemes have successfully tackled notable challenges in their particular domains, the plethora of fragmented frameworks has created or resurfaced many long-standing problems in deep learning, such as explainability, expressiveness, and generalizability. Moreover, theoretical results proven over one discrete domain do not naturally carry over to the others. Finally, the lack of a cohesive mathematical framework has led to many ad hoc and inorganic implementations and has ultimately limited the set of practitioners who can benefit from deep learning technologies. In this talk I will introduce a generalized higher-order domain called the combinatorial complex (CC) and use it to build a new class of attention-based neural networks called higher-order attention networks (HOANs). CCs generalize many discrete domains of practical importance, such as point clouds, 3D shapes, (hyper)graphs, simplicial complexes, and cell complexes. The topological structure of a CC encodes arbitrary higher-order interactions among its elements. By exploiting the rich combinatorial and topological structure of CCs, HOANs define a new class of higher-order message-passing attention-based networks that unify existing higher-order models based on hypergraphs and cell complexes. I will demonstrate the reducibility of any CC to a special graph called the Hasse graph, which allows certain aspects of HOANs and other higher-order models to be characterized in terms of graph-based models. Finally, the predictive capacity of HOANs will be demonstrated in shape analysis and graph learning, competing against state-of-the-art task-specific neural networks.
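To make the Hasse-graph reduction mentioned above concrete, here is a minimal, illustrative sketch (not code from the talk): a tiny combinatorial complex, represented as ranked cells with vertex supports, is flattened into its Hasse graph, where every cell becomes a node and edges record the covering relations between cells. The cell names and the networkx-based representation are assumptions for illustration only.

```python
# Minimal sketch (illustrative assumptions): flatten a small combinatorial
# complex into its Hasse graph. Each cell becomes a node; a directed edge
# links a cell to each lower-rank cell it directly covers.
import networkx as nx

# Hypothetical combinatorial complex: cell name -> (rank, vertex support).
cells = {
    "a": (0, {"a"}), "b": (0, {"b"}), "c": (0, {"c"}),
    "ab": (1, {"a", "b"}), "bc": (1, {"b", "c"}),
    "abc": (2, {"a", "b", "c"}),   # a rank-2 cell grouping all three vertices
}

hasse = nx.DiGraph()
for name, (rank, _) in cells.items():
    hasse.add_node(name, rank=rank)

for x, (rx, sx) in cells.items():
    for y, (ry, sy) in cells.items():
        if ry < rx and sy < sx:  # y's support is properly contained in x's
            # Keep only covering relations: no intermediate cell z between y and x.
            if not any(sy < sz < sx for z, (_, sz) in cells.items() if z not in (x, y)):
                hasse.add_edge(x, y)

print(sorted(hasse.edges()))  # e.g. ('ab', 'a'), ('ab', 'b'), ('abc', 'ab'), ...
```

The resulting graph is an ordinary directed graph, so graph-based tooling and analysis can be applied to it directly, which is the sense in which higher-order models can be characterized in graph terms.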

Uncover the Ghosts: Machine Learning based Spam Detection on Social Networks

Hao Yue
Associate Professor
Dept. of Computer Science, San Francisco State University

10/18/2022

Cybercriminals have found in social networks (e.g., Twitter and Facebook) a propitious medium to launch massive cyber-attacks. Using compromised or fake accounts, criminals can exploit the inherent trust between connected users to effectively spread malicious content and perform scams against users. This talk will introduce two recent works on spam detection with machine learning. The first leverages the differences in dissemination between benign and malicious messages, deriving features from their dissemination paths among users and social communities to detect malicious messages; this approach is more robust to evasion tactics than traditional methods based on message content or account behavior. The second adopts online learning to detect evasion tactics in time and self-adjust the detection models.
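As a rough illustration of what "features from dissemination paths" might look like, the sketch below (assumptions only, not the speakers' implementation) computes a few simple statistics of a toy share cascade, such as its depth, its widest fan-out, and the number of communities it reaches.

```python
# Illustrative sketch (hypothetical data and feature choices): derive simple
# features from a message's dissemination path through users and communities.
import networkx as nx

# Hypothetical cascade: edges point from the sharer to the user they reached.
cascade = nx.DiGraph([("root", "u1"), ("root", "u2"), ("u1", "u3"), ("u3", "u4")])
community = {"root": 0, "u1": 0, "u2": 1, "u3": 1, "u4": 2}  # assumed community labels

depth = max(nx.shortest_path_length(cascade, "root").values())  # longest hop count from the source
breadth = max(cascade.out_degree(n) for n in cascade)           # widest fan-out at any node
communities_reached = len({community[n] for n in cascade})      # distinct communities touched

features = [depth, breadth, communities_reached]
print(features)  # [3, 2, 3] for this toy cascade
```

Features of this kind describe how a message spreads rather than what it says, which is why they are harder to evade by simply rewording the malicious content.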

Managing Memory Smartly

Jason Lowe-Power
Assistant Professor
University of California, Davis

10/25/2022

Managing data movement is a major bottleneck in many computing systems. Today, data set sizes are exploding; for instance, training machine learning models can require terabytes of active memory. At the same time, the capacities of individual memory devices have grown relatively little. These trends are leading to increasingly heterogeneous memory systems that pair a small amount of high-performance memory with high-capacity, lower-performance memories. In this talk, I will explain why hardware-managed data movement techniques perform poorly in these heterogeneous systems, and I will describe a software-based technique, AutoTM, which outperforms a hardware DRAM cache. Further, I will discuss our ongoing work developing a "data management ISA" to enable software-directed and hardware-accelerated heterogeneous memory management.

Bio: Jason Lowe-Power is an Assistant Professor at the University of California, Davis, where he leads the Davis Computer Architecture Research Lab (DArchR). His research interests include optimizing data movement in heterogeneous systems, hardware support for security, and simulation infrastructure. Professor Lowe-Power is also the Chair of the Project Management Committee for the gem5 open-source simulation infrastructure. He received his PhD from the University of Wisconsin-Madison in 2017 and has received an NSF CAREER Award and a Google Research Scholar Award.

TBD

Jennifer Waldo
Ochs Labs, Sebastopol, CA

11/01/2022

How Collaborative is the California Collaboration Network?

Theresa Migler
Dept. of Computer Science, Cal Poly State University, San Luis Obispo

11/08/2022

Understanding how people work together is the first step towards creating a more equitable environment. In this talk we will present the construction and analysis of the California Collaboration Network. In this network, vertices represent University of California and California State University researchers along with their collaborators, and two researchers are connected by an edge if they have ever published a paper together. We will discuss preliminary findings about this network with respect to the gender and ethnicity of the researchers.
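The co-authorship construction described above can be sketched in a few lines. The following is a minimal, hypothetical example (not the talk's actual pipeline; the author names and weighting scheme are assumptions) of building such a collaboration graph from paper author lists.

```python
# Illustrative sketch: build a co-authorship network where vertices are
# researchers and an edge joins any two researchers who have co-authored a paper.
from itertools import combinations
import networkx as nx

# Hypothetical paper records: each paper is represented only by its author list.
papers = [
    ["A. Alvarez", "B. Baker", "C. Chen"],
    ["B. Baker", "D. Diaz"],
    ["A. Alvarez", "D. Diaz"],
]

collab = nx.Graph()
for authors in papers:
    # Connect every pair of co-authors; repeated collaborations increment a weight.
    for u, v in combinations(authors, 2):
        if collab.has_edge(u, v):
            collab[u][v]["weight"] += 1
        else:
            collab.add_edge(u, v, weight=1)

print(collab.number_of_nodes(), collab.number_of_edges())  # 4 nodes, 5 edges here
```

Once the graph is built, standard network statistics (degree, connected components, community structure) can be computed per researcher and then examined alongside demographic attributes such as gender and ethnicity.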
