Inside Our Program
Program Events
-
Nov21
EVENT DETAILS
Description
To forge healthy and productive Human-AI ecosystems, researchers need to anticipate the nature of this interaction at every stage, both to stave off concerns of societal disruption and to usher in a harmonious future. A primary way AI is expected to become part of human life is by augmenting human capabilities rather than replacing them. What are the greatest potentials for this augmentation across fields, and what ought to be its limits? In the short term, AI is expected to continue to rely on the vast recorded and demonstrated knowledge and experience of people. How can the contributors of this knowledge feel adequately protected in their rights and compensated for their role in ushering in AI? As these intelligent systems are woven into people's lives and livelihoods, insight into how they operate and what they know becomes crucial for establishing trust and regulating them. How can human privacy be maintained in such pervasive ecosystems, and is it possible to interpret the operations, thoughts, and actions of AI? IDEAL will address these critical questions in a three-part workshop as part of its Fall 2024 Special Program on Interpretability, Privacy, and Fairness, spanning three days across three IDEAL campuses.
TIME Thursday, November 21, 2024 at 8:30 AM - 5:00 PM
CONTACT Indira Munoz indira.munoz@northwestern.edu
CALENDAR Department of Computer Science (CS)
-
Nov21
EVENT DETAILS
Mix and mingle with fellow CS students and faculty every last Thursday of the month.
TIME Thursday, November 21, 2024 at 9:00 AM - 11:00 AM
LOCATION 3514, Mudd Hall (formerly Seeley G. Mudd Library)
CONTACT Wynante R Charles wynante.charles@northwestern.edu
CALENDAR Department of Computer Science (CS)
-
Nov22
EVENT DETAILS
Description
To forge healthy and productive Human-AI ecosystems, researchers need to anticipate the nature of this interaction at every stage, both to stave off concerns of societal disruption and to usher in a harmonious future. A primary way AI is expected to become part of human life is by augmenting human capabilities rather than replacing them. What are the greatest potentials for this augmentation across fields, and what ought to be its limits? In the short term, AI is expected to continue to rely on the vast recorded and demonstrated knowledge and experience of people. How can the contributors of this knowledge feel adequately protected in their rights and compensated for their role in ushering in AI? As these intelligent systems are woven into people's lives and livelihoods, insight into how they operate and what they know becomes crucial for establishing trust and regulating them. How can human privacy be maintained in such pervasive ecosystems, and is it possible to interpret the operations, thoughts, and actions of AI? IDEAL will address these critical questions in a three-part workshop as part of its Fall 2024 Special Program on Interpretability, Privacy, and Fairness, spanning three days across three IDEAL campuses.
TIME Friday, November 22, 2024 at 8:30 AM - 5:00 PM
CONTACT Indira Munoz indira.munoz@northwestern.edu
CALENDAR Department of Computer Science (CS)
-
Dec2
EVENT DETAILS
Monday / CS Seminar
December 2nd / 12:00 PM
Hybrid / Mudd 3514
Speaker
Izzy Grosof, Northwestern University
Talk Title
Multiserver Stochastic Scheduling for Large-Scale Computing
Abstract
Large-scale computing systems are massively important, consuming over 1% of the world's electricity. It is vital that we can forecast the performance of these systems and control them to be both fast and resource-efficient. Stochastic modeling and scheduling theory are key tools toward that goal. Prior scheduling theory is not equipped to handle large multiserver systems: little is known about modeling the performance of such systems, much less about how to control them optimally.
I'll cover my recent results in two important multiserver models. First, the M/G/k, a model where each job requires the same amount of resources. Next, the multiserver-job model, where different jobs require different amounts and kinds of resources.
In each model, I'll present theoretically-motivated scheduling policies which are practical to apply to real systems, from datacenters to microservices architectures. Our results include optimal performance in the limit as system load approaches capacity, as well as throughput optimality and performance analysis.
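As a rough illustration (not taken from the talk), the M/G/k model the abstract mentions can be sketched with a short simulation: Poisson arrivals, general i.i.d. service times, and k identical servers under first-come-first-served. The function name, parameters, and the M/M/4 example below are illustrative assumptions, not part of the speaker's work.

```python
import heapq
import random

def simulate_mgk_fcfs(lam, service_sampler, k, n_jobs, seed=0):
    """Simulate an FCFS M/G/k queue: Poisson arrivals at rate lam,
    i.i.d. service times drawn from service_sampler, k identical servers.
    Returns the mean response time (waiting + service) over n_jobs jobs."""
    rng = random.Random(seed)
    # Each server is represented by the time at which it next becomes free.
    free_at = [0.0] * k
    heapq.heapify(free_at)
    t = 0.0
    total_response = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(lam)               # next Poisson arrival
        start = max(t, heapq.heappop(free_at))  # earliest server to free up
        service = service_sampler(rng)
        heapq.heappush(free_at, start + service)
        total_response += (start + service) - t
    return total_response / n_jobs

# Example: an M/M/4 system at load 0.8 (per-server service rate 1, lambda = 3.2)
mean_T = simulate_mgk_fcfs(3.2, lambda rng: rng.expovariate(1.0), k=4, n_jobs=200_000)
```

Under FCFS with identical servers, each job starts at the later of its arrival time and the earliest server-free time, which is what the min-heap recursion computes; for the M/M/4 example the simulated mean response time should land near the Erlang-C prediction of roughly 1.75.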
Biography
Izzy Grosof is a new assistant professor in the IEMS department here at Northwestern. Their research focuses on stochastic queueing, resource allocation, optimal scheduling, and performance analysis. They received their PhD in CS from Carnegie Mellon in 2023 and subsequently completed postdocs at Georgia Tech and UIUC. Their research has received Best Student Paper, Best Paper, and Best Dissertation awards from the INFORMS, SIGMETRICS, and Performance conferences.
Research/Interest Areas: Stochastic queueing theory for large-scale computing, emphasizing scheduling theory
---
Zoom: https://northwestern.zoom.us/j/92919202717?pwd=eMIa3miqyHV7KnbzFKEhbYU39sB35V.1
Panopto: https://northwestern.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=7047b451-f78b-4e29-9015-b22e015579c8
DEI Minute: tinyurl.com/cspac-dei-minute
TIME Monday, December 2, 2024 at 12:00 PM - 1:00 PM
LOCATION 3514, Mudd Hall (formerly Seeley G. Mudd Library)
CONTACT Wynante R Charles wynante.charles@northwestern.edu
CALENDAR Department of Computer Science (CS)
-
Dec4
EVENT DETAILS
Wednesday / CS Distinguished Lecture
December 4th / 12:00 PM
Hybrid / Mudd 3514
Speaker
Michael Franklin, University of Chicago
Talk Title
Enabling Diverse and Interconnected Data Ecosystems with Lakehouses, Open Formats and AI
Abstract
Data and AI are driving innovation across industry, science, and society. At the same time, data management technology has changed dramatically, due to a combination of factors such as cloud-based data lakes, the development of open standards for data formats and catalogs, and the opportunities presented by generative AI and other AI technologies. The result of all of these changes is an increasingly vibrant data ecosystem that has the potential to span traditional information silos and data incompatibilities, enabling people and organizations to leverage all of their data in ways that, until now, simply have not been possible. In this talk I'll survey this new landscape and give an overview of some of the (to my mind) most promising directions for research and innovation in this rapidly advancing area.
Biography
Michael J. Franklin is the Morton D. Hull Distinguished Service Professor of Computer Science and Founding Faculty Co-Director of the Data Science Institute at the University of Chicago. He is also a Founding Advisor and Visiting Researcher at Databricks, Inc. At Chicago he served as Liew Family Chair of Computer Science, overseeing the department's rapid expansion in scope and stature. Previously he was the Thomas M. Siebel Professor of Computer Science at UC Berkeley, where he directed the Algorithms, Machines and People Laboratory (AMPLab) and was part of the original team building Apache Spark, a popular open source data processing framework initiated at the lab. He is a Member of the American Academy of Arts and Sciences and a Fellow of the ACM and the American Association for the Advancement of Science. He has served on the Board of the Computing Research Association and the NSF CISE Advisory Committee, as well as the ACM Fellows Selection Committee. He received the 2022 ACM SIGMOD Systems Award and is a two-time recipient of the ACM SIGMOD "Test of Time" award. He holds a Ph.D. in Computer Sciences from the Univ. of Wisconsin (1993).
Research/Interest Areas: Data systems, Data Science, Systems and AI/ML, Data Markets
---
Zoom: https://northwestern.zoom.us/j/92065267938
Panopto: https://northwestern.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=4eaec4b2-647d-4fba-8b8c-b22701674df8
DEI Minute: Allyship and Identity tinyurl.com/cspac-dei-minute
TIME Wednesday, December 4, 2024 at 12:00 PM - 1:00 PM
LOCATION 3514, Mudd Hall (formerly Seeley G. Mudd Library)
CONTACT Wynante R Charles wynante.charles@northwestern.edu
CALENDAR Department of Computer Science (CS)