Office of the Vice Provost for Academic Innovation (OVPAI)

GenAI Advisory Council

The rapid advancement of generative artificial intelligence (AI) is reshaping industries, and higher education is no exception. To harness this potential and mitigate the associated risks, Cornell University has established the Generative AI Advisory Council, a group of cross-campus leaders that will provide strategic guidance on the ethical use of AI, ensure alignment with the university’s mission, and foster generative AI collaboration and learning within and across education, research, and administration. Composed of three key working groups, the council will focus on shaping policy, enhancing academic integrity, and supporting the innovative application of AI technologies within the university.

The council’s work will emphasize interdisciplinary collaboration, engaging a broad range of stakeholders—including faculty, students, and industry partners—to address the unique challenges and opportunities AI presents. From informing AI governance to fostering responsible research and advancing classroom innovation, the council will be instrumental in guiding the university’s approach to this transformative technology. Through a two-year pilot phase, the council will evaluate and refine its strategies to ensure Cornell remains at the forefront of AI integration in higher education.

The Generative AI Advisory Council at Cornell will provide strategic guidance on AI initiatives. It will oversee pilots, projects, and technology implementation while producing regular reports for key academic and administrative entities. The council will foster a collaborative environment to support the responsible use of AI in education, research, and administration, and ensure alignment with the university’s goals and values. This includes promoting interdisciplinary innovation in classrooms, advancing cross-disciplinary research discussions, and optimizing AI’s role in administrative services. Engaging with stakeholders both inside and outside the university, the council will facilitate informed dialogue on AI’s opportunities and challenges, while managing potential risks associated with AI deployment in an academic setting.

Generative AI Advisory Council Education Working Group

Purpose

The council will advise on several elements of generative AI use across the Cornell system (including its Ithaca, New York City, and Geneva locations). The Clinical AI program (in which Weill Cornell participates with NYP and Columbia) is separate from, but adjacent to, the Advisory Council. Given the experimental and rapidly evolving nature of generative AI and its use in higher education, we recommend an initial two-year term of participation, followed by an evaluation.

Objectives of the AI Advisory Council and Sub/Working Groups for Education, Research & Administration

  1. Strategic Guidance: Provide strategic direction for generative AI initiatives, pilots, projects, and technology and personal services, ensuring alignment with the institution’s goals and values.
  2. Reporting and Accountability: Produce regular reports and updates for academic and administrative entities, including the Faculty Senate, Deans Council, Provost Council, President’s Cabinet, and interested school- or unit-based groups.
  3. Policy Development: Assist in the formulation of university policies related to the ethical use of generative AI, including issues of academic integrity, privacy, security, intellectual property, and compliance with relevant laws and regulations.
  4. Education/Teaching: Foster a collaborative environment to assess and support the appropriate use of generative AI in teaching and learning across a wide range of disciplinary fields and classroom environments; encourage interdisciplinary efforts and innovation, as well as appropriate awareness of the benefits and risks of generative AI in education; and respond to generative AI challenges and opportunities in the classroom.
  5. Research and Innovation: Foster collaborative and cross-disciplinary discussions around the appropriate and innovative use of AI across a range of disciplinary research fields, including in relation to the responsible conduct of research and research data security.
  6. Administration: Foster a collaborative environment for generative AI use in supporting improved services, data access, efficiencies, and staff education, in order to optimize the use of generative AI, balance potential risks, and equip staff with new skills and knowledge.
  7. Community Engagement: Engage with internal and external stakeholders, including students, faculty, industry partners, and policymakers, to promote informed dialogue on generative AI. 
  8. Risk Management: Identify and address potential risks and challenges associated with AI deployment in the academic setting. 

Members

  • Andrea Stevenson Won, Associate Professor, Department of Communication
  • Austin Bunn, Associate Professor in the Department of Performing and Media Arts and the Director of the Milstein Program in Technology and Humanity
  • Michael Clarkson, Provost’s Teaching Fellow & Senior Lecturer
  • Jenny Sabin, Professor; Chair of the Department of Design Tech; Arthur L. and Isabel B. Wiesenberger Professor in Architecture
  • William Lai, Assistant Research Professor, Molecular Biology and Genetics, Computational Biology
  • Rene Kizilcec, Associate Professor in the Ann S. Bowers College of Computing and Information Science
  • Becky Lane, Associate Director for Learning Technologies, Center for Teaching Innovation
  • Malte Ziewitz, Associate Professor and Director, Digital Due Process Clinic
  • Paul Krause, Vice Provost for External Education and Executive Director, eCornell
  • Michelle Trillium Crow, Senior Lecturer, Director, English Language Support Office
  • Doug Cohen, Director, Educational Computing
  • Amie Patchen, Environment, Climate & Health Concentration Chief, Lecturer
  • Rob Vanderlan, Executive Director, Center for Teaching Innovation
  • David Alan Goldberg, Associate Professor, Operations Research and Information Engineering
  • Deirdre Gobeille Snyder, Lecturer in Marketing and Management Communication
  • Tara Holm, Professor, Mathematics
  • Rebecca Joffrey, IT Innovation Officer
  • Thurayya Arayssi, Senior Associate Dean for Medical Education and CPD, Professor of Clinical Medicine, Weill Cornell Medicine-Qatar
  • Carol Grumbach, Academic Programs and Policy Consultant
  • Amy Cheatle, Instructional Designer- Center for Teaching Innovation
  • Lain Nelson, Student Representative
  • Ayham Boucher, Head of AI Innovations for Cornell Information Technologies (CIT) at the Ithaca campus and Information Technologies & Services (ITS) at Weill Cornell Medicine
  • Liang Shen (lis9128@med.cornell.edu), Assistant Professor of Clinical Anesthesiology at Weill Cornell Medicine