
VET422 Practical use of artificial intelligence in biomedical research

Credits (ECTS): 5

Responsible faculty: Veterinærhøgskolen

Course responsible: Andrew Michael Janczak

Campus / Online: Taught on campus at Ås

Teaching language: English

Limit of class size: 50

Course frequency: Monthly meetings during teaching periods.

Nominal workload: Students are welcome to attend all meetings and work seminars throughout their PhD period, but they must participate in a minimum of 10 meetings and the associated work sessions. To fulfill the course's total workload of 125 hours, the independent work is standardized at 12.5 hours per meeting (including teaching time). This entails significant preparation and follow-up work in the form of documented preparation, practical testing of AI tools on the student's own research-related tasks, oral presentations, and active participation. Students choose the meetings that are most relevant to their own research and work situation. The compulsory work can be completed within one year or spread over a longer period.

Teaching and exam period: The course is offered continuously throughout the year and has no fixed start or end date. Credits are awarded once the coursework has been completed and approved.

About this course

The course is organized as a seminar series with monthly meetings for PhD students and staff at NMBU. The goal is to build practical competence in the use of generative artificial intelligence as a support tool across academic activities.

The course provides an introduction to how language models work, their opportunities and limitations, and how they can be used responsibly in various parts of a researcher's daily work.

Topics include:

Prompt engineering and structuring instructions

AI as support for literature work and academic writing

AI for qualitative analysis of text and unstructured data

Pattern recognition and conceptual method development (including the design of research protocols and logical structuring of work processes)

Development of teaching materials and assessment tasks (including the legal distinction between generating tasks and the High-Risk activity of using AI to grade or evaluate students)

AI for administrative tasks such as grant writing, reporting, project planning, and navigating High-Risk rules in recruitment

AI in innovation processes and commercialization of research

Quality assurance and mandatory labeling of AI-generated content

Institutional and European guidelines for responsible use

Privacy and data security

Ethical considerations related to AI in academic practice

Furthermore, the course introduces agentic AI and autonomous workflows, covering the methodological complexities, the stricter requirements for manual quality assurance, and the legal implications of deploying such systems under the EU AI Act.

Learning outcome

Knowledge:

The candidate can explain how large language models work at a conceptual level and identify their main limitations, including hallucinations, biases, and a lack of genuine understanding.

The candidate is familiar with institutional and regulatory frameworks for the use of AI in research, teaching, and administration, including NMBU's guidelines and the EU AI Act (KI-forordningen). This includes understanding the strict legal requirements for High-Risk AI (such as student assessment and employment decisions) versus exempt scientific research AI.

The candidate knows where to find further support and resources for responsible AI use.

The candidate understands the methodological and regulatory distinctions between using standard AI models and autonomous AI agents, including how deploying agents may trigger stricter requirements for human oversight and alter legal roles (e.g., transitioning into a 'downstream provider') under the EU AI Act.

Skills:

The candidate can construct effective instructions (prompts) for a range of academic tasks, including literature reviews, writing support, data analysis, method development, the development of teaching materials, and administrative tasks such as grant writing and reporting.

The candidate can critically evaluate AI-generated content, apply appropriate quality assurance procedures, and ensure compliance with transparency and labeling requirements for synthetic content.

The candidate can configure AI tools for specific purposes using structured instruction sets.

The candidate can independently assess when and how AI tools should be used in different parts of academic work, including when a Fundamental Rights Impact Assessment (FRIA) is required.

General competence:

The candidate can make informed decisions about the appropriate use of AI tools across the different phases of research, teaching, administration, and innovation.

The candidate can contribute to discussions on responsible AI use within their academic environments.

The candidate can critically assess the risks, benefits, and compliance requirements of integrating agentic AI into their workflows, taking full personal responsibility for processes that operate with a degree of autonomy.

The candidate can stay updated on developments in AI tools and adapt their practice in line with new opportunities and guidelines.

  • Learning activities

    The course is organized as monthly researcher meetings, followed by work sessions reserved for PhD candidates and postdoctoral researchers. Topics are introduced and discussed with practical demonstrations and exercises, and researchers from all career stages are invited to participate in this first part. The subsequent work sessions are tailored for early-career researchers and consist of demonstrations, practical exercises, and discussions, giving students ample opportunity to explore tools and share experiences with researchers from other disciplines in an informal setting.
  • Teaching support

    All meetings and discussions are facilitated, but students are encouraged to participate in the planning and execution of the open meetings (e.g., suggesting topics or demonstrating their own applications).
  • Syllabus

    The syllabus is provided as a set of relevant resources and background material, and is published in Canvas before the meetings. The resources include institutional guidelines, selected articles, and documentation for relevant tools. Students choose the resources that are most relevant to them.
  • Prerequisites

    Master's degree.
  • Assessment method

    The course uses portfolio assessment. The portfolio consists of the student's written notes and reflections related to the testing of AI tools on their own research-related tasks. The contents of the portfolio are assessed as a whole as passed/failed. The student will not have their grade released or be awarded credits for the course until all required compulsory activities have been completed and approved.
  • About use of AI

    Full use of AI. The use of AI is permitted but must comply with NMBU's guidelines for the use of artificial intelligence (AI). The use of AI tools is central to this course. Students are expected to explore and apply AI tools as part of the learning activities. All use must be documented and reflected upon in accordance with the course's learning objectives and NMBU's guidelines for responsible AI use.

    Please note: When exploring and applying external AI tools that are not provided or formally assessed by NMBU's IT department, students must strictly adhere to NMBU's data classification rules. Only information classified as 'green/open' (non-sensitive data that can be shared freely) may be entered into these external tools. Furthermore, in accordance with the statutory principle of free higher education, students are not required to purchase premium subscriptions for any external AI tools. All compulsory activities and the portfolio assessment can be fully completed using either the free versions of external tools or AI tools currently licensed and provided by NMBU.

    Descriptions of AI-category codes.

  • Examiner scheme

    An external examiner approves the assessment scheme.
  • Mandatory activity

    Participation in a minimum of 10 researcher meetings and associated work sessions (a total of 20 hours). As part of the compulsory sessions, the student must conduct individual oral presentations, provide peer feedback, and prepare academic material. Compulsory activities are assessed as Approved / Not approved.
  • Notes

    Continuous enrollment
  • Teaching hours

    Approximately 2 hours a month. The time is agreed upon at the start of the semester.
  • Preferential right

    Priority is given to PhD candidates at VET.
  • Reduction of credits

    The course may overlap with other PhD courses that cover digital competence or research methodology where AI tools are included as part of the syllabus.