Secure Your AI: Threat Modeling is an intermediate course for architects and engineers tasked with protecting complex AI systems. This course moves beyond reactive security, teaching you to build resilience directly into your designs. You will master the critical architectural decision of secret management by performing a deep-dive comparison of self-hosted solutions like Vault and managed cloud services like AWS Secrets Manager. You will learn to create a full Total Cost of Ownership (TCO) analysis and use compliance and performance data to make a justifiable, portfolio-ready recommendation.


Secure Your AI: Threat Modeling
This course is part of Agentic AI Development & Security Specialization

Instructor: LearningMate
What you'll learn
Learners will apply threat modeling and architectural analysis to select secret management solutions and mitigate risks in AI systems using STRIDE.
Skills you'll gain
Details to know

Add to your LinkedIn profile
January 2026

Build your subject-matter expertise
- Learn new concepts from industry experts
- Gain a foundational understanding of a subject or tool
- Develop job-relevant skills with hands-on projects
- Earn a shareable career certificate

There are 2 modules in this course
This module tackles the critical architectural decision of secret management in AI systems. You will first understand the “why”: the security and operational necessity of a dedicated secret store. You will then learn the “what” of the two primary models: self-hosted (e.g., HashiCorp Vault) versus managed cloud services (e.g., AWS Secrets Manager). The focus then shifts to the “how” of comparing them across TCO, compliance, and operational overhead, and to structuring a professional recommendation. The module culminates in the creation of a justified, portfolio-ready technical recommendation.
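The TCO comparison described above can be sketched in a few lines. This is a minimal illustration only: the cost categories (infrastructure and operations labor for self-hosted; per-secret storage and API charges for managed) reflect the comparison the module describes, but every figure and parameter name below is a hypothetical placeholder, not real Vault or AWS Secrets Manager pricing.

```python
# Hypothetical annual TCO sketch: self-hosted secret store (e.g., Vault)
# vs. a managed cloud service (e.g., AWS Secrets Manager).
# All numbers are illustrative placeholders, not real pricing.

def self_hosted_tco(servers: int, server_cost_month: float,
                    ops_hours_month: float, hourly_rate: float) -> float:
    """Annual TCO = infrastructure cost + operational labor."""
    infra = servers * server_cost_month * 12
    ops = ops_hours_month * hourly_rate * 12
    return infra + ops

def managed_tco(num_secrets: int, per_secret_month: float,
                api_calls_month: int, per_10k_calls: float) -> float:
    """Annual TCO = per-secret storage charges + API call charges."""
    storage = num_secrets * per_secret_month * 12
    calls = (api_calls_month / 10_000) * per_10k_calls * 12
    return storage + calls

if __name__ == "__main__":
    vault_like = self_hosted_tco(servers=3, server_cost_month=150.0,
                                 ops_hours_month=20, hourly_rate=80.0)
    managed_like = managed_tco(num_secrets=200, per_secret_month=0.40,
                               api_calls_month=500_000, per_10k_calls=0.05)
    print(f"Self-hosted annual TCO: ${vault_like:,.2f}")
    print(f"Managed annual TCO:     ${managed_like:,.2f}")
```

A real analysis would add compliance and migration costs, but even this toy model shows why labor often dominates the self-hosted side while usage-based fees dominate the managed side.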
What's included
2 videos, 2 readings, 3 assignments
You have secured your system's secrets; now it is time to proactively secure its design. This module shifts focus to threat modeling: the systematic process of securing an AI system by design. You will learn the “why” behind this proactive approach, then master the “how” of deconstructing an architecture into data flows and trust boundaries, and the structured “what” of applying the STRIDE framework to methodically identify and mitigate risks. The module culminates in the “apply” task of creating a real-world threat model for an agent system.
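The STRIDE pass the module describes can be pictured as a simple checklist applied at each trust-boundary crossing. This is a sketch under stated assumptions: the component names and per-category questions below are illustrative examples, not the course's own threat model.

```python
# Minimal STRIDE checklist sketch for one element of an AI agent system.
# Component names and threat questions are illustrative assumptions.

STRIDE = {
    "Spoofing": "Can a caller impersonate the agent or its tool APIs?",
    "Tampering": "Can prompts, tool outputs, or secrets be modified in transit?",
    "Repudiation": "Are agent actions logged so they cannot be denied later?",
    "Information disclosure": "Can secrets leak across a trust boundary?",
    "Denial of service": "Can the endpoint be exhausted by unbounded requests?",
    "Elevation of privilege": "Can prompt injection grant extra credentials?",
}

def threat_model(component: str, boundary_crossings: list[str]) -> list[str]:
    """List one STRIDE question per category for each trust-boundary crossing."""
    findings = []
    for boundary in boundary_crossings:
        for category, question in STRIDE.items():
            findings.append(f"[{component} -> {boundary}] {category}: {question}")
    return findings

for row in threat_model("LLM agent", ["vector store", "external tool API"]):
    print(row)
```

In practice each question would be answered with a concrete threat and a mitigation, but the structure (element, boundary, STRIDE category) is the skeleton of the deliverable.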
What's included
2 videos, 3 readings, 3 assignments
Earn a career certificate
Add this credential to your LinkedIn profile, resume, or CV. Share it on social media and in your performance review.
Frequently asked questions
To access the course materials and assignments, and to earn a Certificate, you will need to purchase the Certificate experience when you enroll in a course. You can try a Free Trial instead, or apply for Financial Aid. The course may offer 'Full Course, No Certificate' instead. This option lets you see all course materials, submit required assessments, and get a final grade. It also means that you will not be able to purchase a Certificate experience.
When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page - from there, you can print your Certificate or add it to your LinkedIn profile.
Yes. In select learning programs, you can apply for financial aid or a scholarship if you can’t afford the enrollment fee. If financial aid or a scholarship is available for your learning program selection, you’ll find a link to apply on the description page.
More questions
Financial aid available
¹ Some assignments in this course are AI-graded. For these assignments, your data will be used in accordance with Coursera's Privacy Notice.