Next.js Supabase Authentication: Complete Guide - Part 4
Secure your Supabase database with Prisma ORM, PostgreSQL triggers, and Row Level Security. Complete guide to database management and RLS policies.
The Day I Learned About Row Level Security
My authentication worked perfectly. Users could sign up, log in, everything. Then I realized something terrifying: with the public anon key, anyone could query my database directly. Sure, they needed to be authenticated, but authenticated users could see everyone's data.
That's when I discovered Row Level Security (RLS). It's PostgreSQL's way of saying "this row belongs to this user, and only they can access it." Game-changer.
In this final part of our series, we'll build a complete profile system using Prisma ORM, create database triggers that automatically generate profiles when users sign up, and secure everything with Row Level Security policies.
🔗 Check out the full code for this series here
Why We Need More Than Just Auth Tables
Supabase's auth system gives us basic user information: email, user ID, authentication metadata. But most apps need more:
- User profiles (name, bio, avatar)
- User preferences and settings
- App-specific data tied to users
We could stuff this into the auth metadata, but that's messy and limited. Better approach: a separate profiles table with proper relationships and security.
Setting Up Prisma ORM
Prisma makes working with PostgreSQL feel natural. Instead of writing SQL, we define our schema in a clean syntax, and Prisma handles the rest.
Installing Prisma
pnpm add @prisma/client
pnpm add -D prisma
The @prisma/client package is what we'll use in our code. The prisma dev dependency gives us the CLI tools for migrations.
Getting Your Database Connection String
We need to connect Prisma to our Supabase database. In your Supabase dashboard:
- Select your project
- Click "Database" in the left sidebar
- Click "Connect" at the top

You'll see a connection string that looks like:
postgresql://postgres:[YOUR-PASSWORD]@db.xxxxxxxxxxxx.supabase.co:5432/postgres
Copy this URL and update your .env.local file:
DATABASE_URL=postgresql://postgres:[YOUR-PASSWORD]@db.xxxxxxxxxxxx.supabase.co:5432/postgres?pgbouncer=true
DIRECT_URL=postgresql://postgres:[YOUR-PASSWORD]@db.xxxxxxxxxxxx.supabase.co:5432/postgres
Important details:
- Replace `[YOUR-PASSWORD]` with your actual database password
- The `DATABASE_URL` has `?pgbouncer=true` appended (for connection pooling)
- The `DIRECT_URL` is the same without pgbouncer (needed for migrations)
If you forgot your database password, reset it in Project Settings > Database > Reset Database Password. This won't affect your Supabase configuration.
Creating the Prisma Schema
Create a folder named prisma in your project root, and inside it, create schema.prisma:
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider  = "postgresql"
  url       = env("DATABASE_URL")
  directUrl = env("DIRECT_URL")
}

model Profile {
  id        String   @id @default(dbgenerated("gen_random_uuid()")) @db.Uuid
  firstName String?  @map("first_name")
  lastName  String?  @map("last_name")
  email     String   @unique
  createdAt DateTime @default(now()) @map("created_at")
  updatedAt DateTime @updatedAt @map("updated_at")

  @@map("profiles")
}
Let me break this down:
Understanding the Schema
generator client: Tells Prisma to generate a JavaScript/TypeScript client we can use in our code
datasource db: Configures the database connection using our environment variables
model Profile: Defines the profiles table structure
Field details:
- `id`: UUID primary key, auto-generated by PostgreSQL
- `firstName` and `lastName`: Optional strings (nullable with `?`)
- `email`: Required, unique; we'll sync this with Supabase auth
- `createdAt`: Automatically set when the record is created
- `updatedAt`: Automatically updated whenever the record changes
@map(): The database column name. We use snake_case in the database but camelCase in our code. Prisma handles the translation.
@@map("profiles"): The actual table name in PostgreSQL
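To make the naming translation concrete, here's an illustrative sketch of the camelCase-to-snake_case mapping in TypeScript. This is not Prisma's actual implementation; Prisma does this internally based on your `@map()` attributes.

```typescript
// Illustrative only: Prisma performs this translation for you via @map();
// you never call anything like this yourself.
function toSnakeCase(field: string): string {
  return field.replace(/[A-Z]/g, (c) => `_${c.toLowerCase()}`);
}

console.log(toSnakeCase('firstName')); // "first_name"
console.log(toSnakeCase('createdAt')); // "created_at"
```

So your code reads `profile.firstName` while the database column stays `first_name`, and neither side needs to know about the other's convention.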
Quick Prisma Tip
Install the Prisma VS Code extension for syntax highlighting and autocomplete. It makes working with Prisma schemas much nicer.
Running Database Migrations
Migrations are how we transform our Prisma schema into actual database tables. Let's set up a convenient script.
Add this to your package.json under scripts:
{
  "scripts": {
    "migrate": "prisma migrate dev --name init && prisma generate"
  }
}
This script does two things:
- Creates a migration and applies it to the database
- Generates the Prisma client code we'll use in our app
Run it:
pnpm migrate
You should see output showing Prisma creating the migration and applying it. The --name init part names this migration "init" (you can use any descriptive name).
Verifying the Table
Go back to your Supabase dashboard:
- Click "Database" in the left sidebar
- Click "Tables"
- You should see a `profiles` table!

If you click on it, you'll see the columns we defined in our schema. Beautiful.
Creating Automatic Triggers
Right now, our profiles table exists, but it's empty. We need profiles to be created automatically when users sign up. PostgreSQL triggers are perfect for this.
A trigger is a function that runs automatically when something happens in the database. We'll create two:
- Create Profile Trigger: Runs when a new user signs up
- Delete Auth Trigger: Runs when a profile is deleted (to clean up auth records)
Understanding the Flow
Here's what will happen:
On Signup:
- User signs up (via email or Google)
- Supabase creates a record in `auth.users`
- Our trigger fires
- Trigger creates a corresponding record in `public.profiles`
On Profile Delete:
- User's profile is deleted from `public.profiles`
- Our trigger fires
- Trigger deletes the corresponding record from `auth.users`
This keeps everything in sync automatically.
Writing the Trigger SQL
Go to your Supabase dashboard:
- Click "SQL Editor" in the left sidebar
- Click "New query"

Paste this SQL and click "Run":
-- Function to create profile when user signs up
create or replace function public.on_user_signup()
returns trigger as $$
begin
  insert into public.profiles (id, email, updated_at, first_name, last_name)
  values (
    new.id,
    new.email,
    now(),
    new.raw_user_meta_data->>'first_name',
    new.raw_user_meta_data->>'last_name'
  );
  return new;
end;
$$ language plpgsql security definer;

-- Trigger to run function when new user is created
create or replace trigger create_user_profile
after insert on auth.users
for each row execute procedure public.on_user_signup();

-- Function to delete auth user when profile is deleted
create or replace function public.on_profile_delete()
returns trigger as $$
begin
  delete from auth.users where id = old.id;
  return old;
end;
$$ language plpgsql security definer;

-- Trigger to run when profile is deleted
create or replace trigger delete_auth_user
after delete on public.profiles
for each row execute procedure public.on_profile_delete();
Understanding the Triggers
create or replace function: Defines a PostgreSQL function. The or replace means we can safely run this multiple times.
returns trigger: This is a special function type that's called by triggers.
new.id and new.email: In trigger functions, new refers to the new row being inserted.
raw_user_meta_data: Supabase stores extra signup data here. We're extracting first and last names if provided.
security definer: This function runs with the privileges of the user who created it (not the user who triggered it). Necessary for accessing the auth schema.
after insert on auth.users: Run this trigger after a new user is inserted.
for each row: Run once for each affected row.
old.id: In delete triggers, old refers to the row being deleted.
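Where does raw_user_meta_data come from? It's whatever you pass as `options.data` when calling `supabase.auth.signUp` from the client, as we did back in Part 2. A hedged sketch (the helper name is my own, not a Supabase API):

```typescript
// Hypothetical helper: buildSignUpPayload is an illustrative name, not part
// of supabase-js. Whatever you put in options.data is stored on auth.users
// as raw_user_meta_data, which is exactly what our trigger reads with
// ->>'first_name' and ->>'last_name'.
function buildSignUpPayload(
  email: string,
  password: string,
  firstName: string,
  lastName: string
) {
  return {
    email,
    password,
    options: {
      data: { first_name: firstName, last_name: lastName },
    },
  };
}

// Usage with a supabase-js client from earlier parts of this series:
// await supabase.auth.signUp(buildSignUpPayload(email, password, first, last));
```

If you sign up without passing those fields (Google OAuth, for example), the `->>` extractions return NULL and the profile is created with empty names, which is exactly what the nullable columns allow.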
Testing the Triggers
Let's test this:
- Sign up with a new account (use a different email than before)
- Go to Supabase Dashboard > Database > Tables
- Look at the `profiles` table
- You should see a new profile with your email!

The trigger worked! When Supabase created your auth record, our trigger automatically created your profile.
Testing the Delete Trigger
Try deleting a profile:
- In the Table Editor, delete one of the profile rows
- Go to Authentication > Users
- That user should be gone from the auth table too
Both directions work automatically. Magical.
Implementing Row Level Security
Here's where we lock down our database. Row Level Security (RLS) is PostgreSQL's way of restricting which rows users can access.
Why RLS Matters
Remember that anon public key in your .env file? It's public by design. Users need it to communicate with Supabase from the browser. But without RLS, anyone with that key could query your database directly using Supabase's JavaScript client.
With RLS enabled, PostgreSQL checks every query. Even if someone has your anon key, they can only access rows their policies allow.
Understanding RLS Policies
An RLS policy is like a WHERE clause that PostgreSQL automatically adds to every query. For example:
CREATE POLICY read_own_profile ON profiles
FOR SELECT
USING (auth.uid() = id);
This policy says: "For SELECT queries on the profiles table, only return rows where the user's ID matches the row's ID."
The auth.uid() function returns the currently authenticated user's ID. Supabase sets this automatically based on the JWT token in the request.
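Conceptually, RLS is the database filtering rows on your behalf. This toy TypeScript simulation is not how Postgres implements it, but it shows the effect of the SELECT policy above:

```typescript
// Toy model of the RLS SELECT policy: auth.uid() = id.
// Postgres applies this inside the database; this only illustrates the effect.
type ProfileRow = { id: string; email: string };

function selectWithRls(rows: ProfileRow[], authUid: string | null): ProfileRow[] {
  // Unauthenticated requests (no uid) match no rows under this policy.
  return rows.filter((row) => authUid !== null && row.id === authUid);
}

const allRows: ProfileRow[] = [
  { id: 'user-1', email: 'alice@example.com' },
  { id: 'user-2', email: 'bob@example.com' },
];

// "SELECT * FROM profiles" as user-1 returns only user-1's row.
console.log(selectWithRls(allRows, 'user-1'));
```

The key point: the client asked for every row, but the policy decided what came back. Your application code never had a chance to leak the wrong data.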
Enabling RLS and Creating Policies
Go to your SQL Editor in Supabase and run this:
-- Enable RLS on profiles table
ALTER TABLE profiles ENABLE ROW LEVEL SECURITY;
-- Policy: Users can read their own profile
CREATE POLICY read_own_profile ON profiles
FOR SELECT
USING (auth.uid() = id);
-- Policy: Users can update their own profile
CREATE POLICY update_own_profile ON profiles
FOR UPDATE
USING (auth.uid() = id);
-- Policy: Users can delete their own profile
CREATE POLICY delete_own_profile ON profiles
FOR DELETE
USING (auth.uid() = id);
What Each Policy Does
read_own_profile: When a user queries the profiles table, they only see their own profile. Even if they write a query like SELECT * FROM profiles, PostgreSQL only returns their row.
update_own_profile: Users can only update profiles where id matches their user ID. Attempts to update other profiles simply affect zero rows: PostgreSQL raises no error, because the other rows are invisible to the query.
delete_own_profile: Users can only delete their own profile.
Verifying RLS is Enabled
Go to Database > Tables in your Supabase dashboard. Look at the profiles table. The lock icon should now be closed, indicating RLS is enabled.

Testing RLS
Let's verify it works:
- Log in to your app as one user
- Open browser dev tools
- Try to query all profiles using the Supabase client:
const supabase = createClient();
const { data, error } = await supabase.from('profiles').select('*');
console.log(data);
You should only see your own profile, even though you queried all profiles. That's RLS working perfectly.
Understanding the Security Model
Let me explain the full security picture we've built:
Layer 1: Authentication
Users must be authenticated to access your app. Supabase verifies their JWT token on every request.
Layer 2: Route Protection
Our Next.js middleware redirects unauthenticated users away from protected routes before they even see the page.
Layer 3: Row Level Security
Even if someone bypasses the frontend, PostgreSQL ensures they can only access their own data. This is your last line of defense.
Layer 4: Type Safety
Prisma gives us type-safe database access. If we try to query a column that doesn't exist, TypeScript catches it at compile time.
This is defense in depth. Multiple layers protecting your users' data.
Best Practices for RLS
Always Enable RLS on User Data
Any table containing user-specific data should have RLS enabled. If you forget, anyone can access anyone's data.
Use Helper Functions
For complex policies, create SQL functions:
-- Drop the earlier update policy so we can recreate it using the helper
DROP POLICY IF EXISTS update_own_profile ON profiles;

CREATE FUNCTION is_profile_owner(profile_id UUID)
RETURNS BOOLEAN AS $$
BEGIN
  RETURN auth.uid() = profile_id;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

CREATE POLICY update_own_profile ON profiles
FOR UPDATE
USING (is_profile_owner(id));
This makes policies more readable and reusable.
Test Your Policies
Always test RLS policies with different users. Try to access data you shouldn't be able to see. If you can, your policy has a hole.
Server-Side Bypasses
When using Supabase's service role key (not the anon key), RLS is bypassed. This is useful for admin operations but dangerous. Only use the service role key in secure server-side code, never in the browser.
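One cheap safeguard is making it impossible to even read the service role key outside the server. A minimal sketch, assuming the conventional env var name `SUPABASE_SERVICE_ROLE_KEY` (check what your own setup uses):

```typescript
// Guard: throw rather than ever expose the service role key in a browser.
// SUPABASE_SERVICE_ROLE_KEY is assumed to live only in server-side env.
// Note it has no NEXT_PUBLIC_ prefix, so Next.js won't bundle it for clients.
export function getServiceRoleKey(): string {
  if (typeof window !== 'undefined') {
    throw new Error('Service role key must never be used in the browser');
  }
  const key = process.env.SUPABASE_SERVICE_ROLE_KEY;
  if (!key) {
    throw new Error('SUPABASE_SERVICE_ROLE_KEY is not set');
  }
  return key;
}
```

Any admin-only Supabase client you build would call this getter, so a stray import into client code fails loudly instead of silently shipping the key.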
When NOT to Use RLS
RLS is perfect for user-specific data, but it's not ideal for:
- Public data: If everyone should see it, don't waste CPU cycles on RLS checks
- Complex permissions: If your permission logic is very complex, handle it in your application code instead
- Performance-critical queries: RLS adds overhead. For tables with millions of rows, profile your queries
For our authentication system, RLS is perfect. Each user has one profile, and they should only access their own.
Using Prisma in Your App
Now that everything's set up, let's use Prisma to query profiles. Create utils/prisma.ts:
import { PrismaClient } from '@prisma/client';
const globalForPrisma = global as unknown as { prisma: PrismaClient };
export const prisma = globalForPrisma.prisma || new PrismaClient();
if (process.env.NODE_ENV !== 'production') globalForPrisma.prisma = prisma;
This creates a single Prisma client instance that's reused across requests. Creating a new client every time would exhaust database connections.
Example: Fetching User Profile
In a Server Component or Server Action:
import { prisma } from '@/utils/prisma';
import { createClient } from '@/utils/supabase/server';
export async function getUserProfile() {
  const supabase = await createClient();
  const { data: { user } } = await supabase.auth.getUser();
  if (!user) return null;

  const profile = await prisma.profile.findUnique({
    where: { id: user.id },
  });
  return profile;
}
Notice how Prisma gives us full TypeScript types. profile has proper types for firstName, lastName, etc.
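To make that type-safety claim concrete, the generated type looks roughly like the hand-written mirror below (in real code you'd import the generated one from @prisma/client instead), and any helper built on it gets full compile-time checking:

```typescript
// Hand-written mirror of what Prisma generates for our Profile model.
// In real code: import type { Profile } from '@prisma/client';
type Profile = {
  id: string;
  firstName: string | null;
  lastName: string | null;
  email: string;
  createdAt: Date;
  updatedAt: Date;
};

// TypeScript catches typos like profile.firstname at compile time,
// and forces us to handle the nullable name fields explicitly.
function displayName(profile: Profile): string {
  const parts = [profile.firstName, profile.lastName].filter(
    (p): p is string => p !== null
  );
  return parts.length > 0 ? parts.join(' ') : profile.email;
}
```

A profile with no names set falls back to the email, so the UI never renders an empty string.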
What We've Accomplished
Let's recap this entire series. We built a complete, production-ready authentication system:
Part 1: Created beautiful authentication forms with ShadCN UI, React Hook Forms, and Zod validation
Part 2: Integrated Supabase, created server actions for email authentication, and built middleware for route protection
Part 3: Added Google OAuth for seamless social login
Part 4 (Today):
- ✅ Set up Prisma ORM for type-safe database access
- ✅ Created a profiles table with automatic migrations
- ✅ Built PostgreSQL triggers for automatic profile creation
- ✅ Implemented Row Level Security to protect user data
- ✅ Created a complete, secure data model
This is enterprise-grade authentication. You can deploy this to production with confidence.
Going Further
Want to extend this system? Here are some ideas:
Add More OAuth Providers
Supabase supports GitHub, GitLab, Bitbucket, Azure, and more. The setup is similar to Google—create an OAuth app, get credentials, configure Supabase.
Email Verification
Add email verification requirements before users can access certain features. Supabase provides this in authentication settings.
Two-Factor Authentication
Supabase supports 2FA. Enable it in your project settings and add the UI for users to set it up.
Audit Logs
Create a profiles_history table and use triggers to log all changes to user profiles. Great for compliance and debugging.
Profile Pictures
Add an avatar_url field to profiles and use Supabase Storage for image uploads. RLS works for storage too!
Final Thoughts
Building authentication used to take weeks. We just did it in four blog posts.
The stack we used—Next.js, Supabase, Prisma—works incredibly well together. Each piece handles what it's best at:
- Next.js: Server-side rendering and routing
- Supabase: Authentication and PostgreSQL hosting
- Prisma: Type-safe database access
- PostgreSQL: Rock-solid data storage with RLS
I've used this exact stack for multiple production applications. It scales well, performs great, and the developer experience is fantastic.
Thank you for following this series. I hope you build something amazing with what you've learned.
🔗 Check out the full code for this series here
Authentication is just the beginning. If you found this series helpful, share it with others building web applications. Connect with me on Twitter or LinkedIn to see what I build next.
Support My Work
If this guide helped you with this topic, I'd really appreciate your support! Creating comprehensive, free content like this takes significant time and effort. Your support helps me continue sharing knowledge and creating more helpful resources for developers.
☕ Buy me a coffee - Every contribution, big or small, means the world to me and keeps me motivated to create more content!