Since its launch in November 2024, there’s been a growing buzz around Model Context Protocol (MCP) — modelcontextprotocol.io. According to their docs: "MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications." In short, it’s a way to structure how you feed context and logic into LLM applications.

Most examples of MCP usage I’ve seen so far are isolated or local — not built for scale. Our primary use case for MCP was within Cursor, an AI-powered IDE (Integrated Development Environment). You can access an MCP server via an SSE (Server-Sent Events) transport connection in Cursor; the same applies to Claude Desktop and several other clients. However, the server must be run locally, and you must specify a localhost port for the SSE connection.
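In Cursor, for example, an SSE server is configured in .cursor/mcp.json (or the global ~/.cursor/mcp.json) with nothing more than a URL. The server name, port, and /sse path below are placeholders; use whatever your server actually exposes:

    {
      "mcpServers": {
        "our-mcp-server": {
          "url": "http://localhost:3030/sse"
        }
      }
    }

Notice there is no obvious place in that configuration to put credentials, which is exactly the problem described below.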

At the time we created this hosting solution, there was no official or unofficial support for authentication or authorization. According to the MCP roadmap, those features were expected sometime in H1 2025. As of this writing (April 10, 2025), the MCP documentation still does not outline a proper solution for authentication or authorization over the SSE transport.

Below is an image of the recommendations for Authentication, Authorization, and Network Security from the MCP documentation.

The Problem

MCP allows you to run an SSE transport server that integrates seamlessly with developer tools like Cursor. But because of the way MCP communicates over SSE, there is no reliable way to pass authentication headers or tokens, and user-based access control isn’t built in.

So unless you're the only one using the server, or you're okay exposing it publicly, there's no clean, secure way to share it over the public internet.

Workaround: Standard Input/Output (stdio)

You can avoid SSE and use the standard input/output (stdio) transport instead. But this approach requires each user to have the MCP server’s source code locally and to run it from a specified path (see the example configuration after the list below). This introduces several problems:

  • Version drift between users

  • Security concerns if you don’t want to expose all source code

  • More setup friction
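For comparison, here is roughly what a stdio setup looks like in Cursor; the server name and file path are placeholders:

    {
      "mcpServers": {
        "our-mcp-server": {
          "command": "npx",
          "args": ["ts-node", "/absolute/path/to/mcp_server.ts"]
        }
      }
    }

Every developer has to have that source file at that path, and keep it up to date, by hand.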

Our Network Layer Solution

We’re using Google Cloud Run with IAM-based authentication. Essentially, we host the MCP server on Cloud Run, lock it down so that only identities with the right IAM permissions can invoke it, and then use a local proxy that authenticates through the Google Cloud SDK and tunnels requests to the Cloud Run service via localhost. All SSE connections flow through this proxy.
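Concretely, the Cloud Run side of that lockdown amounts to something like the following; the service name, image, region, and email are placeholders rather than values from our setup:

    # Deploy the MCP server with unauthenticated access disabled
    gcloud run deploy mcp-server \
      --image us-docker.pkg.dev/PROJECT_ID/mcp/mcp-server \
      --region us-central1 \
      --no-allow-unauthenticated

    # Grant a teammate permission to invoke the service
    gcloud run services add-iam-policy-binding mcp-server \
      --region us-central1 \
      --member "user:dev@yourcompany.com" \
      --role "roles/run.invoker"

With --no-allow-unauthenticated, Cloud Run rejects any request that does not carry an identity token from a principal holding roles/run.invoker.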

It’s a neat, simple workaround that lets us safely share the server without exposing it publicly — no passwords, no keys, no guesswork. Just Google Cloud IAM (Identity and Access Management).

We run a separate MCP server for each of our clients, integrated with FreeTech Code, our AI Code Assistant that has access to client goals and the specific deliverables defined in our portal. Because this network-layer approach doesn’t offer fine-grained, per-user access permissions within a single server, running one server per client ensures every developer can access only the client servers they work on.

Why This Works

  • Cloud Run allows you to enforce IAM-based access — meaning only authenticated users (via Google Cloud SDK) can hit the endpoint.

  • We use npx ts-node mcp_proxy.ts to create a local proxy that handles auth and forwards traffic to Cloud Run.

  • You configure your tool (Cursor, Claude Desktop, etc.) to point to http://localhost:3030 — your local proxy.

That’s it. Now your team can securely use a remote MCP server as if it’s running locally.
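For anyone curious what the proxy step involves, here is a minimal sketch of what a proxy like mcp_proxy.ts can look like. It is an illustration under stated assumptions (gcloud installed, you are logged in, and your account has roles/run.invoker on the service), not the exact code from our repo:

    // mcp_proxy.ts (illustrative sketch, not the repo's exact implementation)
    import * as http from "node:http";
    import * as https from "node:https";
    import { execSync } from "node:child_process";

    // Placeholder: set this to your Cloud Run service URL.
    const CLOUD_RUN_URL = new URL(
      process.env.CLOUD_RUN_URL ?? "https://mcp-server-xxxxxxxx-uc.a.run.app"
    );
    const PORT = 3030;

    // Ask the Google Cloud SDK for an identity token; Cloud Run's IAM layer
    // validates this token on every request.
    function identityToken(): string {
      return execSync("gcloud auth print-identity-token").toString().trim();
    }

    const server = http.createServer((req, res) => {
      const upstream = https.request(
        {
          host: CLOUD_RUN_URL.host,
          path: req.url,
          method: req.method,
          headers: {
            ...req.headers,
            host: CLOUD_RUN_URL.host,
            authorization: `Bearer ${identityToken()}`,
          },
        },
        (upstreamRes) => {
          // Stream the response straight through so SSE connections stay open.
          res.writeHead(upstreamRes.statusCode ?? 502, upstreamRes.headers);
          upstreamRes.pipe(res);
        }
      );
      upstream.on("error", () => {
        if (!res.headersSent) res.writeHead(502);
        res.end();
      });
      req.pipe(upstream); // forwards request bodies and ends the upstream call
    });

    server.listen(PORT, () => {
      console.log(`MCP proxy listening on http://localhost:${PORT}`);
    });

Run it with npx ts-node mcp_proxy.ts, point your client at http://localhost:3030, and the IAM check happens transparently on every forwarded request.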

Who Is This For?

This setup is ideal for dev teams that want a shared, reliable MCP endpoint powering their LLM workflows. Whether it’s in Cursor, Claude Desktop, or any other tool that supports SSE, this lets you use a single hosted MCP server across your team — without compromising security.

For us, this is what powers FreeTech Code, our internal developer tooling suite that wraps LLM-based coding assistance into a streamlined experience. You can read more about FreeTech Code on our blog.

Spin it up in less than 1 hour

We’ve included a ready-to-go setup in the repo if you want to try this out yourself. The instructions are dead simple: just a couple of shell edits and you’re up. GitHub: https://github.com/the-freetech-company/mcp-sse-authenticated-cloud-run

If you have questions or are a non-technical founder who would like to integrate MCP into your organization, schedule a free consultation: https://freetech.co/consultation

More Resources

Technology
FreeTech's SMS AI Chat Bot
Jun 5, 2025

DevOps
Enterprise Grade Observability
Apr 2, 2025

Engineering
Case Study: Optic Truck Works
Dec 18, 2024