Azure Service Bus offers a relay service. Azure Relay enables you to expose your on-premises services to the cloud.
This is possible without opening a firewall port or making any other network configuration changes. That is how Azure Relay differs from traditional network-level integration technologies such as VPN.
Azure Relay can scope network traffic to a single endpoint or a single application. A VPN, on the other hand, is a network-level configuration that multiple applications can use.
In this article, we will quickly look at some of the basic concepts and a high-level overview of the relay service.
To understand the basic flow, consider the scenario for which the steps are given below: two on-premises applications (a client and a service) are connected through the relay service.
Service Listens at a Particular Address
Initially, the on-premises service connects to the relay service through an outbound port and creates a bi-directional socket for communication, tied to a particular address.
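The outbound connection the listener opens can be sketched as a URL. This follows the published Hybrid Connections protocol (the `$hc` path segment and the `sb-hc-action`/`sb-hc-token` query parameters); the namespace, path, and token values here are placeholders, not real credentials.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def listener_url(namespace: str, hc_path: str, token: str) -> str:
    # Registration URL shape per the Hybrid Connections protocol:
    # "$hc" addresses the Hybrid Connection endpoint, and
    # sb-hc-action=listen requests the listener role.
    query = urlencode({"sb-hc-action": "listen", "sb-hc-token": token})
    return f"wss://{namespace}.servicebus.windows.net/$hc/{hc_path}?{query}"

# Hypothetical namespace and connection name for illustration.
url = listener_url("contoso", "orders", "placeholder-sas-token")
parsed = urlparse(url)
params = parse_qs(parsed.query)
```

A real listener would open a WebSocket to this URL and keep it alive as the control channel described below.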
Client Interacts with the Service Through the Relay
The client can then communicate with the on-premises service by sending traffic to the relay service targeting that address.
The relay service then relays data to the on-premises service through the bi-directional socket dedicated to the client.
No Firewall Configuration Needed
The client doesn’t need a direct connection to the on-premises service. It doesn’t need to know the location of the service. And, the on-premises service doesn’t need any inbound ports open on the firewall.
The relay service has two features:
- Hybrid Connections – uses open-standard WebSockets and can therefore be used in multi-platform integration scenarios.
- WCF Relays – uses WCF to enable remote procedure calls.
Hybrid Connections enables bi-directional, binary stream communication and simple datagram flow between two networked applications. Either or both parties can reside behind NATs or firewalls.
The Hybrid Connections service endpoint allows for relaying WebSocket connections and HTTP(S) requests and responses.
The programs on both sides of the relay service are called clients, as they are both clients of the relay service. The client that waits for and accepts connections is called the listener (listener role), while the client that initiates a connection to the listener via the relay service is called the sender (sender role).
There are five types of interactions for the listener:
To indicate to the service that it is ready to accept connections, the listener creates an outbound WebSocket connection. The connection handshake carries the name of a Hybrid Connection configured on the Relay namespace, and a security token that confers the “Listen” right on that name.
When the WebSocket is accepted by the service, the registration is complete and the established WebSocket is kept alive as the “control channel” for enabling all subsequent interactions.
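The security token mentioned above is a standard Service Bus shared access signature (SAS): an HMAC-SHA256 signature over the URL-encoded resource URI and an expiry timestamp. Here is a minimal sketch using only the standard library; the resource URI, rule name, and key are hypothetical placeholders.

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote_plus

def generate_sas_token(resource_uri: str, key_name: str, key: str,
                       ttl_seconds: int = 3600) -> str:
    # Service Bus SAS scheme: sign "<url-encoded-uri>\n<expiry>"
    # with the shared access rule's key using HMAC-SHA256.
    expiry = str(int(time.time()) + ttl_seconds)
    string_to_sign = quote_plus(resource_uri) + "\n" + expiry
    digest = hmac.new(key.encode("utf-8"),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    signature = quote_plus(base64.b64encode(digest).decode("utf-8"))
    return (f"SharedAccessSignature sr={quote_plus(resource_uri)}"
            f"&sig={signature}&se={expiry}&skn={key_name}")

token = generate_sas_token(
    "http://contoso.servicebus.windows.net/orders",  # hypothetical resource
    "listenPolicy",                                  # hypothetical rule name
    "not-a-real-key")
```

A token signed with a rule that holds the “Listen” right authorizes the registration; a rule with the “Send” right authorizes the sender operations described later.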
When a sender opens a new connection on the service, the service chooses and notifies one of the active listeners on the Hybrid Connection. This notification is sent to the listener over the open control channel as a JSON message. The message contains the URL of the WebSocket endpoint that the listener must connect to for accepting the connection.
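Handling that notification amounts to parsing a small JSON message and dialing the address it carries. The message shape below follows the published Hybrid Connections protocol guide (an `accept` object with an `address` field), but the sample is illustrative, not captured from a live service.

```python
import json

def rendezvous_address(control_message: str) -> str:
    # The 'accept' notification arrives as JSON on the control channel;
    # the address field is the rendezvous WebSocket endpoint to dial.
    return json.loads(control_message)["accept"]["address"]

# Illustrative message; field values are placeholders.
sample = json.dumps({
    "accept": {
        "address": "wss://g1-example.servicebus.windows.net/$hc/orders",
        "id": "request-id-example",
        "connectHeaders": {}
    }
})
addr = rendezvous_address(sample)
```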
In addition to WebSocket connections, the listener can also receive HTTP request frames from a sender, if this capability is explicitly enabled on the Hybrid Connection.
A security token is used to register the listener. When this token expires, the control channel is dropped as well.
The “renew” operation is a JSON message that the listener can send to replace the token associated with the control channel, so that the control channel can be maintained for extended periods.
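Constructing the renew message is straightforward. The `renewToken` shape follows the protocol guide; the token value is a placeholder.

```python
import json

def renew_message(new_token: str) -> str:
    # 'renewToken' control-channel message; the listener sends this
    # before the token currently bound to the channel expires.
    return json.dumps({"renewToken": {"token": new_token}})

parsed = json.loads(renew_message("fresh-placeholder-token"))
```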
If the control channel is idle for a long duration, intermediaries such as NATs and load balancers may drop the TCP connection. The ping message keeps the channel alive through these intermediaries. If a ping fails, the listener should reconnect to create a new control channel.
The “connect” operation opens a WebSocket on the service, providing the name of the Hybrid Connection and (optionally, but required by default) a security token conferring the “Send” permission in the query string. The relay service then interacts with the listener to create a common connection that is joined using this WebSocket. After the WebSocket is accepted, all further interactions on it are with the connected listener.
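The sender’s connect URL has the same shape as the listener’s registration URL, differing only in the `sb-hc-action` value. As before, the names and token below are placeholders.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def sender_url(namespace: str, hc_path: str, token: str) -> str:
    # Same endpoint shape as the listener's registration URL, but
    # sb-hc-action=connect requests the sender role.
    query = urlencode({"sb-hc-action": "connect", "sb-hc-token": token})
    return f"wss://{namespace}.servicebus.windows.net/$hc/{hc_path}?{query}"

params = parse_qs(
    urlparse(sender_url("contoso", "orders", "placeholder-sas-token")).query)
```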
For Hybrid Connections on which this capability has been enabled, the sender can send largely unrestricted HTTP requests to listeners.
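As a sketch of the sender’s HTTP path: requests go to the Hybrid Connection’s HTTPS endpoint, and per the Hybrid Connections HTTP protocol documentation the SAS token can travel in a `ServiceBusAuthorization` header, leaving the ordinary `Authorization` header available end to end. Treat that header placement as an assumption to verify against the current docs; the namespace, path, and token are placeholders, and the request is built but not sent.

```python
import urllib.request

def build_relay_request(namespace: str, hc_path: str,
                        token: str) -> urllib.request.Request:
    # Request to the Hybrid Connection's HTTPS endpoint; the SAS token
    # is carried in the ServiceBusAuthorization header (assumption per
    # the Hybrid Connections HTTP protocol docs).
    url = f"https://{namespace}.servicebus.windows.net/{hc_path}"
    req = urllib.request.Request(url, method="GET")
    req.add_header("ServiceBusAuthorization", token)
    return req

req = build_relay_request("contoso", "orders", "placeholder-sas-token")
headers = {k.lower(): v for k, v in req.header_items()}
```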
The WCF Relays feature enables you to securely expose WCF services that reside within a corporate enterprise network to the public cloud, without having to open a firewall connection or make intrusive changes to the corporate network infrastructure.
Azure Relay enables you to host WCF services within your existing enterprise environment. You can then delegate listening for incoming sessions and requests to these WCF services to the relay service running within Azure.
This enables you to expose these services to application code running in Azure, or to mobile workers or extranet partner environments. Relay enables you to securely control who can access these services at a fine-grained level.
In this article, we have discussed some of the basic concepts of the Azure relay service. I hope it gave you an idea of what the relay service is and how it can be used. Please do comment and let me know your thoughts and experiences.