Heroku Integration Capabilities: The Mini Guide

Heroku Connect, Apache Kafka on Heroku, MuleSoft’s Kafka Connector, Streaming Data Connectors, Heroku Flow and more.

Dave Norris
Geek Culture


In a previous article we discussed the interconnected pieces that make up the Salesforce Platform and summarised the APIs and capabilities.

In this post we’ll double click on Heroku to look at its integration capabilities in more depth.

Heroku APIs and Key Integration Capabilities

Heroku Connect

Heroku Connect makes it easy for you to build Heroku Apps that share data with your Salesforce deployment.

There are 2 core capabilities of Heroku Connect.

1. You can replicate data bi-directionally between the Salesforce Lightning Platform and a managed Heroku Postgres database.

For example, if you create an application using Ruby on Rails and host it on Heroku you can take data you’re capturing in that application and replicate it into the Lightning Platform (and vice versa) where you can add declarative workflows and processes for your front and back offices.

This facilitates use cases that combine the capabilities of the Lightning Platform and Heroku without having to write code.

2. Keep the data stored in your Heroku Postgres database but create, read, update and delete it directly from Salesforce. Heroku External Objects makes tables in your Postgres database available via an OData provider with simple point-and-click setup and configuration.

The steps needed are pretty simple.

  1. Create a Heroku App and provision the Add-Ons for Heroku Postgres and Heroku Connect
  2. Connect to your core Salesforce deployment and authorise the connection
  3. Choose the standard and custom objects, and the fields within them to sync with your Postgres database
  4. Optionally choose tables in your Postgres database to make available via an OData provider

Tables in your Heroku Postgres database appear as External Objects in the Lightning Platform, maintaining a user experience largely indistinguishable from data stored on the platform.

This facilitates use cases that enable virtualisation of data across system boundaries. Users can interact with the data without the integration challenges associated with replicating it across two different databases.
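To make this concrete, here is a minimal sketch of what querying a synced object from a Heroku app looks like. It assumes the Contact object has been mapped by Heroku Connect (which places synced tables in the salesforce schema by default) and uses the DATABASE_URL config var provided by Heroku Postgres:

```python
import os

import psycopg2  # pip install psycopg2-binary

# DATABASE_URL is set automatically by the Heroku Postgres add-on.
conn = psycopg2.connect(os.environ["DATABASE_URL"])

with conn, conn.cursor() as cur:
    # Heroku Connect maps synced objects into the "salesforce" schema by
    # default; this assumes the Contact object and these fields were synced.
    cur.execute(
        "SELECT firstname, lastname, email FROM salesforce.contact LIMIT 10"
    )
    for firstname, lastname, email in cur.fetchall():
        print(firstname, lastname, email)

conn.close()
```

Because the sync is bi-directional, an ordinary INSERT or UPDATE against the same table flows back to the Lightning Platform on the next sync cycle, provided the mapping is configured for read-write.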

Apache Kafka on Heroku

Apache Kafka on Heroku is an add-on that provides Kafka as a service with full integration into the Heroku platform.

Kafka provides the messaging backbone for building distributed applications capable of handling billions of events and millions of transactions, and is designed to move large volumes of ephemeral data with a high degree of reliability and fault tolerance.

Kafka resources are distributed across network zones for fault-tolerance and different plans cater for highly secure use cases — offering up to 8 Kafka brokers, minimum data retention of 6 weeks and HIPAA compliance.

Apache Kafka on Heroku lets you build a distributed, scalable, secure Kafka messaging backbone in minutes.
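To give a flavour of the developer experience, here is a minimal producer sketch using the kafka-python client. The KAFKA_URL and SSL certificate config vars are what the add-on provides; the sketch assumes those certificates have already been written out to local files, and the topic name is hypothetical:

```python
import json
import os

from kafka import KafkaProducer  # pip install kafka-python

# KAFKA_URL looks like "kafka+ssl://host1:9096,kafka+ssl://host2:9096".
brokers = [
    url.replace("kafka+ssl://", "")
    for url in os.environ["KAFKA_URL"].split(",")
]

# Assumes KAFKA_TRUSTED_CERT, KAFKA_CLIENT_CERT and KAFKA_CLIENT_CERT_KEY
# have already been written to these local files.
producer = KafkaProducer(
    bootstrap_servers=brokers,
    security_protocol="SSL",
    ssl_cafile="trusted.pem",
    ssl_certfile="client.pem",
    ssl_keyfile="client.key",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# "events" is a hypothetical topic name; on multi-tenant plans, topic
# names carry the KAFKA_PREFIX config var as a prefix.
producer.send("events", {"order_id": 42, "status": "shipped"})
producer.flush()
```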

MuleSoft’s Apache Kafka Connector

Connecting events in Kafka on Heroku to the other Salesforce jigsaw pieces can be challenging. There is currently no direct way to subscribe to Kafka events from the Lightning Platform or Marketing Cloud, which leaves a grey area requiring some code.

To bridge this gap, MuleSoft provides an Apache Kafka connector that can connect many different types of applications, including the Salesforce jigsaw pieces.

MuleSoft has connectors that can bridge between systems whilst monitoring those APIs.

Creating these types of integrations can be done using a free trial of MuleSoft’s Anypoint Platform and Anypoint Studio.

Streaming Data Connectors

Streaming Data Connectors make Change Data Capture (CDC) possible on Heroku with minimal effort. Anyone with a Private or Shield Space, as well as a Postgres and an Apache Kafka add-on in that space, can use Streaming Data Connectors at no additional charge.

Build streaming data pipelines between Salesforce and external stores like a Snowflake data lake or an AWS Kinesis queue for integration with other data sources. Refactor monoliths into microservices, implement an event-based architecture, archive data in lower-cost storage services, and more.

Another use of connectors is to build a unified event feed from data in multiple Salesforce and Work.com orgs, providing a centralised Kafka-based event bus for taking action on all org activity. This is possible because Heroku Connect provides an easy-to-set-up bi-directional sync and stores that data in a Heroku Postgres database.
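Under the hood, Streaming Data Connectors are built on Debezium and Kafka Connect, so each change arrives on the Kafka topic as a JSON document describing a row-level insert, update or delete. Here is a minimal consumer sketch, assuming a hypothetical topic name, the same SSL setup as the producer example above, and the default Debezium-style event envelope:

```python
import json
import os

from kafka import KafkaConsumer  # pip install kafka-python

brokers = [
    url.replace("kafka+ssl://", "")
    for url in os.environ["KAFKA_URL"].split(",")
]

# The topic name is chosen when the connector is created; this one is
# hypothetical, following a <connector>.<schema>.<table> naming pattern.
consumer = KafkaConsumer(
    "my_connector.public.orders",
    bootstrap_servers=brokers,
    security_protocol="SSL",
    ssl_cafile="trusted.pem",
    ssl_certfile="client.pem",
    ssl_keyfile="client.key",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    payload = message.value["payload"]
    op = payload["op"]        # "c" = create, "u" = update, "d" = delete
    after = payload["after"]  # the new row state (None for deletes)
    print(op, after)
```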

Heroku Flow

Heroku Flow supports an optimised development release path. Let’s walk through a typical one to highlight key Heroku capabilities.

Heroku Pipelines lets you set up stages for your software development lifecycle: review, dev, staging and prod. Whilst you can promote code from one stage to the next manually, you can also connect your pipeline to a GitHub repository and automate the process.

So let’s pretend we’re a new developer on a team and take a look at what the flow would look like.

Step 1: First I would check out the main branch.

Step 2: Then I would create a feature branch for a change I wanted to make.

Step 3: I would test this change locally and then commit and push my change back to GitHub.

Step 4: Then I would create a pull request. At this point the Heroku Pipeline picks up the pull request and automatically creates a Review App. Review Apps are used to propose, discuss, and decide whether or not to merge changes to your code base. The pipeline can also optionally run Heroku CI, a low-configuration test runner that executes your test scripts with zero queue time, using apps that have strong parity with your staging and production versions.
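Heroku CI reads its configuration from an app.json manifest in the root of the repository. A minimal sketch, assuming a Python app whose tests run with pytest; the in-dyno Postgres plan gives each test run its own throwaway database:

```json
{
  "environments": {
    "test": {
      "addons": ["heroku-postgresql:in-dyno"],
      "scripts": {
        "test": "pytest"
      }
    }
  }
}
```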

Step 5: Each subsequent commit pushed to GitHub triggers a fresh Review App build.

Step 6: Next up is typically a code review. With Heroku ChatOps developers can keep track of code changes. Pull request notifications, merges and CI builds all show up in Slack.

Step 7: Assuming the review is okay and my CI tests are passing, I then merge my updates back into main. This fires another automated build that creates a staging application, allowing further testing prior to promotion to production.

Step 8: Finally, the development team can promote the code to production. Using Heroku’s Release Phase, developers can run tasks before a release moves to production, eliminating maintenance windows and reducing deployment risk: migrate a database, upload assets to a CDN, invalidate a cache, or run any other task your app needs.
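Release Phase itself is driven by a release entry in the app’s Procfile, which runs before the new release is promoted to the web dynos. A minimal sketch for a hypothetical Django app, where migrations run ahead of every deploy:

```
release: python manage.py migrate
web: gunicorn myapp.wsgi
```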

The promotion can be manual using the Heroku dashboard and Heroku Pipelines or completed through Heroku ChatOps using Slack commands.

The goal of Heroku Flow is to make deploying high quality changes and monitoring them as easy as possible.

Elements Marketplace

The Heroku Elements Marketplace offers easily-integrated technical solutions that support multiple stages of app development and operation. Some elements are created and managed by Heroku, while others are contributed by ecosystem partners, open source communities, or individual developers.

Use add-ons like CloudAMQP to automate the setup, scaling and running of RabbitMQ clusters, giving your developers in-order, no-duplicate message guarantees and high availability baked in. Or use PubNub to host real-time APIs on its robust Data Stream Network, building engaging remote experiences with in-app chat, push notifications, location tracking and more.
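As a sketch of how little glue code an add-on typically needs: attaching CloudAMQP sets a CLOUDAMQP_URL config var, and a worker can publish to RabbitMQ through the pika client (the queue name here is hypothetical):

```python
import os

import pika  # pip install pika

# CLOUDAMQP_URL is set automatically when the CloudAMQP add-on is attached.
params = pika.URLParameters(os.environ["CLOUDAMQP_URL"])
connection = pika.BlockingConnection(params)
channel = connection.channel()

# "tasks" is a hypothetical queue name for this sketch.
channel.queue_declare(queue="tasks", durable=True)
channel.basic_publish(exchange="", routing_key="tasks", body=b"resize-image-42")

connection.close()
```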

The Heroku Elements Marketplace lets you focus on building applications instead of running and operating complex integration infrastructure.

Summary

For custom applications in your enterprise, or net-new applications, that require the full elasticity of B2C scale and the use of existing, modern programming languages like Go, Scala, Ruby, Python, Node.js and more, Salesforce provides the ability to host these applications and their corresponding data services on fully managed infrastructure. Being able to leverage your existing developer skill set and scale vertically and horizontally on demand opens up new integration use cases. Integration services that run a fully managed Kafka instance or bi-directionally sync data to the Lightning Platform make data exchange scalable and easy to set up.
