
How a Healthcare Communications Firm Regained Cost Control Over Large-Scale Data Movement

Eliminating volatile data costs to enable real-time campaign optimization

  • 60% reduction in cohort delivery costs
  • Fixed, predictable pricing replaced per-run cost spikes
  • Terabyte-scale cohort data moved in days instead of months

Industry Context: The Cost of Keeping Data Current

In healthcare marketing, data is in constant motion: audience lists change, campaigns shift, and media spend requires regular tuning. Many teams are still stuck with data pipelines that make every refresh expensive and slow. Usage-based pricing, cloud fees, and warehouse compute costs force unfortunate tradeoffs: refresh less often, or pay more than planned. As datasets grow, those tradeoffs only get harder to justify.

When Data Spend Limits Campaign Performance

A healthcare communications firm was building a personalized healthcare professional (HCP) engagement platform for a pharmaceutical client—something that could adjust outreach based on real-time physician behavior. The platform concept worked, but the data infrastructure behind it didn't.

The team was sourcing large cohort-based datasets from Purple Labs and moving them into Snowflake to inform advertising strategy and media spend. They used Fivetran for data movement. Costs escalated fast. In one instance, a single ingestion effort generated a bill of roughly $24,000, forcing the team to shut it down.

And it kept happening. Fivetran's Monthly Active Row (MAR) pricing meant costs climbed every time data volumes increased, which is exactly what happens when campaigns succeed or expand. Budgeting became impossible as scaling campaigns suddenly carried real financial risk. Cloud egress fees added to the damage, chipping away at margins with every large data transfer.
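The pricing dynamic is easy to see with a back-of-the-envelope comparison. The sketch below uses invented per-row rates and volumes (not Fivetran's actual price list, and not the firm's real numbers) purely to show why usage-based spend spikes as row counts grow while a flat fee stays put.

```python
# Hypothetical illustration: monthly-active-row (MAR) billing scales with
# volume, while a flat fee does not. All rates and row counts are invented
# for the example; they are not Fivetran's or EASL's actual prices.

PER_MILLION_ROWS = 500.0    # assumed $ per million monthly active rows
FLAT_MONTHLY_FEE = 5_000.0  # assumed flat monthly platform fee

def mar_cost(active_rows: int) -> float:
    """Usage-based cost: grows linearly with rows touched in the month."""
    return active_rows / 1_000_000 * PER_MILLION_ROWS

for rows in (10_000_000, 40_000_000, 80_000_000):
    print(f"{rows:>12,} rows -> MAR: ${mar_cost(rows):>9,.0f} | flat: ${FLAT_MONTHLY_FEE:,.0f}")
```

At these made-up rates, an eightfold jump in active rows multiplies the MAR bill eightfold while the flat fee never moves, which is exactly the exposure the team faced as campaigns scaled.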

Then there was the transformation work. The firm relied on licensed tooling to clean and prepare data before it could be used, which added cost and slowed everything down. Refresh cadence took a hit. The team wanted to update cohort data regularly to keep campaigns tuned to current performance, but the existing setup made frequent refreshes financially prohibitive.

For a platform built around making timely decisions, operating on stale data was a fundamental problem.

Replacing Fragmented Tooling with Fixed Costs

The firm brought in EASL to replace the expensive, fragmented setup. EASL offered what the previous stack couldn't: fixed costs that didn't penalize frequent refreshes, and the ability to handle data transformations before the data ever hit Snowflake.

With EASL in place, the firm eliminated several cost and operational constraints and gained capabilities the old stack lacked:

  • Data consumption-based billing: EASL's pricing didn't fluctuate based on data volume. The team could run the workflow five times or fifty times per month; the cost stayed the same. That financial certainty made it possible to plan campaign budgets without worrying about surprise data bills.
  • Cloud egress fees: By optimizing the architecture, EASL drastically reduced the costs associated with moving data in and out of the cloud. Terabyte-scale cohort data could move without triggering the punitive fees that had plagued the previous setup.
  • Alteryx transformations: EASL's platform included native transformation capabilities, so the team could run their "bronze to silver" transformations (converting raw data into cleaner, more usable formats) directly within the pipeline; a minimal sketch follows this list. That shift had a cascading effect on costs: Snowflake charges for both storage and compute, and by eliminating bronze-layer storage and the compute required to run transformations inside Snowflake, the firm saved on both fronts.
  • Built-in governance and traceability: Every data movement and transformation was logged as part of the pipeline, creating a clear audit trail. For a life sciences organization, that traceability added confidence that data handling aligned with regulatory expectations without introducing additional operational overhead.
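As referenced in the transformation bullet above, here is a minimal sketch of what a "bronze to silver" cleanup step can look like when it runs in the pipeline before the load, rather than as storage and compute inside Snowflake. The pandas approach, column names, and cleaning rules are all hypothetical illustrations, not EASL's actual implementation.

```python
import pandas as pd

def bronze_to_silver(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean raw (bronze) cohort rows into an analysis-ready (silver) frame
    before loading, so the warehouse never stores the raw layer or pays
    compute to transform it. Columns and rules here are hypothetical."""
    silver = raw.copy()
    # Normalize the physician identifier and drop rows missing the join key.
    silver["npi"] = silver["npi"].astype("string").str.strip()
    silver = silver.dropna(subset=["npi"])
    # Parse timestamps; discard rows whose timestamps cannot be recovered.
    silver["observed_at"] = pd.to_datetime(silver["observed_at"], errors="coerce", utc=True)
    silver = silver.dropna(subset=["observed_at"])
    # One row per physician per cohort keeps downstream joins predictable.
    return silver.drop_duplicates(subset=["npi", "cohort_id"])

if __name__ == "__main__":
    raw = pd.DataFrame({
        "npi": [" 1234567890 ", None, "1234567890"],
        "cohort_id": ["onc-01", "onc-01", "onc-01"],
        "observed_at": ["2024-05-01T12:00:00Z", "2024-05-02T08:00:00Z", "not-a-date"],
    })
    print(bronze_to_silver(raw))  # only the fully clean row survives
```

Because only the silver output ever reaches Snowflake, the warehouse bills for one clean table instead of a raw layer plus the SQL compute to reshape it.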

Faster Refreshes and Fewer Tradeoffs 

The most immediate win was speed paired with cost certainty.

With EASL in place, the team could refresh large datasets on a cadence that better aligned with how campaigns are evaluated and adjusted. Instead of spacing updates months apart to manage cost exposure, the firm regained flexibility without risking budget overruns.

While downstream campaign performance metrics are still emerging, the foundational benefits are apparent:

  • Large-scale data movement without MAR-based cost spikes
  • Reduced Snowflake storage and compute overhead
  • Fewer operational constraints around when and how often data could be refreshed

Just as importantly, the solution consolidated work that had previously been split across multiple tools, reducing reliance on licensed transformation software as part of the broader data stack.

Why EASL?

  • Built by a team with 35+ years of experience moving data at massive scale for Banking & FinServ.
  • A full DevOps PaaS delivery, or compartmentalized modules tailored to your environment.
  • Enable your strategic resources to focus on core value-add development, or put the platform directly in their hands.

