35,000 customers signing up weekly, says Monzo
Monzo has hit two million customers, in a company milestone that has seen the challenger bank add 1,250,000 customers since last summer.
When Monzo published its 2018 annual report last June, it boasted 750,000 customers. Together, they had spent £2 billion with the online-only bank.
Just 11 months later, it has two million customers who have spent a combined £10.7 billion with the bank, using their cards 465 million times, Monzo said today.
It’s phenomenal growth for the bank, known for its coral pink cards and unusual integrations like IFTTT, and it continues to add 35,000 customers weekly.
The bank has managed it with little apparent pressure on its largely custom-built stack. So what is Monzo's infrastructure composed of? While we couldn't reach the company's CTO today, a combination of existing blog posts, interviews and job vacancy requirements gives a pretty comprehensive idea of its current stack.
Monzo Infrastructure: What’s the Bank Built On?
Monzo, which built its own back-end, has scaled up using the open source Apache Cassandra as its transactional database, with its application code written in Go.
Apache Cassandra is a non-relational database designed to run fast across multiple data centres and cloud availability zones. It was originally developed at Facebook, was open sourced in 2008, and became a top-level Apache project in 2010. Cassandra is used by companies ranging from eBay to Netflix, along with CERN and Sony PlayStation. (Among its biggest deployments is Apple's, where 75,000 nodes store over 10 PB of data.)
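Cassandra scales horizontally by hashing each row's partition key onto a ring of nodes, so every read and write for a given key lands on a predictable node. The Go sketch below is purely illustrative of that idea, not Monzo's code or a Cassandra client: FNV stands in for Cassandra's Murmur3 partitioner, and the account IDs are made up.

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// partitionFor maps a partition key (say, an account ID) onto one of n
// nodes, much as Cassandra's partitioner maps keys onto its token ring.
// FNV is an illustrative stand-in for Cassandra's Murmur3 partitioner.
func partitionFor(key string, n uint32) uint32 {
	h := fnv.New32a()
	h.Write([]byte(key))
	return h.Sum32() % n
}

func main() {
	nodes := uint32(6)
	for _, acct := range []string{"acc_001", "acc_002", "acc_003"} {
		// Every operation for one account hits the same node, so the
		// cluster scales by adding nodes rather than bigger machines.
		fmt.Printf("%s -> node %d\n", acct, partitionFor(acct, nodes))
	}
}
```

Because the mapping is deterministic, any node can compute where a row lives without a central coordinator, which is what lets the database grow across data centres.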
Monzo has a Kubernetes-based microservices architecture, and uses Apache Kafka for its asynchronous message queue, with AWS hosting most of its infrastructure and Google BigQuery providing business intelligence. The bank uses AWS CloudTrail, which provides API call history to enable security analysis, resource change tracking and compliance auditing, along with the AWS CloudHSM service for its crypto keys.
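Kafka's job in a stack like this is to decouple producers from consumers: a service publishes an event and moves on, and consumers process it at their own pace. As a minimal in-process stand-in (Go channels rather than the Kafka client, and a made-up `Event` type, not Monzo's schema), the asynchronous pattern looks like this:

```go
package main

import "fmt"

// Event is a hypothetical message type; in production this would be a
// serialized payload published to a Kafka topic.
type Event struct {
	Topic   string
	Payload string
}

func main() {
	// A buffered channel stands in for a Kafka topic: producers don't
	// block on consumers, which is the point of an async queue.
	queue := make(chan Event, 16)
	done := make(chan bool)

	// Consumer: drains events at its own pace.
	go func() {
		for ev := range queue {
			fmt.Printf("consumed %s: %s\n", ev.Topic, ev.Payload)
		}
		done <- true
	}()

	// Producer: fires and forgets.
	queue <- Event{Topic: "transactions", Payload: "card-auth #1"}
	queue <- Event{Topic: "transactions", Payload: "card-auth #2"}
	close(queue)
	<-done
}
```

Kafka adds durability, replay and fan-out across machines on top of this basic shape, which a channel cannot give you.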
Out with Linkerd, in with Envoy Proxy
Monzo's infrastructure has evolved as the bank has grown. For example, in April the company said it had dropped Linkerd as the tool powering its service mesh in favour of Envoy Proxy, an open source tool first developed at Lyft.
As the bank’s back-end engineer Suhail Patel put it: “Our microservices perform tens of thousands of RPC calls per second over HTTP. However, to make a reliable and fault tolerant distributed system, we need service discovery, automatic retries, error budgets, load balancing and circuit breaking”.
After Linkerd suffered latency and downtime issues as the bank scaled up, Monzo started looking at alternatives that would meet its remote procedure call (RPC) subsystem criteria. As Patel put it: “We looked at Linkerd 2.0, Istio, and Envoy. We eventually settled on Envoy because of its high performance capabilities, relative maturity, and wide adoption in large engineering teams and projects.”
He added: “Envoy doesn’t come with any understanding of Kubernetes out of the box. We wrote our own small control plane which would watch for changes in our Kubernetes infrastructure and push changes to Envoy via the Cluster Discovery Service API so it was aware of the new service.”
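Monzo has not published that control plane, but its core job is a reconciliation loop: diff the set of services Kubernetes reports against the clusters Envoy currently knows about, then push the difference over the Cluster Discovery Service. The function below is a hypothetical sketch of that diff step only, with made-up service names, and is not Monzo's code or the Envoy API:

```go
package main

import (
	"fmt"
	"sort"
)

// diffClusters compares the services visible in Kubernetes with the
// clusters Envoy already knows about, returning what a control plane
// would need to add and remove via the Cluster Discovery Service.
func diffClusters(k8sServices, envoyClusters map[string]bool) (add, remove []string) {
	for svc := range k8sServices {
		if !envoyClusters[svc] {
			add = append(add, svc)
		}
	}
	for c := range envoyClusters {
		if !k8sServices[c] {
			remove = append(remove, c)
		}
	}
	// Sort for deterministic output; map iteration order is random.
	sort.Strings(add)
	sort.Strings(remove)
	return add, remove
}

func main() {
	k8s := map[string]bool{"service.account": true, "service.payment": true}
	envoy := map[string]bool{"service.account": true, "service.legacy": true}

	add, remove := diffClusters(k8s, envoy)
	fmt.Println("push to Envoy:", add)     // push to Envoy: [service.payment]
	fmt.Println("drop from Envoy:", remove) // drop from Envoy: [service.legacy]
}
```

In the real system the "watch for changes" half would come from the Kubernetes API's watch endpoints, with the diff pushed to Envoy as typed cluster resources rather than strings.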
Growth Means Jobs…
With the expansion the company is recruiting heavily, with current openings including four data scientist roles, 11 design roles and 17 engineering roles, among others. (We counted 81 jobs currently advertised.)
It is also seeking an offensive security analyst and other red teamers with experience with financial services engagements and threat hunting.
Among the bank’s biggest challenges has been reducing cost-per-account. This is dominated by customer service, which as of June 2019 accounted for roughly two-thirds (66.67 percent) of the operational cost of every account opened at the bank.
Loss-making Monzo has pledged to cut this figure sharply using smarter automation tools: it will be closely watched in the bank's 2019 report.