Building a successful tech company requires skill, dedication and perseverance, but ultimately, no startup can make it big without having the right technical foundation in place. The ability to integrate new tools into existing tech stacks influences how quickly a company can scale — and staying on top of those new technologies is critical as demand for a product grows. To get a better idea of the tools LA startups tinker with, we asked three locals about the technologies they use and how that tech helps them overcome challenges.
Founded by prolific tech entrepreneur Scott Painter, Fair’s automotive fintech app utilizes a number of technologies to personalize each user’s experience. According to Mason McLead, the company's VP of engineering, these tools were adopted strategically to allow engineers to make an impact without having to be fluent in several different coding languages.
What technologies does Fair's tech team work with?
Keeping our apps native was a priority from the very beginning. At the time, React Native was the popular option, and several successful companies had invested in the technology, so it was very tempting for us. Ultimately, we felt that staying native was better for animation support and offered more complete development and testing tools.
For APIs, we quickly converted to gRPC with Protocol Buffers to better define interfaces and request messages, and to customize data types. This allows our mobile engineers to collaborate with backend engineers on shared specs for the API, which automatically generate native clients for each system. This gives us consistency across our 100 or so services and clients.
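As a rough illustration of what a shared spec looks like (the service, messages and field names here are hypothetical, not Fair's actual API), a Protocol Buffers definition fixes the interface and data types in one place, and gRPC tooling generates matching clients and server stubs for each language:

```protobuf
syntax = "proto3";

package pricing;

// Hypothetical request message with explicit, typed fields.
message QuoteRequest {
  string vehicle_id = 1;
  int32 term_months = 2;
}

message QuoteResponse {
  // Money as integer cents avoids floating-point drift between clients.
  int64 monthly_payment_cents = 1;
}

// The backend stub and the Swift, Kotlin, Ruby, Python or Go clients
// are all generated from this single definition, which is how a shared
// spec keeps many services and clients consistent.
service QuoteService {
  rpc GetQuote(QuoteRequest) returns (QuoteResponse);
}
```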
The languages we chose have very specific purposes as well. Ruby has a substantial community with support for almost any third-party integration you could want, whether that's describing customer behavior, ORM support, building quick admin apps or creating DSLs. We chose Python for our data science stack because its wealth of libraries and frameworks lets us create powerful machine learning models, data munging pipelines and processing/enriching services that operate efficiently at scale.
Kubernetes has been a pivotal technology for our systems architecture. It allows any engineer to create new services, deploy them and manage them self-sufficiently. With built-in secrets management, live application configuration updates, autoscaling, service discovery, load balancing, deployment strategies and so much more, we’ve been able to support our entire engineering team and our clients with only three platform engineers.
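Two of the features mentioned above, secrets management and autoscaling, are declared directly in a service's manifest. A minimal sketch (service name, image and thresholds are all illustrative, not Fair's actual configuration):

```yaml
# Hypothetical service deployment; names and values are illustrative.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: quote-service
spec:
  replicas: 2
  selector:
    matchLabels: {app: quote-service}
  template:
    metadata:
      labels: {app: quote-service}
    spec:
      containers:
        - name: quote-service
          image: registry.example.com/quote-service:1.4.2
          env:
            # Built-in secrets management: the value never lives in the repo.
            - name: DB_PASSWORD
              valueFrom:
                secretKeyRef: {name: quote-db, key: password}
---
# Autoscaling is declarative too: scale on CPU between 2 and 20 pods.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: quote-service
spec:
  scaleTargetRef: {apiVersion: apps/v1, kind: Deployment, name: quote-service}
  minReplicas: 2
  maxReplicas: 20
  metrics:
    - type: Resource
      resource:
        name: cpu
        target: {type: Utilization, averageUtilization: 70}
```

Because all of this lives in version-controlled manifests, an engineer can ship and scale a new service without handing the work to a platform team.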
Working with a mobile-only customer experience has meant that the traditional ‘full-stack developer’ doesn’t exist for us.”
What skills are needed to work with those technologies?
On the backend, we have a very specific way of building services and a unified way of defining APIs through gRPC and Protocol Buffers. This lets everyone focus on the actual problem domain instead of having to overcome technology learning curves. In order to produce new features on our platform, you need to know Ruby, Python or Go, and then get used to gRPC. After that, you’ve got all the tools you need to start shipping code to production.
For the front-end, we hold a high standard for visual excellence and user experience. Being native on both platforms means we use each platform's standard toolset. We do have a few custom architectures that we employ, but they make it easier to develop an amazing user experience.
What’s one interesting tech challenge you’re working to overcome?
Working with a mobile-only customer experience has meant that the traditional ‘full-stack developer’ doesn’t exist for us. In order to keep the front-end and backend apps working together, we’ve created a new way to define, configure and dynamically order distinct user flows. We call it AppModules and, while a work in progress, it has been a great way to encapsulate complicated logic into the backend and still allow the front-end to be dynamic based on that logic. In essence, it’s a system where the client shares what AppModules it supports and the backend dynamically guides the app through its supported experience. We can control the configuration of a particular flow based on a user’s assignment to a cohort, additional data or a tell-tale user behavior.
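Fair's AppModules system isn't public, but the negotiation McLead describes — the client advertises the modules it supports, and the backend returns an ordered flow based on the user's cohort — can be sketched roughly like this (module names, cohorts and rules are all hypothetical):

```python
# Rough sketch of a client/backend flow negotiation in the spirit of
# AppModules; module names and cohorts are hypothetical.

# Backend-side flow definitions, keyed by cohort.
FLOWS = {
    "default":   ["browse", "prequalify", "checkout"],
    "returning": ["browse", "checkout"],  # skip prequalification
}

def next_modules(supported: set, cohort: str) -> list:
    """Return the ordered flow for this user's cohort, limited to the
    modules the client app actually supports."""
    flow = FLOWS.get(cohort, FLOWS["default"])
    return [m for m in flow if m in supported]

# An older app build that doesn't support "prequalify" still gets a
# valid, dynamically ordered experience from the same backend logic.
print(next_modules({"browse", "checkout"}, "default"))
# → ['browse', 'checkout']
```

The complicated logic (which cohort sees which flow) stays on the backend, while the front-end only has to declare what it can render — which is the encapsulation the quote describes.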
Headquartered in Pasadena, adtech giant OpenX provides publishers and advertisers a suite of technologies that deliver insights across all screens. According to Distinguished Engineer Anthony Molinaro, the volume of inbound requests necessitates that his team leans on a variety of tools to handle that massive demand.
What technologies does the OpenX tech team work with?
The programming languages our systems are written in include Elixir, Erlang, Go, Java and Python, as well as a handful of others. Our data processing is done using Hadoop and tools from its toolchain. We use numerous databases, including Postgres, MySQL, Riak, HBase and Redis. All these technologies tend to communicate with each other via Thrift.
The strategic decisions we made in choosing our technologies have really helped our growth and scalability.”
What’s one interesting tech challenge you’re working to overcome?
We’re focused on scale. A solution that works for a few dozen requests a second doesn't necessarily scale to the millions of requests a second we often see. Satisfying those requests in a few fractions of a second is also incredibly challenging. Finding and fixing issues within these systems is some of the most interesting work a developer does.
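A back-of-envelope calculation (the numbers here are illustrative, not OpenX's actual traffic figures) shows why volume changes the shape of the problem. By Little's law, the average number of requests in flight equals arrival rate times latency:

```python
# Little's law: average in-flight requests = arrival rate x latency.
# The figures below are illustrative, not OpenX's actual traffic.

def in_flight(requests_per_sec: float, latency_sec: float) -> float:
    return requests_per_sec * latency_sec

# A few dozen requests/sec at 100 ms latency: ~5 concurrent requests,
# comfortably handled by a single small server.
print(in_flight(50, 0.100))        # → 5.0

# A million requests/sec at the same latency: 100,000 requests in
# flight at once, which forces very different runtime and
# architecture choices.
print(in_flight(1_000_000, 0.100)) # → 100000.0
```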
What makes your stack equipped to take on those challenges?
Our stack was built by engineers with decades of experience working with these sorts of systems. The strategic decisions we made in choosing our technologies have really helped our growth and scalability. The use of Erlang and Java in our delivery systems — and the early decision to build a service-oriented architecture with Thrift — has helped us to scale. Erlang excels at delivery with low-latency responses, and Java’s ability to process efficiently via technologies like Just-In-Time compilation has helped immensely. Using a standard data processing system (Hadoop) with several hooks for customization has allowed us to continually find new solutions as we've scaled to support greater numbers of events.
As the tech arm of a century-old business, ClubLabs makes it easy for AAA members to access their accounts wherever they are. According to Senior HR Consultant Colin Tew, artificial intelligence is starting to play a bigger role as the company grows.
What technologies does the ClubLabs tech team work with?
We currently operate over 10 different teams that actively work on a number of different products and services for the larger AAA organization. We work across web UI, iOS apps, Android apps, business logic layers and web services. Some themes we’re quite excited about have been around web and mobile technologies and the implementation of AI — both internally and externally — for the organization.
As a high-level overview, our tech stack uses Java 7/8, Objective-C, Swift, Ruby, Node.js, React, C#, .NET, Mockito, JUnit, Robolectric, RxJava, Gradle, Retrofit, XSL, XSLT, APIM, APIC, SQL Server, DataPower, AWS, PCF, ODM and IIB.
What makes these technologies the right fit for ClubLabs?
Our mobile apps give members easy access to all the features and benefits of their membership on a personalized dashboard. Mobile is a great way to reach our users and gives them a far easier way to interact with AAA than physically going to a branch. Additionally, we feel AI is mandatory to preserve the company’s legacy of knowledge and expertise. We’re hopeful that AI will be a viable tool to continue providing support for our Auto Club members for the next 100 years.
We feel AI is mandatory to preserve the company’s legacy of knowledge and expertise. We’re hopeful that AI will be a viable tool to continue providing support for our Auto Club members for the next 100 years.”
What are the most interesting tech challenges you're working to overcome?
Staying ahead of the curve with new Android releases and the breadth of devices that we can support is something we’re always conscious of. Being able to keep up with Swift’s new capabilities, adapting to changes that new iOS versions provide and making our current and future systems more flexible, scalable and reliable are all things we’re working on.
The Auto Club has multiple lines of business, so in order to offer each user multiple services, integrating information is mandatory. AI is good at integrating large volumes of different types of data. Learning these new technologies and quickly adapting them to support our members is a crucial challenge we face.