Filum is a small but ambitious tech startup building a Business Data Platform (BDP) that helps companies make data-driven business decisions. Notably, a Customer Data Platform (CDP) is one of the built-in applications in our platform. Filum is headquartered in Silicon Valley with engineering and operations teams in Vietnam.


To sustain Filum's growth, we are building a data platform that orchestrates a fleet of services handling various federated business services with very low latency and high reliability.

Here are some examples of the problems we tackle and our ongoing projects:

– Consolidating data files from various data sources

– Building data architecture

– Real-time data visualization

– Federating business use cases by leveraging correlations among service consumers

What we are looking for

  • Strong programming skills (mostly in Java, Python, or Scala, but experience with these specific languages is not mandatory)

  • Ability to design and architect a solution (functional and technical specification) before implementing it

  • A drive to define and deliver high-quality work: tests, documentation, best practices, maintainability, and extensibility (strongly important)

  • Ability to architect server-based or serverless solutions (ideally with cloud services such as AWS or GCP)

  • Agile team experience

  • Experience with a serverless environment

  • Strong knowledge of AWS serverless services (Lambda, DynamoDB, SNS, SQS, …) or the equivalent GCP services

  • Familiarity with Infrastructure as Code tools (Terraform, CloudFormation, Ansible, Serverless Framework, …)

  • Experience with SQL/NoSQL databases

  • Big Data experience: Hadoop ecosystem, Spark, Hive, Airflow, Oozie, …

  • API development/design/architecture (Java, Scala/Akka, Python)

  • Backend development background (any language, but knowing Python, Java, or Scala is a big plus)

  • Nice to have: experience with monitoring services for data pipeline processing

Where we are going

  • Model a Data Warehouse / Data Lake that fits general purposes while answering specific business use cases: #normalizing, #relational, #objectOriented, #denormalizing

  • Define general and business-focused schemas for data objects

  • Design and implement real-time/batch distributed data pipelines for analytics & product: #consolidating, #synchronizing, #scheduling

  • Implement best practices focused on enterprise data architecture (on-premise, full cloud, hybrid)

  • Build a fully serverless, scalable, and robust infrastructure that allows the solution to scale smoothly to the next level

What you can expect

  • Silicon Valley-style work culture: move fast, break things, make mistakes, learn quickly, and grow  
  • Extremely data-driven and process-driven in everything the company does
  • Opportunity to learn and take on additional roles within the engineering team, such as research, analytics, automated testing, business development, product management, DevOps, …
  • Competitive compensation package with generous stock option grants
  • Open, direct, collaborative, and no-politics working environment
  • Dealing with many challenging engineering problems to build, scale, and secure the data platform

To apply to this job, send your updated resume to (or if you know him already, give him a call 😉 )

