
Data Engineer at WeTransfer (Amsterdam, Netherlands)

Every day, millions of people rely on WeTransfer to share their creative ideas. So, the work we do matters. Come and be a part of it. 


We are currently building a User Growth & Analytics team to support our strategies and ideas with crystal-clear data, so that our decisions are grounded in our users and made for the right reasons.


As a Data Engineer, you will have the opportunity to build robust, scalable data pipelines that handle billions of rows of data. You will collaborate with data scientists, product managers, and engineers to turn raw data into actionable insights.


What better way to show what it's like to work here than to ask Gabor, our Head of User Growth and Analytics, to explain what his week looks like?


Monday


Mondays are all about starting the week off strong. The User Growth & Analytics team is responsible for the entire funnel, from brainstorming ideas to kicking off experiments. Next to number crunching, Mondays are also for our weekly brainstorm sessions, where we come up with new experiment ideas.


We also give weekly updates to the Management Team on key growth metrics, to track how we're performing relative to our annual targets.


Tuesday


We typically prioritise new experiments early in the week, and they tend to fall on Tuesdays. We clearly hash out the problem we're trying to solve and start to discuss, in a multidisciplinary setting, how we can go about solving it.


While we are jump-starting new initiatives, we are also constantly adding new ideas to our idea pipeline as inspiration strikes; this pipeline houses potential experiments to be run. Some questions we consider when adding ideas are: how do we improve the virality of our offering? How can we increase the branching factor? How can we better convert receivers of transfers into active senders?


Wednesday


Once we have a set of growth experiments selected, we feed them into our Growth Experiment Flow. This involves asking the right questions up front and analysing the data that will form the basis of our hypotheses. Since we touch a lot of the product's features, we need strong alignment among everyone involved: people in Design, Tech, Customer Support, and Marketing.


I really thrive on being in a design-led business, surrounded by creative minds and opportunities to build our community. Our initiatives are driven and informed by the insights we collect from our experiments, and it's great to see our work influence the direction of our product.


Thursday


Every Thursday morning we have a company-wide meeting over breakfast, where we briefly highlight and discuss the latest happenings in the different teams along with a couple of key company goals that have been set for the year. For growth, we discuss how our monthly active users have trended over the past month relative to the same period the year prior. We add some commentary on initiatives we've got running at the moment, a few experiments we've got in the pipeline, and the high-level takeaways from last week's projects.


Aside from the breakfast meeting, we make sure to keep the trains running full steam ahead. That can range anywhere from clearing blockers that are holding experiments back from launching to continuing to optimise our process within the team, from ideation to pushing initiatives out the door, so we can keep boosting our experimentation velocity.


Friday


If there's such a thing as tying up loose ends, we do our best to close the week in a tidy way so that we can pick up right where we left off on Monday. We have a get-together where we take note of the latest status of existing experiments and those that launched that week. The key events of the week are jotted down in our event log to make sure we can best attribute changes to our core growth metrics going forward.


We are looking for someone with:



  • Experience in dealing with large amounts of data (over a billion rows).

  • Advanced degree in a quantitative field.

  • Special interest in making things robust and fault tolerant.

  • A passion for automating things.

  • Deep understanding of database design and SQL.

  • Strong programming skills (Python preferred).

  • Hands-on experience with AWS or similar (EC2, S3, Lambda, etc).

  • Hands-on experience with analytical databases (Redshift or similar).


Even better if you have:



  • Experience with Airflow, Google Analytics, event data processing, Metabase (or other BI tools).


via developer jobs - Stack Overflow
 
