Data Engineer – Terabytes & Free Catered Bites

Job Description

Are you ready to take a bigger bite out of life? Work with terabytes of data and take eager bites of free catered lunches every day in this exciting Data Engineer opportunity!

You’ll work at a company that’s made a name for itself over more than 10 years and collaborate in an office designed for both work and play. Dress in your favorite casual clothes, enjoy fun themed rooms, and form close relationships with your equally eager technologist coworkers. You’re encouraged to have fun, enjoy unlimited snacks and beverages, and use your experience as you design and build infrastructure for data extraction and data analytics tools. You’ll work with SQL/NoSQL databases and terabytes of data. Dive into a deep data pool and put your hands-on scripting and programming experience with Ruby, Python, or Java to work, along with big data technologies like Hadoop, Spark, Kafka, Cassandra, and DynamoDB, and cloud platforms like AWS, Azure, and Google Cloud. You’ll also need a Bachelor’s Degree to be hired for this exciting position.

The easiest part of this job: If you need to relocate for this position, they’ll pay for it! Either way, you’ll get great perks like flextime and wellness classes.

The hardest part of this job: If you don’t want to work in an office with fun themed rooms and games like foosball, this won’t be the right job for you.

Work with terabytes of data daily and grab free catered bites too in this fun Data Engineer role. Apply now!

What’s in it for you?
– Competitive Salary
– Medical
– Dental
– Optical
– Life
– 401k
– Flextime
– Catered Lunch
– Paid Relocation
– 18 Days PTO
– 6 Sick Days

What you’ll be doing:
– Design and build infrastructure for data extraction and data analytics tools
– Do hands-on scripting and programming, and work with SQL/NoSQL databases
– Maintain clusters and build out data pipelines (see the sketch after this list)
– Optimize data processes and work with terabytes of data daily
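
To give a flavor of the pipeline work above, here’s a minimal sketch of a streaming job in Python using PySpark and Kafka, two of the technologies named in this posting. It assumes the spark-sql-kafka connector is on the classpath; the broker address, topic name, event schema, and storage paths are hypothetical placeholders, not details of the actual role.

    # Minimal streaming-pipeline sketch: read events from Kafka, parse JSON,
    # land them as Parquet. All names and paths below are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import LongType, StringType, StructField, StructType

    spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

    # Hypothetical event schema, for illustration.
    schema = StructType([
        StructField("user_id", StringType()),
        StructField("event", StringType()),
        StructField("ts", LongType()),
    ])

    # Read a Kafka topic as a stream and parse the JSON payload.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "events")                     # placeholder topic
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Write parsed events as Parquet, with checkpointing for fault tolerance.
    (
        events.writeStream
        .format("parquet")
        .option("path", "s3a://example-bucket/events/")
        .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
        .start()
        .awaitTermination()
    )

In practice, the schema, sinks, and checkpointing strategy would depend on the team’s actual stack.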

What you need:
– Bachelor’s Degree
– Scripting with Ruby, Java, or Python
– Hadoop, Spark, or Kafka
– SQL and NoSQL
– AWS, Azure, or Google Cloud
– Experience with terabytes of data