Big Data Engineer
Schedule: 40-hour work week, Monday through Friday
The main function of the Big Data Engineer is to collect, process, and analyze large sets of data.
You will be responsible for creating and enriching data for the machine learning group.
• Work closely with the algorithm developers of the autonomous system to identify opportunities for more effective use of our vast data resources in functional development, model training, and system evaluation.
• Design and implement scalable data-generation pipelines for machine learning projects, using both simulations and real data. The focus is on data transformation, data preparation, and extraction of valuable data sets for machine learning.
• Excellent working knowledge of Linux systems.
• Professional experience with C++ and Python programming.
• System integration and software architecture skills.
• Experience working in large corporations, including with complex network architectures spanning multiple continents.
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• Minimum of 2-4 years of applicable Big Data experience.
It is the policy of GCR to provide equal opportunity to all qualified applicants and employees without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran or disabled status, or genetic information. GCR is an Equal Opportunity/Affirmative Action Employer and embraces diversity in our employee population.
Job Status: Contract/Temporary