Required Skills
• 3-6 years of experience with the Hadoop stack and storage technologies: HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume, Kafka, and Oozie
• Extensive knowledge of Big Data enterprise architecture (Cloudera preferred)
• Excellent analytical capabilities and a strong interest in algorithms
• Experienced in HBase, RDBMS, SQL, ETL and data analysis
• Experience in NoSQL technologies (e.g., Cassandra, MongoDB)
• Experienced in scripting (Unix/Linux) and scheduling (Autosys)
• Experience with team delivery/release processes and cadence pertaining to code deployment and release
• Research-oriented, motivated, proactive self-starter with strong technical, analytical, and interpersonal skills
• A team player with good verbal and written communication skills, capable of working with a team of Architects, Developers, Business/Data Analysts, QA, and client stakeholders
• Versatile contributor with balanced development skills and the business acumen to deliver quickly and accurately
• Proficient understanding of distributed computing principles
• Continuously evaluates new technologies, innovates, and delivers solutions for business-critical applications
Job Type: Contract
Pay: $65.00 - $70.00 per hour
Benefits:
* 401(k)
* Dental insurance
* Health insurance
Schedule:
* 10 hour shift
Ability to commute/relocate:
* Dallas, TX 75201: Reliably commute or planning to relocate before starting work (Required)
Experience:
* Hadoop: 10 years (Required)
* Big data: 9 years (Required)
* Data warehouse: 10 years (Required)
Work Location: Hybrid remote in Dallas, TX 75201