Job Description:
  • PROGRAMMER ANALYST (San Diego, CA): Build end-to-end integration services using Python to onboard domain apps onto the Enterprise Data Platform through an automated process. Create automated workflows for bringing data into the enterprise data lake from multiple source systems using Sqoop/Kafka & rsync. Maintain & monitor the Enterprise Big Data Ecosystem. Build data pipelines using Spark DataFrames & Pandas. Write PySpark apps using Spark SQL. Build/maintain an NVIDIA GPU-based Artificial Intelligence Platform with Python/R components. Automate tasks using Python/Shell scripting. Deploy an Enterprise Kafka Cluster using Confluent Kafka & support Spark Streaming apps. Deploy & maintain Oracle DataScience.com on IT-managed servers. Requires an MS (or equivalent) in Computer Science, Engineering (Computer/Electrical), or a related field & 1 year of experience using Python, Impala, Sqoop, SQL, Hive, Spark, HBase, Kafka & Hadoop tools. Must be willing to travel and/or relocate to work in various unanticipated locations throughout the US. Relocation benefits offered. No telecommuting. Drug testing required. Submit cover letter & resume to P. Baghani, Sysintelli, Inc., 9466 Black Mountain Road, Suite 200, San Diego, CA 92126.
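
For context, the duties above center on Spark SQL/DataFrame pipelines handed off to Pandas. The sketch below illustrates that general pattern only; the app name, input path, and column names (such as "/data/landing/events", "domain", "event_ts") are illustrative assumptions and not part of the posting.

```python
# Minimal PySpark sketch: register a DataFrame as a temp view, query it
# with Spark SQL, and pull a small aggregate into Pandas.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("domain-app-onboarding")  # hypothetical app name
    .getOrCreate()
)

# Read raw data already landed in the lake (e.g., by Sqoop/Kafka ingestion).
events = spark.read.parquet("/data/landing/events")  # hypothetical path
events.createOrReplaceTempView("events")

# Spark SQL over the registered view.
daily_counts = spark.sql("""
    SELECT domain, to_date(event_ts) AS event_date, COUNT(*) AS n_events
    FROM events
    GROUP BY domain, to_date(event_ts)
""")

# Small aggregates can be brought into Pandas for downstream reporting.
daily_counts_pd = daily_counts.toPandas()
print(daily_counts_pd.head())

spark.stop()
```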