No pay rate
Spherion, in partnership with American Family Insurance, is sourcing for a Technical Consultant. The Technical Consultant for DSAL (Data Science & Analytics Lab) will contribute to our mission through the design, development, implementation, and maintenance of high-performance parallel and distributed application and infrastructure systems. The position will lead the development of our data processing systems and their interfaces, with a focus on enabling scalable, high-performance, distributed computing environments. The Technical Consultant is responsible for implementing new data processing technologies and building prototypes. The position will also contribute to the implementation of machine learning and statistical algorithms, including making them more efficient and scalable. As a key member of the Data Science team, this position will work closely with the scientists and will contribute to modeling and data mining efforts as needed.
Working hours: M-F 1st shift
Data Science Infrastructure Design, Development, and Operations (75%)
--Participate in architectural planning, design, development, deployment, and management of analytical environments capable of ingesting, processing, and analyzing large, diverse data sets
--Participate in developing holistic solution architectures, ensuring that all architectural aspects of the system, including data, application, infrastructure, and security, are addressed.
--Develop high-performance parallel or distributed computing environments as needed, including those based on the Apache stack (Spark, Spark Streaming, etc.) as well as Amazon Web Services (AWS) offerings such as Kinesis.
Data Science Research Support (20%)
--Participate in the definition and planning in the areas of data processing and scalable analytical and computational platforms.
--Maximize the predictability, efficiency, effectiveness, and maintainability of data science-related infrastructure elements with a focus on analytical compute environments.
--Develop means for automating data- and analytics-related systems and processes, as appropriate, to support data science activities.
Desired skills (not necessarily listed in order of priority):
--Java, Spark, Spark Streaming, Hive, and Kafka, plus experience or familiarity with AWS Kinesis, S3 and EC2 instances, CloudFormation, Lambda, and DynamoDB.
--Desirable but not required: Tableau, NiFi, Atlassian tools
--Java 7/8, Spring Framework (MVC and Boot used heavily), SQL/MySQL/PostgreSQL, Maven, Git, Docker, Linux (CentOS and Ubuntu), Bash, Web-service/micro-service/REST architecture, AWS (CloudFormation, EC2, S3, Kinesis, Lambda, RDS, DynamoDB) or similar cloud experience (e.g. Google Cloud).
--Spark, Spark Streaming, and the Hadoop ecosystem (MR, Hive, Kafka, etc.). Tools are secondary, but the principles of large-scale data ingest/analysis are important.
The individual should have work experience in the above skill areas.
Since 1946, Spherion has placed millions of talented individuals in rewarding administrative, clerical, light industrial, customer service, non-clinical healthcare and professional jobs. Filling a broad mix of flexible, temp-to-hire and direct hire positions, Spherion is a trusted recruiting and staffing partner to more than 3,000 companies nationwide.
Wed, 13 Sep 2017 11:57:13 PDT