Blockgram is looking for qualified candidates for a Big Data Technology Lead position with one of our clients in the Charlotte, NC area.
Job Duties and Responsibilities
Our Client’s Data Services Technology team is seeking a hands-on Big Data Technology Lead to build a Markets Business Data Lake supporting their Capital Markets Sales & Trading business. The incumbent will also manage the team responsible for supporting the Big Data platforms (Hadoop, Spark, Elastic Search, Talend) and lead development projects in the Client’s Capital Markets area, as well as collaborate and partner with Enterprise Solution Architects and Relationship Management support staff. The candidate will provide strategic thought leadership and measurable outcomes in data strategy and insights, including data ingestion, data distribution, data governance, business intelligence, and business analytics.
Responsibilities Include:
Manage vendors, internal relationships, and Proof of Concept / Proof of Technology roadmaps and priorities.
Manage and lead the big data engineering team and support the software development lifecycle.
Develop and implement a Big Data strategic roadmap for moving the platform forward in a controlled, cost-effective way.
Work with other Enterprise teams to ensure that the Big Data team is leveraging enterprise-wide assets and supporting capital markets data services application systems.
Ensure infrastructure, processes, and procedures meet information security and operational risk requirements.
Partner with direct and indirect technology line leaders in long-range strategic planning and ensure integration with enterprise-wide technology strategy. This requires the ability to balance differing considerations and sensitivities among partners and to influence a positive strategic direction.
Establish and implement a Hadoop-based Capital Markets Data Lake; develop the strategy and gain consensus across key technical leadership through effective communication.
Experience, Skills, and Knowledge
10+ years of information technology experience
5+ years of technology management experience in an IT environment
3+ years of Big Data experience
3+ years of data modeling experience
Desired:
BS/BA engineering degree or higher
ETL (Extract, Transform, Load) programming experience
Knowledge and understanding of analytical methods used in statistics and predictive modeling (regression modeling, clustering, pattern recognition, graphics)
Knowledge and understanding of data management concepts, processes, tools, and environments
Preferred experience with the following:
Data Science: Linear Regression, classification models such as Logistic Regression, CART, and Random Forest, Text Analytics, Clustering, Optimization techniques
Big Data: Hive, Pig, Sqoop, HBase, Spark, Hadoop, Kafka, Storm, Flume, Oozie
DW/BI Architecture: Data Landing, Staging, Foundation, and Consumption layers
Data Security: Row and Column Level Security, Data Encryption, Reporting and Analytics
Prior development experience with one or more of the following: Java, Perl, Python, or C++