WHO WE ARE + WHAT WE DO

nFolks Data Solutions delivers Information Management solutions, learning services, and staffing services to clients globally from offices in North America, Europe, and Asia. Our comprehensive approach to providing indispensable Data Solutions helps us deliver the best to our global clientele. As a market leader, we focus on Information Management solutions that align measurable business goals and objectives with information technology.

Aware of current market demands and requirements, we work toward utmost customer satisfaction. Our pragmatic outlook has led us to venture into new markets and achieve steady growth over the years. Our unparalleled system of services, support, and development has helped us successfully complete several end-to-end implementations of IBM InfoSphere Information Management projects.

Our expertise spans Architecture, Analysis, Design, Development, Administration, Training, Maintenance, and Support using IBM InfoSphere applications such as DataStage, QualityStage, Information Analyzer, Metadata Workbench, Business Glossary, and FastTrack. We lead the market in IBM InfoSphere solutions on quality, price, and certified talent. Building successful Data Solutions requires the best technical talent; accordingly, all our Architects, Team Leads, and Senior Developers are IBM Certified Solution Developers. Company-organized education and training, combined with real-time project experience, allows us to offer our clients an entire team of experts.


What we offer to our customers

Information Management Consulting Services

Our certified solution developers deliver successful Data Solutions and specialize in end-to-end implementation of Information Management projects. We strive not only to meet our clients' requirements but to consistently exceed their expectations.


Training Services

We are the preferred partner of training companies that provide learning services to global clients. Our presence across North America, Europe, and Asia aids us in achieving customer satisfaction.


Staffing Services

Conscious of our clients' demands, we aim to meet their requirements with staffing services that provide high-caliber teams and IT professionals.


THE BEST BRANDS TRUST US

WE HELP OUR CLIENTS STAY AHEAD OF THE COMPETITION

We work with a wide range of clients, from top organizations to medium and small scale businesses.

OUR PEOPLE MAKE US GREAT

WHAT OUR CUSTOMERS SAY


nFolks Case Studies

Client: Louisiana Department of Corrections
Project: DB2 Migration from Legacy Systems

The Louisiana Department of Corrections implemented a DB2 migration, creating a DB2 Data Store as its first enterprise IT modernization project. The department planned to extract data from its existing Mainframe Mapper and Notes legacy databases and move it to a DB2 database as the enterprise "database of record," after taking the legacy data through a complete cycle of profiling, cleansing, and transformation for web OLTP using the IBM Information Server suite of products, with query and reporting handled by the IBM Cognos BI tool.

The client's challenge was extracting data from very old, hand-written legacy databases such as Mapper and Notes for different database objects and tables, then applying rules on top of the data to load it into DB2. All of this had to be completed within a limited budget and a fixed timeline. In addition, Information Server was deployed on a Windows operating system with limited resources on VMware, a configuration for which no vendor support model existed. Because these were legacy databases, parent/child relationships between tables were also absent; these checks had to be implemented in the ETL code, and records that did not meet the requirements had to be captured and emailed to the data owners.
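The parent/child check described above can be pictured in outline. The following is a minimal, hypothetical sketch (not the client's actual ETL code; the function and field names are illustrative): it partitions child records into those with a valid parent key and orphans, so the orphans can be captured and routed to the data owners rather than loaded.

```python
def split_orphans(parents, children, key="parent_id"):
    """Partition child records into valid rows and orphans.

    parents: list of dicts, each carrying an "id" field
    children: list of dicts, each carrying a foreign key under `key`
    Returns (valid, orphans); in the project described above, orphans
    were captured and emailed to the data owners instead of loaded.
    """
    parent_ids = {p["id"] for p in parents}
    valid, orphans = [], []
    for row in children:
        # A child is valid only if its foreign key exists among the parents.
        (valid if row.get(key) in parent_ids else orphans).append(row)
    return valid, orphans
```

In a real ETL flow the same split would typically be a lookup or join stage with a reject link, but the logic is the same.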

nFolks Data Solutions took this on as a challenge, using the IBM Information Server suite of tools provided by the client and devising several solutions. Selecting the correct ODBC drivers was critical, since extracting data from the legacy databases was difficult with the limited drivers that supported them only partially. nFolks used a combination of parallel and server jobs, because large parallel extracts spawned many processes on the Windows server and would break the connection to the legacy databases. Scripts were written to monitor hung processes and clean them up frequently, since resources were limited. The data was also Unicode, so special characters had to be handled carefully to ensure the data retained its original form when loaded into DB2.
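The hung-process monitoring mentioned above can be sketched at a high level. This is a hedged, simplified illustration, not the scripts actually deployed on the client's Windows server (the function name and data shape are assumptions): given each process's last-activity timestamp, it identifies processes idle past a threshold, which a cleanup script would then terminate.

```python
import time

def find_hung(processes, threshold_secs, now=None):
    """Return pids of processes idle longer than threshold_secs.

    processes: dict mapping pid -> last activity time (epoch seconds).
    A real cleanup script would go on to terminate the returned pids
    and log the action for the operations team.
    """
    now = time.time() if now is None else now
    return sorted(pid for pid, last in processes.items()
                  if now - last > threshold_secs)
```

Run periodically (for example from a scheduler), a check like this keeps a resource-constrained server from accumulating stalled extract processes.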

Following the implementation, the migration process became seamless. The programs and scripts written by nFolks Data Solutions addressed every issue, enabling on-time delivery within the limited budget and a successful migration of the legacy databases to DB2.

Client: Major Financial Institution in the US
Project: Upgrade of ProfitMax NextGen Solution

The objective of ProfitMax NextGen was to calculate profit-and-loss metrics for the wholesale lines of business at various levels, such as Customer, Relationship, Officer, Accounting Unit (AU), and Organizational Unit (OU), by building a Profitability Data Warehouse. ProfitMax NextGen is a rewrite of a legacy application, a customer and product profitability tool used by the client's Wholesale Banking division to view customer and product profitability. An estimated 3,000 users would access the solution, which would enable them to view and measure profitability at the officer, AU, and product level.

The client's problem was that the data warehouse had been modeled with very poor practices; the previous implementation partner had put no best practices in place, leading to a large number of issues and inaccurate data with missing relationships between target tables. The result was unstable code and unreliable reporting.

nFolks Data Solutions, along with another implementation partner, was asked to fix the issues caused by the previous poor implementation. By deploying our best Solution Developers and Architects with the right strategy, we fixed both the code and design issues and developed a solution that generated reports at 20 levels of granularity across the wholesale lines of business services and products, meeting the primary requirements. The client saved a significant amount of money and gained an effective solution that provided streamlined data for improved decision making.

Case 3: Automotive Distributor in the US
Project: Migration from DataStage 7.0 to DataStage 8.7

The client is the US distributor for a large Japanese automotive company, operating 150+ dealerships in 5 states. The data migration project was one of the client's high-priority projects: migrating its systems from IBM InfoSphere DataStage 7.0 to IBM InfoSphere DataStage 8.7.

The client wanted to migrate its code from DataStage 7.0 to DataStage 8.7, but most of the jobs were server jobs using plug-ins no longer available in DataStage 8.7, and these needed to be converted to parallel jobs. An overall architecture assessment, covering volume assessment, server assessment, topology design, resource estimation, and performance tuning, was required before installing DataStage 8.7 and migrating the code from the lower version to the higher one.

nFolks Data Solutions performed the server assessment and upgraded the code from DataStage 7.0 to DataStage 8.7, converting to parallel jobs all the server jobs whose plug-ins were deprecated. The migration succeeded with minimal disruption to the client's operations, helping the client maintain business continuity while staying abreast of the latest data technologies.

Case 4: Major Airline in the US
Project: Cleanroom Project for Sales Data Analysis

The Cleanroom project was a sales-merger project between two major US-based airlines that were in the process of merging their operations, including sales data. The implementation's target was to generate the market shares of the two airlines relative to other airlines in order to improve sales.

Two major airlines in the US decided to merge. Prior to the completion of the merger, the airlines were not allowed to run combined sales reports, so the work was outsourced to ZS Associates, where nFolks Data Solutions was asked to help with the ETL for loading sales data across routes, sources, destinations, legs, seats, and other dimensions, compared against competing airlines.

nFolks Data Solutions used various data tools to assimilate data from the two airlines while adhering to the legal requirements for data privacy prior to the merger. With the solution provided by nFolks, the client was able to seamlessly generate market-share reports for improved business planning in activities such as route optimization and sales planning.
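The market-share reports described above can be illustrated with a small sketch. This is a hypothetical example, not the actual deliverable (the function name and tuple layout are assumptions): given per-airline sales by route, it computes each carrier's share of every route.

```python
from collections import defaultdict

def market_share(sales):
    """Compute each airline's share of sales per route.

    sales: iterable of (route, airline, amount) tuples.
    Returns {route: {airline: share}}, with shares summing to 1.0
    per route, the core figure behind a market-share report.
    """
    totals = defaultdict(float)
    by_airline = defaultdict(lambda: defaultdict(float))
    for route, airline, amount in sales:
        totals[route] += amount
        by_airline[route][airline] += amount
    # Normalize each airline's total against the route total.
    return {route: {a: amt / totals[route] for a, amt in airlines.items()}
            for route, airlines in by_airline.items()}
```

The same aggregation extends naturally to the other dimensions mentioned (source, destination, leg, seat class) by widening the grouping key.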

Case 5: Major Power Utility in the US
Project: Smart Connect Data Warehouse Project for Consumer Generated Data

The Smart Connect Data Warehouse project is one of the client's high-priority projects, connecting a large number of data points from customer meters. The Metadata Warehouse repository is an important module of this project, implemented to generate reports for impact analysis, data lineage, and business lineage.

As part of the Smart Connect project, the client wanted to implement the Metadata Warehouse repository across systems to perform impact analysis for any field change and to derive data lineage and business lineage using IBM Business Glossary, Metadata Workbench, and other IBM foundation tools. The challenge was integrating the solution end to end with a large number of systems, including vendor technologies such as SAP BusinessObjects reports, Erwin data models, Teradata, Oracle, DataStage code, stored procedures, and extended mappings.

Representing IBM at the client site, nFolks Data Solutions implemented the project by importing all the metadata assets using IMAM for the Erwin data models, Teradata, Oracle, DataStage ETL jobs, and others. nFolks also created business categories, terms, and other business assets in Business Glossary, and mapped the IT assets to business assets to generate data lineage, business lineage, and impact analysis in Metadata Workbench. Working closely with the IBM Israel Labs to address product issues, we helped include the right features to provide more granular information for the client.

The client can now run impact analysis on any IT asset, such as a database column or ETL job, generate end-to-end lineage from source systems to SAP BO reports, and generate business lineage. In addition, both the functional and technical teams are able to use Business Glossary.
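Impact analysis of this kind can be pictured as a traversal of the metadata dependency graph. The sketch below is a hypothetical simplification, not IBM Metadata Workbench's implementation: given edges from each asset to the assets that consume it, it finds everything downstream of a changed column or job.

```python
from collections import deque

def impacted(deps, changed):
    """Return all assets downstream of `changed`.

    deps: dict mapping an asset to the assets that directly consume it,
    e.g. a table column feeding an ETL job that feeds a report.
    Breadth-first search collects the full downstream impact set.
    """
    seen, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for dep in deps.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen
```

Running the traversal in the opposite direction over reversed edges gives lineage back to the source systems.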

Case 6: Major Retail Company in the US
Project: Data Migration from Disparate Systems to SAP ECC 6.5 System

The client was a holding company for popular retail clothing and accessories brands in the US and was merging the brands under a single entity for better business management and decision making. The merger was to be completed in two phases, with all the major brands in the portfolio migrated to upgraded systems.

The client wanted to merge all 8 brands in the portfolio in two phases by migrating data from different sources, such as SAP R/3 4.7, flat files, and DB2, to SAP ECC 6.5, so that the brands could be managed from one SAP ECC system of record.

Representing IBM at the client site, nFolks Data Solutions implemented the project by creating DataStage jobs with best practices and performance-tuning steps to merge the data for the different brands from the SAP R/3 4.7 system into the SAP ECC 6.5 system. Following the implementation, the client was able to access all brand data from one SAP ECC 6.5 system instead of individual systems of record for different brands.

Case 7: City of New York HRA
Project: DataStage Upgrade to DataStage 11.3

The City of New York HRA was upgrading its infrastructure from older versions of DataStage to DataStage 11.3, implemented by IBM. As part of the project, a very short engagement, nFolks worked closely with NYC HRA to install and configure IBM InfoSphere Information Server 11.3.

Case 8: Major Financial Institution in Australia
Project: Implementation of IBM MDM v11.6

Implementation of IBM MDM v11.6, DSC, RESTful services, Bamboo, Maven, Oracle 12c, and Kafka for a major financial institution in Australia.

The implementation consisted of two parts: The Global Registry (TGR) and Australian Operational Customer Master (AOCM).

The Global Registry (TGR) was a combination of virtual MDM (Matching Hub, MH) and physical MDM (Operational Hub, OH). MH provides entity matching and linking based on the outcome of an algorithm defined with probabilistic matching, while OH persists the member data, enriching it with additional data such as hierarchy and relationships. Data changes for every member are notified to downstream systems by triggering notification messages. nFolks designed and developed the 'maintain party service', 'publish derived record', data model, and algorithm to accommodate the entity attributes, and configured Bamboo jobs to automate the build, deploy, and test processes.
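Probabilistic matching of the sort the Matching Hub's algorithm performs can be sketched at a high level. This is a hypothetical, heavily simplified illustration, not IBM MDM's algorithm (real matching engines derive weights statistically and handle phonetics, nicknames, and partial agreement): it scores two party records attribute by attribute and links them when the total clears a threshold.

```python
def match_score(a, b, weights):
    """Sum the weights of the attributes on which records a and b agree.

    weights: dict mapping attribute name -> evidential weight; higher
    weight means agreement on that attribute is stronger evidence.
    """
    return sum(w for attr, w in weights.items()
               if a.get(attr) and a.get(attr) == b.get(attr))

def is_match(a, b, weights, threshold):
    """Link the two records when the score clears the threshold."""
    return match_score(a, b, weights) >= threshold
```

In production hubs the score usually has three bands (match, clerical review, non-match) rather than a single cutoff.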

Australian Operational Customer Master (AOCM):

AOCM is a master-slave database that consolidates customer data from multiple data sources and serves the required data to downstream applications and systems. A Java Server Faces web-framework-based data stewardship application was customized to integrate the new attributes and is used for searching customers based on user-defined criteria. AOCM can emit event notifications for changes to 'party profile' and 'customer-to-account relationship' to downstream systems.

On behalf of IBM, nFolks designed and developed a standalone Java application, 'Missing Data Remediation', for remediating missing data in AOCM or in the source systems; nFolks also migrated the source control repository to Git and validated that the generated EAR works. In addition, nFolks upgraded AOCM from MDM v10.5 to MDM v11.5, including an Oracle migration from 11g to 12c. By ensuring that the AOCM MDM migration did not affect functionality, customer data, or web services, nFolks ensured business continuity for the client with minimal disruption.

WORK WITH US AND GROW YOUR CAREER

  • DataStage Admin
    Salary No Constraint · 1 position · Noida
    Skills Required
    6+ Years of Relevant Experience in Data Stage Administration
    Mode & year
    Contract To Hire
    Location
    Noida
  • PL/SQL Developer
    INR 6.5 Lacs · 2 positions · Hyderabad/Bangalore/Kolkata
    Skills Required
    6+ years of relevant experience in PL/SQL development with ETL experience
    Mode & year
    Contract To Hire
    Location
    Hyderabad/Bangalore/Kolkata
  • DB2 Admin
    Salary No Constraint · 1 position · Noida
    Skills Required
    6+ years of relevant experience in DB2 Administration
    Mode & year
    Contract To Hire
    Location
    Noida
  • QlikView Developer
    Salary No Constraint · 2 positions · Pune
    Skills Required
    4+ years of experience in QlikView and Qlik Sense
    Mode & year
    Contract To Hire
    Location
    Pune
  • Informatica Developer
    INR 11 Lacs · 6 positions · Hyderabad
    Skills Required
    6+ years of Informatica development experience
    Mode & year
    Contract To Hire
    Location
    Hyderabad
  • DataStage Developer
    INR 13 Lacs · 6 positions · Hyderabad
    Skills Required
    6+ years of relevant experience in DataStage with Teradata
    Mode & year
    Contract To Hire
    Location
    Hyderabad
  • DB2 Developer
    INR 15 Lacs · 6 positions · Hyderabad
    Skills Required
    6+ years of relevant experience in DB2 development
    Mode & year
    Contract To Hire
    Location
    Hyderabad
  • Java Developer
    1 position · Grand Rapids, MI
    Skills Required
    • Bachelor's or Master's degree in Computer Science or Information Systems
    • Five to seven years of previous software development experience, or an equivalent combination of education, training, and experience
    • Ability to rapidly learn and apply new technologies
    • Proficient with the Java programming language and web application development (J2EE, J2SE, Java, AJAX, JSP, JSF, JMS, Struts, JPA, Hibernate, Spring)
    • Working knowledge of HTML, CSS, and JavaScript
    • Solid experience with application servers such as WebLogic 12.2; Oracle 19c; SAP
    • Focus on writing quality software, utilizing relevant testing tools like JUnit and mock testing using EasyMock or Mockito
    • Focus on quality practices such as Test-Driven Development
    • Familiar with build/deploy processes and tools such as Ant and Maven
    • Experience in Agile-based development
    • Strong analytical and troubleshooting skills
    • Solid team player, innovative and creative
    • Strong verbal and writing skills
    • Effective time management, organization, and communication skills
    Mode & year
    Telephonic
    Location
    Grand Rapids, MI
  • SAP MDG Consultant
    1 position · Baton Rouge, LA
    Skills Required
    Experience Required:
    • Lead at least one master data domain (Customer, Materials, Vendor) workstream team to gather information and execute master data needs
    • 5+ years of experience in data management
    • Help define and develop standards, guidelines, and data processes
    • Ability to understand and address master data quality issues
    • Collaborate with program management and the master data management team
    • Previous work experience may include tools like SAP MDG, IBM InfoSphere MDM, Informatica MDM
    • Architect and implement MDM solutions, with the ability to design custom solutions and processes in MDM to extend its functionality
    • Ability to integrate using web services
    • Data modeling experience preferred, along with a good understanding of data warehousing and data integration concepts
    • Experience developing conceptual and physical solution architectures
    Successful candidates will work onsite at the IBM Client Innovation Center in Baton Rouge, an in-bound delivery model where we support our clients from our Baton Rouge center. You are expected to travel up to 50% of the time. This is a traditional office position; you must live in, or be willing to relocate to, Louisiana. The work location is 100 North Street, Baton Rouge, LA 70802. This is not a work-from-home position.
    Required Technical and Professional Expertise:
    • Minimum 5 years of experience utilizing SAP technologies
    • Minimum 5 years of experience in development, configuration, solution evaluation, validation, and deployment
    • Strong knowledge of SAP FI/CO, MM, SD, or PM master data required
    • Minimum 3 years of experience providing technical leadership, coaching, and mentoring to team members, with proven success on multiple enterprise-wide software development projects with support and delivery in a production support role in a medium to large business
    • Minimum 3 years of experience in master data management, with expertise in implementing SAP MDG
    Mode & year
    Telephonic
    Location
    Baton Rouge, LA
  • Data Engineer with Azure
    DOE · 1 position · Remote/Philadelphia, PA
    Skills Required
    Work location: remote for this year and Philadelphia, PA after the current situation. Long term.
    Skills and Qualifications:
    • Bachelor's degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems, or Engineering is required
    • 1+ year of experience working with the Azure analytical stack
    • Deep experience building analytical solutions on Azure SQL DW (or Azure Synapse)
    • Strong knowledge of cost allocations, pricing, and profitability analysis is required for this position
    • Experience with Azure Data Lake is required; experience with Delta Lake is nice to have
    • Experience with Databricks is nice to have
    • Experience building high-performance, scalable distributed systems
    • Continuous data movement/streaming/messaging: experience with related technologies, e.g. Spark Streaming or other message brokers like MQ, is a plus
    • 3+ years of experience developing, deploying, and supporting scalable, high-performance data pipelines (leveraging distributed data movement technologies and approaches, including but not limited to ETL and streaming ingestion and processing)
    • 3+ years of software engineering experience leveraging Java, Python, Scala, etc.
    Contact: arun@nfolksdata.com
    Mode & year
    Phone
    Location
    Remote/Philadelphia, PA
  • Data Engineer (AWS, Snowflake, Batch ETL tool)
    DOE · 1 position · San Antonio, TX
    Skills Required
    Location: Remote (flexible)/TX after COVID. Long term.
    Skills and Qualifications:
    • Bachelor's degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems, or Engineering is required
    • 1+ year of experience with the Snowflake database
    • AWS cloud experience (EC2, S3, Lambda, EMR, RDS, Redshift)
    • Experience in ETL and ELT workflow management
    • Familiarity with AWS data and analytics technologies such as Glue, Athena, Spectrum, and Data Pipeline
    • Experience building internal cloud-to-cloud integrations is ideal
    • Experience with streaming technologies, e.g. Spark Streaming or other message brokers like Kafka, is a plus
    • 3+ years of data management experience
    • 3+ years of batch ETL tool experience (DataStage / Informatica / Talend)
    • 3+ years of experience developing, deploying, and supporting scalable, high-performance data pipelines (leveraging distributed data movement technologies and approaches, including but not limited to ETL and streaming ingestion and processing)
    • 2+ years of experience with the Hadoop ecosystem (HDFS/S3, Hive, Spark)
    • 2+ years of software engineering experience leveraging Java, Python, Scala, etc.
    • 2+ years of advanced distributed schema and SQL development skills, including partitioning for performance of ingestion and consumption patterns
    • 2+ years of experience with distributed NoSQL databases (Apache Cassandra, graph databases, document store databases)
    • Experience in the financial services, banking, and/or insurance industries is nice to have
    Contact: arun@nfolksdata.com
    Mode & year
    Phone
    Location
    San Antonio TX
  • System Engineer
    INR · 11 positions · Bangalore
    Skills Required
    Python Programming
    Ansible
    Linux
    Messaging Queue
    Docker & Kubernetes
    Job Description
    • Experience running RHEL infrastructure
    • Experience working with Ansible
    • Experience with Message Queues (Kafka preferred)
    • Experience with PostgreSQL Databases
    • Experience with Redis Caching
    • Experience with Docker deployments (Docker Swarm, Kubernetes, Openshift)
    • Experience with a version control system (GIT preferred)
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Site Reliability Engineer
    INR · 6 positions · Bangalore
    Skills Required
    Scripting: Perl, Python
    Repository: GitHub
    AWS
    Ansible/Chef
    Splunk, ELK
    Job Description
    Required Skills:
    · Minimum of 5 years' experience in hands-on production administration of a large system environment
    · Experience in establishing, following, and improving upon procedures within a mission-critical environment
    · Must be efficient in writing scripts
    · Must be extremely comfortable using and navigating within a Linux environment
    · Must have the ability to do high-level debugging and problem analysis by examining logs and running Unix commands
    · 2+ years of experience with GitHub, Perl, and Python
    · Excellent written and verbal communication skills
    · Comfortable operating in a fast-paced environment
    · Understands how DNS works
    · 4+ years of experience in virtualization environments such as AWS/SoftLayer/Xen/VMware
    · 2+ years of experience with configuration management systems (Salt/Ansible/Chef)
    · 2+ years of experience using Splunk and/or ELK
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Automation Test Engineer
    INR · 4 positions · Bangalore
    Skills Required
    Python
    Shell Scripting
    CI/CD
    Docker/Kubernetes
    Cloud
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Golang Developer
    INR · 3 positions · Bangalore
    Skills Required
    Go (mandatory)
    Microservices development and a solid understanding of it (mandatory)
    Expert in Kubernetes (mandatory)
    Knowledge of IaaS, Docker, Helm
    Expert in implementing REST/HTTP/RAML/Swagger
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Control Plane Infrastructure
    INR · 1 position · Bangalore
    Skills Required
    Required Technical and Professional Expertise
    A track record of building enterprise systems
    Strong debugging skills; effective verbal and written communication skills; team oriented
    Minimum of 2 years' experience with Kubernetes deployments and Kubernetes administration
    Minimum of 5 years’ experience programming using GoLang, Python, or C++.
    Minimum of 5 years' experience with Linux operating systems
    Minimum of 5 years’ experience with Agile team project delivery practices
    Preferred Technical and Professional Experience
    2 years' experience with large-scale software system deployments
    5 years' experience using container management technology such as Kubernetes and Docker
    Experience with IBM Cloud Platform
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Site Security Engineer
    INR · 3 positions · Bangalore
    Skills Required
    Areas of Focus:
    Process Development and Orchestration
    Integration with existing business systems and technical platforms to close security and compliance gaps.
    Exposure to IT Security Management processes and procedures
    Network knowledge regarding ACL and VLAN configuration management
    Enterprise Password Management solutions
    Orchestrating solutions to address security control gaps
    Relevant Work Experience:
    At least 2 years of experience with cloud-based solutions
    Experience developing, implementing, and operating large-scale IaaS, ultra-highly available and highly secure cloud environments/services
    Supporting application deployments, building new systems and upgrading and patching existing ones.
    Operating the cloud infrastructure and services within our security and privacy guidelines and compliance needs
    Implementing logging, auditability, security, and monitoring features for cloud services and infrastructure assets.
    Minimum 3-5 years in systems administration/Software Engineering/DevOps, networking in a large environment.
    Computing: Strong programming experience in C, C++, Java, Shell, Perl, Go, Ruby, PowerShell, ASP.NET v4+, or Python
    Specific knowledge: software engineering (Git, Jenkins); networking (protocols, load balancing, troubleshooting); deployment and configuration management (Chef, Puppet, SaltStack, Ansible, or NPM); Linux (RHEL, SLES); containers (Docker, LXC); monitoring (Kibana, Elasticsearch); cloud systems and virtualization (CloudStack, OpenStack, AWS, EC2, Xen, or KVM);
    databases (Oracle, MySQL, MariaDB, Cassandra, S3, HBase, Hadoop, MongoDB, or Couchbase); security tools (Nessus, Vault)
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Test Developer
    INR · 3 positions · Bangalore
    Skills Required
    Python Programming
    Django
    MongoDB
    HTML
    CSS
    JavaScript
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Sr. Oracle Developer with Informatica
    1 position · Minneapolis
    Skills Required
    We have an urgent requirement for a very strong SQL resource. Full JD below:
    • Strong experience with Oracle SQL and PL/SQL (9.x, 10.x, 11.x) as a developer
    • Experience in ETL with Informatica
    • Experience with UNIX shell scripting
    • Excellent communication skills (oral and written); experience in an onsite/offshore model
    Good to have:
    • Experience working with Tivoli Workload Scheduler
    • Experience in conversion projects
    • Experience working with SAP modules (ECC, CRM, or GTS) as source or target
    • Develop good-quality code with minimum maintenance
    • Strong architecture (design) and technical skills; ability to translate business needs into technical solutions
    • Excellent analytical and problem-solving skills
    • Should work with multiple teams, including BI and data cleansing, as needed
    • Good documentation skills, including creating SOPs, conversion execution documents, and technical design documents as needed
    Mode & year
    Telephonic
    Location
    Minneapolis
  • DataStage Developer
    1 position · MN or GR-Remote
    Skills Required
    Cloud ETL tool Talend highly preferred. Our client distributes grocery products to independent and chain retailers in 50 states and to its own corporate-owned retail stores throughout the Midwest, in addition to fresh food processing and distribution. Through its MDV military division, it is a leading distributor of grocery products to U.S. military commissaries.
    • 5 years' experience with DataStage v9.1 and 11.7
    • Must have: DataStage conversion from 8 to 9 and 11, converting all systems
    • Experience working with interfaces
    • Prefer someone to work in MN; remote could be an option with strong communication skills
    • The Hierarchical stage in DataStage is used to parse or compose XML (Extensible Markup Language) and JSON data. This stage was introduced in version 11.3. For huge amounts of data, the Hierarchical stage is preferred over XML packs
    • Strong verbal and written communication skills; ability to communicate IT programming in a non-technical manner
    • Strong organizational, prioritization, analytical, and problem-solving skills
    • Detail oriented, with strong project/time management and research skills; working knowledge of MS Office, MS Project, and Visio
    • Strong knowledge of DataStage (ETL tool), with a background in developing DataStage solutions
    • Strong relational database and SQL skills required, with MS SQL Server preferred; an understanding of DB2 or Informix is a plus
    • Experience working with transportation application software a strong plus, with knowledge of routing (JDA/Manugistics) and on-board computing (Cadec) applications preferred
    Mode & year
    Telephonic
    Location
    MN or GR-Remote
  • Lead Cloud Engineer (Azure)
    1 position · Remote
    Skills Required
    Our client is a nationally recognized not-for-profit health system offering a full continuum of health services through its health plan, medical group, and hospital group. The client is looking for a Lead Cloud Engineer (Azure). Strong chance to extend; for a local candidate we could look at a full-time hire in the future. Remote work; travel not expected. No subcontractors. This person will help drive the future state of cloud for the organization and must be a very strong candidate.
    • Ability to administer and manage resources and identities in the Azure cloud platform
    • Ability in DevOps engineering for cloud resources and data science applications
    • Ability to set up and troubleshoot data science development toolkits such as Git, Airflow, Docker, and PyCharm
    • Proficiency in the following languages: Python, shell scripting, SQL, and Spark
    • Familiarity with data architecture, engineering, and governance concepts
    • Familiarity with machine learning concepts
    Qualifications – Required:
    • 7+ years designing and administering cloud infrastructure (Azure required)
    • 5+ years providing DevOps support for cloud platforms and data science applications
    Qualifications – Preferred:
    • Experience building big data pipelines using Spark
    • Experience in ETL development and data warehousing
    • Experience in machine learning
    • Experience with Airflow and Docker
    • Experience in the healthcare industry
    Mode & year
    Telephonic
    Location
    Remote
    Sr. AEM Developer
    1 position Grand Rapids, MI
    Skills Required
    Key Responsibilities:
    • Design and develop web applications using the Adobe platform, including guidance on site structure, components, templates, workflows, dialogs, object model designs (Java APIs), and unit testing using AEM architecture (CRX, OSGi, JCR)
    • Set up and configure AEM authoring, publish, and dispatcher environments with Adobe-recommended best practices
    • Integrate AEM with other marketing products like Assets, Target, Campaign, and other internal endpoints
    • Work closely with the vendor partner to ensure sound practices regarding site architecture, performance and reliability, and content delivery are in place
    • Work in the SAFe agile development methodology
    • Work with front-end technologies and frameworks
    • Follow best practices for secure web programming and deployment; adhere to internal best practices with respect to coding standards, unit test coverage, automation, and continuous integration
    • Develop alongside an offshore team of AEM developers
    Required Skills & Experience
    • Adobe AEM Developer with a strong Java/J2EE background in both front-end web development (React.js) and AEM integration
    • 3+ years of AEM 6.x/CQ5 experience
    • Sound understanding of all AEM building blocks, including templates, components, dialogs, widgets, social components, etc., plus the code build and deployment process
    • 2+ years of UI development experience with ES6 JavaScript and CSS preprocessors (LESS, SASS)
    • 2+ years of strong web content management experience with Adobe AEM/CQ5
    • 3+ years of Java development and familiarity with frameworks such as OSGi
    • Experience developing reusable AEM components for authoring content, reusable code libraries, unit testing, automation, and code walkthroughs
    • The ability to present technical concepts to technical and non-technical internal/external stakeholders
    • Ability to write clean, modular, reusable code (using design patterns) and experience with a unit-test-driven approach to development
    Mode & year
    Telephonic
    Location
    Grand Rapids, MI
    OFSAA Data Architect
    2 positions San Antonio TX
    Skills Required
    Required:
    • Proficient in OFSAA data architecture
    • Knowledge of data modeling using the Erwin tool, data migration activities, and the Slowly Changing Dimension (SCD) component
    • Extensive knowledge in creating Oracle packages, procedures, functions, views, triggers, and queries using Oracle SQL and PL/SQL
    • Core Java knowledge
    • Database design, performance tuning, development, and integration using Oracle
    • UI experience with the OFSAA forms framework, HTML, JSP, and CSS3
    • Experience with jQuery and AngularJS
    • Experience with WebSphere or Tomcat
    • Experience with Maven and Jenkins
    • Experience with agile project methodology
    • Understanding of the OFSAA Reconciliation Framework
    • OFSAA metadata management framework (hierarchies, datasets)
    Mode & year
    Telephonic
    Location
    San Antonio TX
    Adobe Developer
    1 position Remote
    Skills Required
    Our client, a family-owned Midwestern grocery/retailer that strives to better the lives of people in all communities, is in need of an Adobe Developer (all Marketing Cloud products).
    • HTML, XML, CSS, and JavaScript
    • Experience in developing workflows, content blocks, and custom resources in Adobe Campaign
    • Proficiency in SQL and ETL development
    • Knowledge of the AEM, Target, and Audience Manager products and their integration with Adobe Campaign
    • Designing solutions and writing technical documentation for a commercial website
    Mode & year
    Telephonic
    Location
    Remote
    Sr. Java Developer
    1 position Dallas TX
    Skills Required
    We are looking for candidates that have previous airline experience and preferably are local to Dallas. They need to be very strong in Java and communication, plus have good job tenure. Airline experience is required. Sr. Java Developer, Dallas TX, long term.
    Top Skills: • Java • Spring • Akka
    Description: The mission of the Sr. Java Developer is to develop and deliver software in the Operational Data Store (ODS), APIs, and other development work within the Aircraft MX organization. This is a large development team that will do some paired programming in an open SAFe development environment. This team also partners with the development teams within the train to deploy these applications into AWS.
    What you will do:
    • Analyze, design, and program applications, APIs, and integration code as it pertains to the MX applications and the ODS product
    • Implement functionality within a well-integrated application system in accordance with the partner requirements, organizational methodologies, and standards
    • Partner with the Product Owner and SCM to ensure the work package is delivered on time and on scope
    • Respond to production problems and implement immediate resolution efforts with the team and dev teams
    • Mentor less-experienced developers on this team and other teams
    • Explore and provide feedback on various technologies
    Must Haves:
    • 6+ years of designing, developing, and implementing Java-based software in an Agile team environment
    • Experience consuming and creating RESTful and micro web services
    • Experience developing and deploying code into AWS using cloud-native tools and processes
    • Experience with XML, XSLT, and messaging
    • Experience with Jenkins
    Mode & year
    Telephonic
    Location
    Dallas TX
    Test Lead
    1 position Remote
    Skills Required
    Test Lead, Remote (some travel to Denver, CO on a need basis), long-term contract. For this role, telecom and wireless network roll-out experience are key, so generic test manager candidates will not be a good fit. If candidates do not have telecom and network experience, they will not be considered. The test lead must have the following skills:
    • 5+ years of experience in the telecommunications industry
    • Experienced in developing and designing test strategies and test plans; can drive overall testing execution with the client
    • Experience executing and driving unit, integration, and user acceptance testing; experienced in assisting and demoing with clients
    • Prior experience with a 4G/5G network roll-out in a cloud-native environment
    Mode & year
    Telephonic
    Location
    Remote
    SDET
    1 position Dallas TX
    Skills Required
    Top Skills Details:
    1. Design and develop test automation scripts in C# (strong C# background)
    2. Experience with report testing and/or a data warehouse environment (working with ETL jobs, SQL Server, SSRS)
    3. SQL – understand what developers are doing and follow the logic
    Additional Skills & Qualifications: A key question to ask when checking references is how the candidate decided what to test on the front end vs. not test. They want someone that can make this decision, not just be told what to do. The most important trait they're looking for is someone that is passionate about learning more about their trade. They want people who are curious and push for improvement. Communication skills will be important in this position as well.
    • Execute a comprehensive automated regression test strategy
    • Report errors/failures to internal and external parties when needed
    • Reduce the number of defects found by customers and business users in production
    • Develop and recommend tools to improve quality of software and processes
    • Strong understanding of solid development principles – BDD, OOP
    • Experience in an Agile or SAFe Scrum environment
    • API testing
    Mode & year
    Telephonic
    Location
    Dallas TX
    Sr. Informatica Developer
    1 position Minneapolis MN
    Skills Required
    Focusing on Informatica (PowerCenter) development as the primary skill. The candidate has to have excellent SQL knowledge. Working experience on an SAP data migration project will be an added advantage.
    • Strong experience with Oracle SQL and PL/SQL (9.x, 10.x, 11.x) as a developer
    • Experience in ETL with Informatica
    • Experience with UNIX shell scripting
    • Excellent communication skills (oral and written)
    • Experience in an onsite/offshore model
    Mode & year
    Telephonic
    Location
    Minneapolis MN
    Senior Consultant
    1 position Seattle WA
    Skills Required
    Experience integrating Tealium iQ and EventStream. Heavy data stitching and mining experience. Ability to provide technical guidance.
    Mode & year
    Telephonic
    Location
    Seattle WA
    Hortonworks Administrator
    1 position Austin TX
    Skills Required
    Please find below a new requirement that requires Hortonworks. Seeking a talented, experienced Hortonworks Administrator for the Austin, TX location. Duration: Long Term. The work location would be Austin, TX, though it is currently remote.
    PURPOSE OF JOB: The Hortonworks Administrator provides day-to-day support and administration of the Hadoop infrastructure and associated ecosystem tools, as well as working to enhance and improve the infrastructure.
    JOB DUTIES
    • Responsible for implementation and ongoing administration of Hadoop infrastructure
    • Installation and configuration of patches and version upgrades
    • Cluster monitoring and troubleshooting
    • Performance tuning and cluster availability management
    • Resource management and monitoring
    • Design and maintenance of security models
    • Support the Hadoop development community as a subject matter expert
    • Work closely with architecture, network, server, storage, and application teams
    Minimum Requirements
    • Bachelor's degree or 4+ years of work experience in the Information Technology field
    • 4+ years of system administration experience in a Unix/Linux environment
    • 2+ years of experience in Hortonworks administration, Apache Hadoop, and Hive
    • Experience with HBase
    • Experience with Pig
    • Experience with Hortonworks
    Mode & year
    Telephonic
    Location
    Austin TX
    PIM Solutions Architect
    1 position Jersey City, NJ
    Skills Required
    Responsibilities
    • 10–15 years of experience in architecting data hubs, BI, and data integration
    • Specifically, experience in PIM / supply chain solutions
    • Provide architectural solutions/designs
    • Provide architectural assessments, strategies, and roadmaps for one or more technology domains
    • Develop proof-of-concept projects to validate architecture and solutions
    • Drive common vision, practices, and capabilities across DVC tracks
    • Engage with business stakeholders to understand required capabilities, integrating business knowledge with technical solutions
    • Ability to determine the most appropriate technical strategy and designs to meet business needs
    Work Planned
    • Ability to translate the data hub vision into solution options
    • Highlight the capabilities required and gaps in the current landscape
    • Ability to drive a flexible and scalable approach for the solution
    • Holistically look at the data solutions required for reporting and integration across the product, supplier, and supply-demand tracks
    Mode & year
    Telephonic
    Location
    Jersey City, NJ
    Full Stack Java Developer
    1 position Michigan
    Skills Required
    Looking for TRUE FULL STACK Java developers with significant recent experience with ANGULAR 5 or higher for at least the past 2 years. These are senior-level positions, and I cannot stress enough how important RECENT, SIGNIFICANT EXPERIENCE WITH ANGULAR 5 OR HIGHER is.
    Mode & year
    Telephonic
    Location
    Michigan
    C# SDET
    1 position Rhode Island
    Skills Required
    This is a long-term contract with a client in Rhode Island for a C# SDET. The position will be remote initially, but candidates will be required to relocate to Rhode Island and work on site once employees return to the office. The client will hire off of a WebEx interview. I need:
    - C# SDET (MUST have a C#-focused background, MUST be an SDET)
    - C# development experience
    - Experience creating APIs
    - Experience creating automation frameworks
    Mode & year
    Telephonic
    Location
    Rhode Island
    Salesforce Consultant
    1 position Remote
    Skills Required
    I have a long-term (1 year+) assignment in Boston for a Salesforce Consultant. Initially it is remote, but the consultant will need to relocate and work on site once employees return to the office. This needs to be someone who used to be a Salesforce Developer and has shifted to a BA/PM/functional-type role for the past few years. Please only send candidates who used to be Salesforce Developers but have made the shift to a functional role. DO NOT send just Salesforce developers, or just Salesforce BAs… it needs to specifically be someone who was a Salesforce developer but has moved into a functional/business-facing position in the past few years.
    Mode & year
    Telephonic
    Location
    Remote
    Data Architect - Snowflake
    1 position Plano, TX
    Skills Required
    Job Summary: As part of the Data Engineering team, you will be architecting and delivering highly scalable, high-performance data integration and transformation platforms. The solutions you will work on will include cloud, hybrid, and legacy environments that will require a broad and deep stack of data engineering skills. Must have leadership skills and should be able to guide and mentor the team. You will be using core cloud data warehouse tools, Hadoop, Spark, event streaming platforms, and other data management technologies. You will also engage in requirements and solution concept development, requiring strong analytic and communication skills.
    Skills and Qualifications:
    • Bachelor's degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems, or Engineering is required
    • Experience building high-performance, scalable distributed systems
    • Strong experience with the Snowflake database
    • AWS cloud experience (EC2, S3, Lambda, EMR, RDS, Redshift)
    • Experience in ETL and ELT workflow management
    • Familiarity with AWS data and analytics technologies such as Glue, Athena, Spectrum, and Data Pipeline
    • Experience building internal cloud-to-cloud integrations is ideal
    • Experience with streaming-related technologies (e.g., Spark Streaming) or message brokers like Kafka is a plus
    • Data management experience
    • Batch ETL tool experience (DataStage / Informatica / Talend)
    • Experience in developing, deploying, and supporting scalable, high-performance data pipelines (leveraging distributed data movement technologies and approaches, including but not limited to ETL and streaming ingestion and processing)
    • Experience with the Hadoop ecosystem (HDFS/S3, Hive, Spark)
    • Experience in software engineering, leveraging Java, Python, Scala, etc.
    • Advanced distributed schema and SQL development skills, including partitioning for performance of ingestion and consumption patterns
    • Experience with distributed NoSQL databases (Apache Cassandra, graph databases, document store databases)
    • Experience in the banking industry
    Mode & year
    Telephonic
    Location
    Plano, TX
    Senior Platform Software Engineer
    1 position Remote
    Skills Required
    Job Title: Senior Platform Software Engineer, Philanthropy Cloud. Contract Duration: 01/03/2021 to 01/02/2022. Remote.
    Description: Salesforce Philanthropy Cloud is a brand-new B2B2C social impact platform to engage an army of citizen philanthropists. We are seeking a passionate, hands-on senior backend engineer to help build from the ground up the next-generation marketplace and B2B2C platform for philanthropy for nonprofits, foundations, corporations and their employees, customers, citizen philanthropists, and other stakeholders, using best-of-breed technology, social, mobile, and AI. Successful candidates will possess extensive software development and delivery experience, having landed large and complex software programs, with a proven ability to innovate, lead by example, and code prolifically.
    Required Skills:
    • 5+ years of software development experience with prolific coding abilities
    • Expertise in JVM-based languages (Java, JavaScript, Kotlin); Kotlin is nice to have, not required
    • Experience in designing and developing GraphQL or REST API based services; gRPC would be beneficial, not required
    • Experience in Postgres or other SQL experience required
    • In-depth understanding of OOP, microservices design patterns, domain-driven design, data structures, algorithms, and concurrency
    • Experience building secure, distributed, scalable, high-performance, resilient systems
    • Proven track record in building products on big data systems such as Spark, Kafka, Storm, and Hadoop
    • Excellent communication and teamwork skills
    Nice-to-have Skills:
    • Experience in reactive frameworks such as Akka, RxJava, Vert.x
    • Experience building software on Heroku cloud infrastructure
    • Experience with Agile development methodologies and test-driven development
    • Experience with ElasticSearch
    Education:
    • BS, MS, or PhD in computer science or a related field, or equivalent work experience
    Mode & year
    Telephonic
    Location
    Remote
    Software Engineer
    1 position Remote
    Skills Required
    Senior Platform Software Engineer, Salesforce.org Payments. Contract Duration: 01/04/2021 to 01/03/2022 + potential extension. Remote.
    Description: Salesforce.org Elevate is a suite of integrated offerings for the nonprofit and education sectors that gives organizations a new way to convert visitors into committed advocates. It provides a seamless management interface to engage donors, volunteers, and alumni while streamlining the critical back-end functions of payments, CRM, and accounting. Elevate delivers all the security and scalability you've come to expect from Salesforce while considering industry best practices and providing the support you need to grow your fundraising program. We are seeking a passionate, hands-on senior backend engineer to help rapidly expand the features offered with our Payments service, a critical foundational service that underpins all of Elevate. Payments is built entirely on public cloud, utilizing serverless patterns and concepts to deliver a highly available, elastic set of APIs with a low cost of ownership. Successful candidates will possess extensive software development and delivery experience, having landed large and complex software programs, with a proven ability to innovate, lead by example, and code prolifically. This is a hands-on role with a high-performing team; expect great challenges and a great opportunity to grow.
    Required Skills:
    • 5+ years of software development experience with very strong coding abilities
    • Expertise in JVM-based languages (Java, Kotlin)
    • Experience in designing and developing REST API based services
    • Experience with public cloud infrastructure, specifically AWS, and even more specifically serverless (API Gateway, Lambda, DynamoDB, Streams/Kinesis)
    • In-depth understanding of OOP, microservices design patterns, domain-driven design, data structures, algorithms, and concurrency
    • Experience building secure, distributed, scalable, high-performance, resilient systems
    • Excellent communication and teamwork skills
    Nice-to-have Skills:
    • Fintech industry experience
    • Experience with ElasticSearch and Kibana (ELK)
    • DevSecOps experience
    Education:
    • BS, MS, or PhD in computer science or a related field, or equivalent work experience
    Mode & year
    Telephonic
    Location
    Remote
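The Payments service described above is built on AWS serverless components (API Gateway, Lambda, DynamoDB, Streams/Kinesis). As a rough illustration of that pattern — not Salesforce's actual code — a minimal API Gateway proxy-style Lambda handler in Python might look like the sketch below; the payload fields and validation rules are invented for the example:

```python
import json

def lambda_handler(event, context):
    """Minimal API Gateway -> Lambda sketch: validate a payment request
    and return an API Gateway proxy-style response dict."""
    try:
        body = json.loads(event.get("body") or "{}")
        amount = body["amount_cents"]
        if not isinstance(amount, int) or amount <= 0:
            raise ValueError("amount_cents must be a positive integer")
    except (KeyError, ValueError) as err:  # JSONDecodeError is a ValueError
        return {"statusCode": 400, "body": json.dumps({"error": str(err)})}
    # A real service would persist to DynamoDB / publish to Kinesis here.
    return {"statusCode": 201,
            "body": json.dumps({"status": "accepted", "amount_cents": amount})}

resp = lambda_handler({"body": '{"amount_cents": 500}'}, None)
```

Because the handler is a plain function taking an event dict, it can be unit tested locally without any AWS infrastructure, which is part of what makes the serverless style attractive for a small team.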
    React Node Specialist
    1 position Grand Rapids, MI
    Skills Required
    Our client is seeking a React Node Specialist. A day in the life of the React Node Specialist is never the same, but in this role you will be responsible for providing technical leadership in a specialized area of application systems design and user experience for complex company web and mobile based applications.
    • Responsible for application programming of moderately complex systems within the assigned functional area/systems; may be responsible for completion of a phase of a project
    • Design, code, test, debug, and implement systems, functions, and related applications necessary to meet business needs, working under limited supervision; work with cross-functional IT areas to coordinate development projects
    • May lead the research and fact finding to modify, develop, implement, and maintain moderately complex company applications/information systems; develop detailed functional specifications, process documents, and/or workflow diagrams for application programming
    • Assist with the development of project scope, objectives, and milestones to meet the assigned project definition and requirements; work with IT Project Managers, Business Analysts, and/or business users to provide detailed application/system information
    • Troubleshoot program issues and interface with the appropriate IT sub-departments and/or cross-functional business areas to create and provide detailed application/system information to resolve any issues
    • Ensure timely documentation of new programs or changes to existing programs
    • Maintain current knowledge of industry trends and best Information Technology practices; make recommendations to improve current programs and processes
    • May provide guidance and/or training to lesser-experienced Programmer Analysts
    • Participate in the on-call support rotation
    Requirements:
    • Bachelor's-level degree (BS in Information Technology) or related experience
    • Five (5) years' experience
    • Minimum of 3 years of JavaScript-based web application development experience; React, Node.js, MERN stack experience preferred
    • Build reusable components and libraries for front-end and back-end functions
    • Translate high-fidelity designs and wireframes into high-quality code
    • Rock-solid understanding of state management using Redux
    • Knowledge of functional programming
    • Ability to write well-documented, clean code
    • Experience with third-party dependencies and debugging dependency conflicts
    • Understanding of REST APIs, the document request model, and offline storage
    • Experience with automated testing suites like Jest
    • React.js experience preferred
    • CI/CD experience preferred
    • Plan for and work flexibly to deadlines
    • Identify options for potential solutions and assess them for both technical and business suitability
    • Proven source code tooling experience, Bitbucket preferred (ability to feature branch, merge, pull, push)
    • Work collaboratively with peers, performing code reviews and pull requests
    • Experience working in an agile team
    • Strong verbal and written communication skills; ability to communicate IT programming in a non-technical manner
    • Must have strong organizational, prioritization, analytical, and problem-solving skills
    • Must be detail oriented and have strong project/time management and research skills
    • Ability to interact with management
    Skills: Functional Programming, Node.js, REST APIs, Bitbucket, JavaScript, Continuous Integration and Continuous Delivery (CI/CD), Redux.js, MERN Stack, React.js
    Mode & year
    Telephonic
    Location
    Grand Rapids, MI
    IBM OMS Consultant
    1 position Remote
    Skills Required
    Job Title: IBM Sterling OMS Consultant. Duration: 6-12 months.
    Job Description:
    • 6-8 years of experience in IBM Sterling OMS
    • At least 5 years of experience in the software development life cycle
    • At least 5 years of experience in translating functional/non-functional requirements to system requirements
    • Experience in configuration and customization of Sterling OMS
    • Experience in working with the Sterling Web SOM & Web COM modules
    • Experience in creating technical design documents for Sterling OMS
    • Experience and understanding of production support and performance engineering
    • Technical skills: knowledge of Java, XML, XSLT
    • Ability to work in a team in a diverse/multiple-stakeholder environment; has client interfacing skills
    • Analytical skills
    • Experience and desire to work in a global delivery environment
    Must Have Skills: Sterling OMS; providing architectural solutions
    Nice to Have Skills: Sterling OMS; providing architectural solutions
    Detailed Job Description: In TSC we are executing 78 critical projects to help mitigate the current COVID situation and store closures. As part of this critical initiative, there is a requirement for an OMS Senior Technology Architect across all these tracks. This position is also critical for starting a new track. Minimum years of experience: 5+.
    Top 3 responsibilities you would expect the subcon to shoulder and execute: Sterling OMS; providing architectural solutions
    Mode & year
    Telephonic
    Location
    Remote

SAY HI TO US

USA

phone (425)533-9681

2018 156th Ave NE
Suite 100 Building F
Bellevue, WA 98007

India

phone 91-40-66638886

Satya Sadan
3rd Floor
Saifabad
Hyderabad, TS 500004

Denmark

phone (425)533-9681

Skolevej 7
2980 Kokkedal
Denmark