WHO WE ARE + WHAT WE DO

nFolks Data Solutions provides Information Management solutions, learning services, and staffing services to clients globally, from offices located in North America, Europe, and Asia. Our comprehensive approach to providing indispensable Data Solutions helps us deliver the best to our global clientele. As a market leader, we focus on Information Management solutions that align measurable business goals and objectives with information technology.

Aware of current market demands and requirements, we work towards utmost customer satisfaction. Our pragmatic outlook has resulted in venturing into new markets and witnessing steady growth over the years. We have an unparalleled system of services, support, and development, which has helped us successfully complete several end-to-end implementations of IBM InfoSphere Information Management projects.

Our expertise is in Architecture, Analysis, Design, Development, Administration, Training, Maintenance, and Support using IBM InfoSphere applications such as DataStage, QualityStage, Information Analyzer, Metadata Workbench, Business Glossary, and FastTrack. We lead the market in IBM InfoSphere solutions, with a focus on quality, price, and certified expertise. Building successful Data Solutions requires the best talent in the tech field; consequently, all our Architects, Team Leads, and Senior Developers are IBM Certified Solution Developers. Our IT personnel participate in company-organized education and training classes; combined with real-time experience, this allows us to offer our clients an entire team of experts.


What we offer to our customers

Information Management Consulting Services

Our certified solution developers deliver successful Data Solutions and specialize in end-to-end implementation of Information Management projects. We strive not only to meet our clients' requirements but to consistently exceed their expectations.


Training Services

We are preferred partners with training companies that provide learning services to global clients. Our presence throughout North America, Europe, and Asia aids us in achieving customer satisfaction.


Staffing Services

Conscious of our clients' demands, we aim to meet their requirements with our staffing services by providing high-caliber teams and IT professionals.


THE BEST BRANDS TRUST US

WE HELP OUR CLIENTS STAY AHEAD OF THE COMPETITION

We work with a wide range of clients, from top organizations to medium and small scale businesses.

OUR PEOPLE MAKE US GREAT

WHAT OUR CUSTOMERS SAY


nFolks Case Studies

Client: Louisiana Department of Corrections
Project: DB2 Migration from Legacy Systems

The Louisiana Department of Corrections implemented the DB2 migration by creating the DB2 Data Store as its first IT enterprise modernization. The department planned to extract the data from its existing Mainframe Mapper and Notes legacy databases and move it to a DB2 database as the basis for the enterprise "database of record," after going through the complete cycle of profiling, cleansing, and transforming the legacy data for web OLTP using the IBM Information Server suite of products, as well as query and reporting using the IBM Cognos BI tool.

The challenge for the client was extracting the data from very old, hand-written legacy databases such as Mapper and Notes for different database objects and tables, and applying rules on top of it to load the data into the DB2 database. All of this was targeted for completion within a limited budget and a fixed timeline. In addition, Information Server was deployed on a Windows operating system with limited resources, running on VMware, a configuration for which no vendor support model existed. Because these were legacy databases, parent/child relationships between tables were also not present; this check had to be made in the ETL code, and records that did not meet the requirements had to be captured and emailed to the data owners.
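The parent/child check described in this case can be sketched as a simple validation pass. This is a hypothetical Python illustration, not the actual ETL code; the field names and sample records are invented:

```python
def split_orphans(child_rows, parent_keys, fk_field):
    """Separate child records whose foreign key has no matching parent.

    Orphans are returned separately so they can be captured and, as in
    the case study, reported back to the data owners.
    """
    valid, orphans = [], []
    for row in child_rows:
        (valid if row.get(fk_field) in parent_keys else orphans).append(row)
    return valid, orphans

# Invented sample data: two child rows, one referencing a missing parent.
parents = {"P1", "P2"}
children = [
    {"id": 1, "parent_id": "P1"},
    {"id": 2, "parent_id": "P9"},  # no matching parent -> orphan
]
valid, orphans = split_orphans(children, parents, "parent_id")
print(len(valid), len(orphans))  # 1 1
```

In the actual project the orphan records were emailed to data owners; here they are simply returned so the caller can decide how to report them.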

nFolks Data Solutions took this on as a challenge and, using the IBM Information Server suite of tools provided by the client, devised several solutions. The team identified the correct ODBC drivers, since extracting data from the legacy databases was difficult with only a limited set of drivers offering partial support. nFolks used a combination of parallel and server jobs, because parallel or large extracts created many processes on the Windows server and would break the connection to the legacy databases. Scripts were written to monitor hung processes and clean them up frequently, since resources were limited. The data also contained Unicode and special characters, which had to be handled carefully so that the data retained its original form when loaded into the DB2 database.
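The hung-process cleanup mentioned above can be sketched as follows. This is a hedged illustration: the threshold, the process-list shape, and the function name are assumptions, and the real scripts ran against the Windows process table rather than an in-memory dict:

```python
import time

# Threshold is an assumption for illustration; the actual limits used
# on the client's Windows server are not public.
HANG_SECONDS = 30 * 60  # treat a job process as hung after 30 minutes


def find_hung(processes, now=None):
    """Return PIDs of monitored processes running longer than the threshold.

    `processes` maps pid -> start timestamp (epoch seconds), as a
    monitoring script might collect from the OS process list. A real
    cleanup script would then terminate the returned PIDs.
    """
    now = time.time() if now is None else now
    return [pid for pid, started in processes.items()
            if now - started > HANG_SECONDS]


procs = {101: 0, 102: 1000}          # invented sample: pid -> start time
print(find_hung(procs, now=2000))    # 101 has run 2000s > 1800s
```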

Programs and scripts written by nFolks Data Solutions made the process seamless and enabled on-time delivery of the project within the limited budget, addressing all issues and making the migration of the legacy databases to DB2 successful.

Client: Major Financial Institution in the US
Project: Upgrade of ProfitMax NextGen Solution

The ProfitMax NextGen objective was to calculate Profit and Loss metrics for the wholesale lines of business at various levels, such as Customer, Relationship, Officer, Accounting Unit (AU), and Organizational Unit (OU), by building a Profitability Data Warehouse. ProfitMax NextGen is a rewrite of the current legacy application, a customer and product profitability tool used by the Wholesale Banking division of the client to view customer and product profitability. An estimated 3,000 users would access the solution, which would enable them to view and measure profitability at the officer, AU, and product level.

The issue for the client was that the data warehouse modeling had been done with very poor practices, and no best practices had been put in place by the previous implementation partner. This led to a large number of issues, resulting in inaccurate data with missing relationships between target tables, unstable code, and affected reporting results.

nFolks Data Solutions, along with another implementation partner, was asked to fix the issues caused by the previous poor implementation. By deploying our best Solution Developers and Architects with the right strategy, we were able to fix both the code issues and the design issues, and developed a solution that generated granular reports, down to 20 levels, across the wholesale line of business services and products for the financial institution, thereby meeting the primary requirements. With the implementation, the client saved a significant amount of money and got an effective solution that provided streamlined data for improved decision making.

Case 3: Automotive Distributor in the US
Project: Migration from DataStage 7.0 to DataStage 8.7

The client is a distributor for a large Japanese automotive company in the US, operating 150+ dealerships in 5 states. The data migration project was one of the client's high-priority projects: migrating their systems from IBM InfoSphere DataStage 7.0 to IBM InfoSphere DataStage 8.7.

The client wanted to migrate the code from DataStage 7.0 to DataStage 8.7, but most of the jobs were server jobs using a few plug-ins that were no longer available in DataStage 8.7, and these needed to be converted to parallel jobs. An overall architecture assessment based on factors such as volume assessment, server assessment, topology design, resource estimation, and performance tuning was carried out before installing DataStage 8.7 and migrating the code from the lower version to the higher one.

nFolks Data Solutions performed the server assessment and upgraded the code from DataStage 7.0 to DataStage 8.7, converting all the server jobs whose plug-ins had been deprecated into parallel jobs. The migration was successful with minimal disruption to the client's operations. The implementation helped the client achieve business continuity by staying abreast of the latest data technologies.

Case 4: Major Airline in the US
Project: Cleanroom Project for Sales Data Analysis

The Cleanroom project was a sales merger project between two major US-based airlines, which were in the process of merging their operations, including sales data. The target for the implementation was to generate market shares of the two airlines with reference to other airlines, for improving sales.

Two major airlines in the US decided to merge. Prior to the completion of the merger, the airline companies were not allowed to run combined sales reports, so this work was outsourced to ZS Associates, where nFolks Data Solutions was asked to help with the ETL for loading sales data across different routes, sources, destinations, legs, seats, and other dimensions, compared against competing airlines.

nFolks Data Solutions used various data tools to assimilate data from the two airlines while adhering to legal requirements for data privacy prior to the merger. Following the solution provided by nFolks, the client was able to seamlessly generate market share reports for improved business planning for activities such as route optimization and sales planning.

Case 5: Major Power Utility in the US
Project: Smart Connect Data Warehouse Project for Consumer Generated Data

The Smart Connect Data Warehouse project was one of the client's high-priority projects, connecting a large number of data points from their customer meters. The Metadata Warehouse repository is one of the important modules of this project, implemented to generate different reports related to impact analysis, data lineage, and business lineage.

The client, as part of the Smart Connect project, wanted to implement the Metadata Warehouse repository across systems to perform impact analysis for any field changes, and to find data lineage and business lineage using IBM Business Glossary, Metadata Workbench, and other IBM foundation tools. The challenge was to integrate the solution with the large number of end-to-end systems, including vendor technologies such as SAP BusinessObjects reports, Erwin data models, Teradata, Oracle, DataStage code, stored procedures, extended mappings, and others.

Representing IBM at the client site, nFolks Data Solutions implemented the project by importing all the metadata assets using IMAM for Erwin data models, Teradata, Oracle, DataStage ETL jobs, and others. nFolks also created business categories, terms, and other business assets using Business Glossary, and mapped the IT assets to business assets in Business Glossary to generate data lineage, business lineage, and impact analysis using Metadata Workbench. Working closely with IBM Israel Labs to address product issues, we helped include the right features to provide more granular information for the client.

The client was able to perform impact analysis on any IT asset, such as a database column or an ETL job, generate end-to-end lineage from source systems to SAP BO reports, and generate business lineage. In addition, both functional and technical teams are able to use Business Glossary.
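The impact analysis described in this case amounts to computing downstream reachability over a lineage graph. Here is a minimal sketch under assumed inputs; the edges below are invented, and the actual solution used IBM Metadata Workbench over imported metadata, not hand-built dicts:

```python
from collections import deque


def impact(graph, asset):
    """Breadth-first walk of downstream lineage edges: everything
    reachable from `asset` is impacted by a change to it."""
    seen, queue = set(), deque([asset])
    while queue:
        node = queue.popleft()
        for dep in graph.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen


# Invented lineage: column -> ETL job -> warehouse table -> BO report
lineage = {
    "ORDERS.CUST_ID": ["job_load_orders"],
    "job_load_orders": ["DW.FACT_ORDERS"],
    "DW.FACT_ORDERS": ["BO_sales_report"],
}
print(sorted(impact(lineage, "ORDERS.CUST_ID")))
```

Changing the source column would impact the ETL job, the warehouse table, and the BO report downstream of it, which is exactly the question an impact-analysis report answers.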

Case 6: Major Retail Company in the US
Project: Data Migration from Disparate Systems to SAP ECC 6.5 System

The client was a holding company for popular retail clothing and accessories brands in the US, and was merging the brands under a single entity for better business management and decision making. The merger process was to be completed in two phases, and all the major brands in the portfolio were being migrated to upgraded systems.

The client wanted to merge all eight brands in the portfolio in two phases by migrating data from different sources such as SAP R/3 4.7, files, and DB2 to SAP ECC 6.5, so that the brands could be managed from one SAP ECC system of record.

Representing IBM at the client site, nFolks Data Solutions implemented the project by creating DataStage jobs with best practices and performance tuning steps to merge the data for the different brands from the SAP R/3 4.7 system to the SAP ECC 6.5 system. Following the implementation, the client was able to access all the brands' data from one SAP ECC 6.5 system, as opposed to individual systems of record for different brands.

Case 7: City of New York HRA
Project: DataStage Upgrade to DataStage 11.3

The City of New York HRA was upgrading its infrastructure from older versions of DataStage to DataStage 11.3. The upgrade was implemented by IBM, and as part of this very short engagement, nFolks worked closely with NYC HRA to install and configure IBM InfoSphere Information Server 11.3.


Case 8: Major Financial Institution in Australia
Project: Implementation of IBM MDM v11.6

Implementation of IBM MDM v11.6 with DSC, RESTful services, Bamboo, Maven, Oracle 12c, and Kafka for a major financial institution in Australia.

The implementation consisted of two parts: The Global Registry (TGR) and Australian Operational Customer Master (AOCM).

The Global Registry (TGR) was a combination of virtual MDM (Matching Hub, MH) and physical MDM (Operational Hub, OH). MH provides entity matching and linking based on the outcome of an algorithm defined with probabilistic matching, while OH persists the member data, enriching it with additional data such as hierarchies and relationships. Data changes for every member are communicated to downstream systems by triggering notification messages. nFolks designed and developed the 'maintain party service', 'publish derived record', 'data model', and 'algorithm' to accommodate entity attributes, and configured Bamboo jobs to automate the build, deploy, and test processes.
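The probabilistic matching in the Matching Hub can be illustrated with a toy weighted-agreement score. The weights, threshold, and fields below are invented for illustration only; IBM's virtual MDM matching algorithm is far more sophisticated (phonetic comparison, frequency-based weighting, bucketing, and so on):

```python
# Toy probabilistic matcher: weighted field agreement vs. a threshold.
# Weights and threshold are invented, NOT the values used in the project.
WEIGHTS = {"name": 0.5, "dob": 0.3, "phone": 0.2}
THRESHOLD = 0.7


def match_score(a, b):
    """Sum the weights of fields that are present and agree exactly."""
    return sum(w for field, w in WEIGHTS.items()
               if a.get(field) and a.get(field) == b.get(field))


def is_same_entity(a, b):
    """Link two member records when their score clears the threshold."""
    return match_score(a, b) >= THRESHOLD


rec1 = {"name": "JANE DOE", "dob": "1980-01-01", "phone": "555-1111"}
rec2 = {"name": "JANE DOE", "dob": "1980-01-01", "phone": "555-2222"}
print(is_same_entity(rec1, rec2))  # name+dob agree: 0.8 >= 0.7 -> True
```

A real matching hub derives such scores from comparison functions and population statistics rather than fixed weights, but the link/no-link decision against a threshold is the same basic shape.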

Australian Operational Customer Master (AOCM):

AOCM is a master-slave database that consolidates customer data from multiple data sources and serves the required data to downstream applications and systems. A data stewardship application based on the JavaServer Faces web framework was customized to integrate the new attributes, and is used for searching customers based on user-defined criteria. AOCM is capable of emitting event notifications for changes to 'party profile' and 'customer-to-account relationship' to downstream systems.

On behalf of IBM, nFolks designed and developed a standalone Java application, 'Missing Data Remediation', for remediating missing data in AOCM or in the source systems; nFolks also migrated the source control repository to Git and validated that the generated EAR works. In addition, nFolks upgraded AOCM from MDM v10.5 to MDM v11.5, including an Oracle migration from 11g to 12c. By ensuring that the AOCM MDM migration did not affect functionality, customer data, or web services, nFolks ensured business continuity for the client with minimal disruption.

WORK WITH US AND GROW YOUR CAREER

  • DataStage Admin
    Salary No Constraint 1 position Noida
    Skills Required
    6+ Years of Relevant Experience in Data Stage Administration
    Mode & year
    Contract To Hire
    Location
    Noida
  • PL/SQL Developer
    INR 6.5 lacs 2 positions Hyderabad/Bangalore/Kolkata
    Skills Required
    6+ years of Relevant experience in Pl/SQL development with ETL Experience
    Mode & year
    Contract To Hire
    Location
    Hyderabad/Bangalore/Kolkata
  • DB2 Admin
    Salary No Constraint 1 position Noida
    Skills Required
    6+ yrs of Relevant Experience in DB2 Administration
    Mode & year
    Contract To Hire
    Location
    Noida
  • QlikView Developer
    Salary No Constraint 2 positions Pune
    Skills Required
4+ years of Experience in QlikView and Qlik Sense
    Mode & year
    Contract To Hire
    Location
    Pune
  • Informatica Developer
    INR 11 Lacs 6 positions Hyderabad
    Skills Required
6+ years of Informatica Development
    Mode & year
    Contract To Hire
    Location
    Hyderabad
  • DataStage Developer
    INR 13 Lacs 6 positions Hyderabad
    Skills Required
6+ years of Relevant Experience in DataStage with Teradata
    Mode & year
    Contract To Hire
    Location
    Hyderabad
  • DB2 Developer
    INR 15 Lacs 6 positions Hyderabad
    Skills Required
    6+ Years Relevant experience in DB2 Development
    Mode & year
    Contract To Hire
    Location
    Hyderabad
  • Data Engineer with Azure
    DOE 1 position Remote/Philadelphia, PA
    Skills Required
    Data Engineer with Azure. Work location: Remote for this year and Philadelphia, PA post the current situation. Long Term.
    Skills and Qualifications:
    • Bachelor's Degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems or Engineering is required
    • 1+ year experience working with the Azure analytical stack
    • Deep experience with building analytical solutions on Azure SQL DW (or Azure Synapse)
    • Strong knowledge in Cost Allocations, Pricing and Profitability analysis is required for this position
    • Experience with Delta Lake is a nice to have, but experience with Azure Data Lake is required
    • Experience with Databricks is a nice to have
    • Experience building high-performance and scalable distributed systems
    • Continuous Data Movement / Streaming / Messaging: experience with related technologies, e.g. Spark streaming or other message brokers like MQ, is a plus
    • 3+ years' experience developing, deploying and supporting scalable and high-performance data pipelines (leveraging distributed data movement technologies and approaches, including but not limited to ETL and streaming ingestion and processing)
    • 3+ years' experience in software engineering, leveraging Java, Python, Scala, etc.
    Contact: arun@nfolksdata.com
    Mode & year
    Phone
    Location
    Remote/Philadelphia, PA
  • Data Engineer (AWS, Snowflake, Batch ETL tool)
    DOE 1 position San Antonio TX
    Skills Required
    Data Engineer (AWS, Snowflake, Batch ETL tool). Location: Remote (flexible)/TX after covid. Long Term.
    Skills and Qualifications:
    • Bachelor's Degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems or Engineering is required
    • 1+ year experience with the Snowflake database
    • AWS cloud experience (EC2, S3, Lambda, EMR, RDS, Redshift)
    • Experience in ETL and ELT workflow management
    • Familiarity with AWS Data and Analytics technologies such as Glue, Athena, Spectrum, Data Pipeline
    • Experience building internal cloud-to-cloud integrations is ideal
    • Experience with streaming-related technologies, e.g. Spark streaming or other message brokers like Kafka, is a plus
    • 3+ years of Data Management experience
    • 3+ years of batch ETL tool experience (DataStage / Informatica / Talend)
    • 3+ years' experience developing, deploying and supporting scalable and high-performance data pipelines (leveraging distributed data movement technologies and approaches, including but not limited to ETL and streaming ingestion and processing)
    • 2+ years' experience with the Hadoop ecosystem (HDFS/S3, Hive, Spark)
    • 2+ years' experience in software engineering, leveraging Java, Python, Scala, etc.
    • 2+ years' advanced distributed schema and SQL development skills, including partitioning for performance of ingestion and consumption patterns
    • 2+ years' experience with distributed NoSQL databases (Apache Cassandra, graph databases, document store databases)
    • Experience in the financial services, banking and/or insurance industries is a nice to have
    Contact: arun@nfolksdata.com
    Mode & year
    Phone
    Location
    San Antonio TX
  • System Engineer
    INR 11 positions Bangalore
    Skills Required
    Python Programming
    Ansible
    Linux
    Messaging Queue
    Docker & Kubernetes
    Job Description
    • Experience running RHEL infrastructure
    • Experience working with Ansible
    • Experience with Message Queues (Kafka preferred)
    • Experience with PostgreSQL Databases
    • Experience with Redis Caching
    • Experience with Docker deployments (Docker Swarm, Kubernetes, Openshift)
    • Experience with a version control system (GIT preferred)
    Mode & year
Contract to Hire
    Location
    Bangalore
  • Site Reliability Engineer
    INR 6 positions Bangalore
    Skills Required
    Scripting : Perl, Python,
    Repository : GITHUB
    AWS
    Ansible/CHEF
    SPLUNK, ELK
    Job Description
    Required Skills:
    · Minimum of 5 years’ experience in hands-on production administration of large system environment
    · Experience in establishing, following, and improving upon procedures within a mission critical environment
    · Must be efficient in writing scripts
    · Must be extremely comfortable using and navigating within a Linux environment
    · Must have the ability to do high level debugging and problem analysis by examining logs and running Unix commands
    · 2+ years' experience with GitHub, Perl and Python
    · Excellent written and verbal communication skills
    · Comfortable operating in fast paced environment
    · Understands how DNS works
    · 4+ years' experience in virtualization environments such as AWS / SoftLayer / Xen / VMware
    · 2+ years' experience with configuration management systems (Salt / Ansible / Chef)
    · 2+ years' experience using Splunk and/or ELK
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Automation Test Engineer
    INR 4 positions Bangalore
    Skills Required
    Python
    Shell Scripting
    CI/CD
    Docker/Kubernetes
    Cloud
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Golang Developer
    INR 3 positions Bangalore
    Skills Required
    goLang (Mandatory)
    Microservices Development and understanding of it (Mandatory)
    Expert in Kubernetes (Mandatory)
    Knowledge of IaaS, Docker, Helm
    Expert in implementing REST/HTTP/RAML/Swagger
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Control Plane Infrastructure
    INR 1 positions Bangalore
    Skills Required
    Required Technical and Professional Expertise
    A track record of building enterprise systems
    Strong debug skills; effective verbal and written communication skills; team oriented
    Minimum of 2 years' experience with Kubernetes deployments and Kubernetes administration
    Minimum of 5 years’ experience programming using GoLang, Python, or C++.
    Minimum of 5 years' experience with Linux operating systems
    Minimum of 5 years’ experience with Agile team project delivery practices
    Preferred Technical and Professional Experience
    2 years' experience with large scale software system deployments
    5 years' experience using container management technology such as Kubernetes and Docker
    Experience with IBM Cloud Platform
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Site Security Engineer
    INR 3 positions Bangalore
    Skills Required
    Areas of Focus:
    Process Development and Orchestration
    Integration with existing Business systems and technical platforms to operationalize security and compliance gaps.
    Exposure to IT Security Management processes and procedures
    Network knowledge regarding ACL and VLAN configuration management
    Enterprise Password Management solutions
    Orchestrating solutions to address security control gaps
    Relevant Work Experience:
    At least 2 years experience with Cloud based solutions
    Experience developing, implementing, and operating large-scale IaaS, ultra-highly available and highly secure cloud environments/services
    Supporting application deployments, building new systems and upgrading and patching existing ones.
    Operating the cloud infrastructure and services within our security and privacy guidelines and compliance needs
    Implementing logging, auditability, security, and monitoring features for cloud services and infrastructure assets.
    Minimum 3-5 years in systems administration/Software Engineering/DevOps, networking in a large environment.
    Computing: Strong programming experience C, C++, Java, Shell, Perl, GO, Ruby, PowerShell, ASP.net v4+ or Python
    Specific Knowledge: Software Engineering (Git, Jenkins), Networking (protocols, load balancing, troubleshooting), deployment & configuration management (Chef, Puppet, SaltStack, Ansible, or NPM), Linux (RHEL, SLES), Containers (Docker, LXC), Monitoring (Kibana, Elasticsearch), Cloud systems and Virtualization (CloudStack, OpenStack, AWS, EC2, Xen, or KVM),
    Database (Oracle, MySQL, MariaDB, Cassandra, S3, HBase, Hadoop, MongoDB, or CouchBase), Security tools (Nessus, Vault)
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • Test Developer
    INR 3 positions Bangalore
    Skills Required
    Python Programming
    Django
    Mongodb
    HTML
    CSS
JavaScript
    Mode & year
    Contract to Hire
    Location
    Bangalore
  • IBM Data Governance
    2 positions San Antonio TX
    Skills Required
    Responsibilities:
    • Provide governance solution architecture and consultancy around operational principles, solution design, tool selection, integration, and administration
    • Drive new initiatives for the Governance Playbook
    • Develop a framework for operationalizing data governance across business and IT landscapes, for different phases of a project
    • Build prototypes and integrated demos for metadata exchange to evaluate tool capabilities for supporting the metadata architecture
    • Help define the Data Quality management methodology, tools and principles
    • Provide central governance, data asset collection and administration functions for the production Information Governance platform
    • Evangelize governance and enterprise governance across business verticals and build trust with business and IT stakeholders
    • Help verticals centrally author, store, steward, and serve familiar language on the central production governance server
    • Set up processes for cataloging technical assets of the analytical and operational stores
    • 10+ years of experience in Information Management including data design, ETL, data quality and metadata
    • Experience in developing Data Governance and Data Management strategies and road maps, including experience defining, implementing, and operating data governance bodies, including typical data governance policies, processes, and standards
    • Knowledge of industry-leading data quality, master data management and data protection management practices
    • Knowledge of data-related government regulatory requirements and emerging trends and issues
    • Demonstrated consulting skills, with change management concepts and strategies, including communication, culture change and performance measurement system design
    • Knowledge of risk data architecture and technology solutions
    • Strong technical experience with IBM's IGC
    Mode & year
    Telephonic
    Location
    San Antonio TX
  • Power BI Analyst
    1 position Grand Rapids, MI
    Skills Required
    Our client distributes grocery products to independent and chain retailers in 50 states and its own corporate-owned retail stores throughout the Midwest, in addition to fresh food processing and distribution. Through its MDV military division, it is a leading distributor of grocery products to U.S. military commissaries. The client's Business Intelligence team is growing with new technology and is looking to add a Sr. Business Intelligence Analyst.
    Preferred Skills:
    • Willingness and flexibility to work with a team that is rebranding its Corporate Business Intelligence offering
    • Strong business acumen
    • SQL
    • Data warehouse
    • Retail industry
    • Business Intelligence - report development
    • Tableau
    • ETL - good understanding
    Ideal Skills:
    • BI reporting
    • MS Power BI - 3-5 years
    • Data warehouse
    • Snowflake/Cloud/MS Azure
    • DevOps
    Mode & year
    Telephonic
    Location
    Grand Rapids, MI
  • Sr. Oracle Developer with Informatica
    1 position Minneapolis
    Skills Required
    We have an urgent requirement for a very strong SQL resource. Full JD below:
    • Strong experience with Oracle SQL and PL/SQL (9.x, 10.x, 11.x) as a developer
    • Experience on ETL with Informatica
    • Experience with UNIX shell scripting
    • Excellent communication skills (oral and written)
    • Experience in an onsite/offshore model
    Good to have:
    • Experience working with Tivoli Workload Scheduler is preferable
    • Experience in conversion projects is preferable
    • Experience working with SAP modules (ECC, CRM or GTS) as source or target is preferable
    • Develops good quality code with minimum maintenance
    • Strong architecture (design) and technical skills
    • Ability to translate business needs into technical solutions
    • Excellent analytical and problem solving skills
    • Should work with multiple teams, including BI, data cleansing etc., as needed
    • Should have good documentation skills, including creating SOPs, conversion execution documents, and technical design documents as needed
    Mode & year
    Telephonic
    Location
    Minneapolis
  • DataStage Developer
    1 position MN or GR-Remote
    Skills Required
    Cloud ETL tool experience needed; Talend highly preferred. Our client distributes grocery products to independent and chain retailers in 50 states and its own corporate-owned retail stores throughout the Midwest, in addition to fresh food processing and distribution. Through its MDV military division, it is a leading distributor of grocery products to U.S. military commissaries.
    • 5 years' experience with DataStage V9.1 and 11.7
    Must haves:
    • DataStage conversion from 8 to 9 & 11 - converting all systems
    • Experience working with interfaces
    • Prefer someone to work in MN; remote could be an option with strong communication skills
    • The Hierarchical stage in DataStage is used to parse or compose XML (Extensible Markup Language) and JSON data. This stage was introduced in version 11.3. When we have huge amounts of data to work with, the Hierarchical stage is preferred over XML packs
    • Strong verbal and written communication skills; ability to communicate IT programming in a non-technical manner
    • Must have strong organizational, prioritization, analytical and problem solving skills
    • Must be detail oriented and have strong project/time management and research skills; working knowledge of MS Office, MS Project and Visio
    • Must have strong knowledge of DataStage (ETL tool) with a background in development of DataStage solutions
    • Strong relational database and SQL skills required, with MS SQL Server preferred; an understanding of DB2 or Informix is a plus
    • Experience working with transportation application software a strong plus, with knowledge of routing (JDA/Manugistics) and on-board computing (Cadec) applications preferred
    Mode & year
    Telephonic
    Location
    MN or GR-Remote
  • Medical Device Software Design Engineer
    1 position Remote
    Skills Required
    MAJOR RESPONSIBILITIES:
    • Strong C/C++ coding experience in an embedded/RTOS environment
    • Port ACM interface with cellular modem to ECM
    • Contributing to software design decisions
    • Works independently with wide latitude for independent judgment
    • Working closely with systems and software engineers to define interfaces / generating software requirements
    • Working closely with hardware engineers to define platforms and board bring-up
    • Working closely with the software architect on high-level software design
    • Documenting software detailed design
    • Implementing software and unit tests according to design documentation
    • Supporting software integration on the hardware
    • Troubleshooting systems integration and software
    • Ensuring deliverables and work products adhere to the quality system
    BASIC QUALIFICATIONS:
    • Bachelor's Degree in an applicable engineering discipline
    • 8 years applicable experience
    • Strong C/C++ coding experience in an embedded/RTOS environment
    • Experience with cellular communications, preferably ECM
    • Experience with Linux, VxWorks, Nucleus or other embedded operating systems
    • Ability to work in a regulated environment
    PREFERRED/DESIRED QUALIFICATIONS:
    • Previous mentoring or lead experience
    • Demonstrated ability to determine and meet project objectives
    • Experience with one or more POSIX-compliant real-time operating systems
    • Hands-on experience with ARM processors (e.g. OMAP, i.MX) and microcontrollers (e.g. MSP430, ARM, PIC)
    • Demonstrated understanding of and ability to follow good software development processes
    • Able to work well in an open team environment and with outsource partners
    WORKING CONDITIONS: Office environment, extended hours as needed; currently working remotely. No travel.
    Mode & year: Telephonic
    Location: Remote
    Lead Cloud Engineer (Azure)
    1 position, Remote
    Skills Required
    Our client is a nationally recognized not-for-profit health system, offering a full continuum of health services through their health plan, medical group, and hospital group. Our client is looking for a Lead Cloud Engineer (Azure). Strong chance to extend; if someone is local, we could look at a full-time hire in the future. Remote work; travel NOT expected. No subs. This person will help us drive the future state of Cloud for the organization and must be a very strong candidate. • Ability to administer and manage resources and identities in the Azure cloud platform • Ability in DevOps engineering for cloud resources and data science applications • Ability to set up and troubleshoot data science development toolkits such as Git, Airflow, Docker, and PyCharm • Proficiency in the following languages: Python, shell scripting, SQL, and Spark • Familiarity with data architecture, engineering, and governance concepts • Familiarity with machine learning concepts Qualifications – Required • 7+ years designing and administering cloud infrastructure (Azure required) • 5+ years providing DevOps support for cloud platforms and data science applications Qualifications – Preferred • Experience in building big data pipelines using Spark • Experience in ETL development and data warehousing • Experience in machine learning • Experience in Airflow and Docker • Experience in the healthcare industry
    Mode & year: Telephonic
    Location: Remote
    Sr. AEM Developer
    1 position, Grand Rapids, MI
    Skills Required
    Key Responsibilities: • Designing and developing web applications using the Adobe platform, including guidance of site structure, components, templates, workflows, dialogs, object model designs (Java APIs), and unit testing using AEM architecture (CRX, OSGi, JCR) • Set up and configure AEM authoring, publish, and dispatcher environments with Adobe-recommended best practices • Integrate AEM with other marketing products like Assets, Target, Campaign, and other internal endpoints • Work closely with the vendor partner to ensure sound practices regarding site architecture, performance and reliability, and content delivery are in place • Work in the SAFe agile development methodology • Work with front-end technologies and frameworks • Follow best practices for secure web programming and deployment; adhere to internal best practices with respect to coding standards, unit test coverage, automation, and continuous integration • Develop alongside an offshore team of AEM developers Required Skills & Experience • Adobe AEM developer with a strong Java/J2EE background in both front-end web design (React.js) and AEM integration • 3+ years of AEM 6.x/CQ5 experience • Sound understanding of all AEM building blocks including templates, components, dialogs, widgets, social components, etc., and the code build and deployment process • 2+ years of UI development experience with ES6 JavaScript and CSS preprocessors (LESS, SASS) • 2+ years of strong web content management experience with Adobe AEM/CQ5 • 3+ years of Java development and familiarity with frameworks such as OSGi • Experience developing reusable AEM components for authoring content, reusable code libraries, unit testing, automation, and code walkthroughs • The ability to present technical concepts to technical and non-technical internal/external stakeholders • Ability to write clean, modular, reusable code (using design patterns) and experience with a unit-test-driven approach to development
    Mode & year: Telephonic
    Location: Grand Rapids, MI
    OFSAA Data Architecture
    2 positions, San Antonio, TX
    Skills Required
    Required: • Proficient in OFSAA Data Architecture • Knowledge of Data Modeling using Erwin tool, Data Migration Activities and Slowly Changing Dimension (SCD) Component • Extensive knowledge in creating Oracle Packages, Procedures, Functions, Views, Triggers and Queries using Oracle SQL-PL/SQL • Core Java knowledge • Database design, Performance tuning, development and integration using Oracle • UI experience with OFSAA forms framework, HTML, JSP and CSS3 • Experience with jQuery and Angular JS • Experience with WebSphere or Tomcat • Experience with Maven and Jenkins • Experience with agile project methodology • Understanding of OFSAA Reconciliation Framework • OFSAA metadata management framework (Hierarchies, datasets)
    Mode & year: Telephonic
    Location: San Antonio, TX
    Adobe Developer
    1 position, Remote
    Skills Required
    Our client, a family-owned Midwestern grocery/retailer that strives to better the lives of people in all communities, is in need of an Adobe Developer. Adobe Developer (all Marketing Cloud products) • HTML, XML, CSS, and JavaScript • Experience in developing workflows, content blocks, and custom resources in Adobe Campaign • Proficiency in SQL and ETL development • Knowledge of AEM, Target, and Audience Manager products and their integration with Adobe Campaign • Designing solutions and writing technical documentation for a commercial website
    Mode & year: Telephonic
    Location: Remote
    Sr. Java Developer
    1 position, Dallas, TX
    Skills Required
    We are looking for candidates that have previous airline experience and preferably are local to Dallas. They just need to be very strong in Java and communication, plus good job tenure. Airline experience is required. Sr. Java Developer, Dallas, TX, Long Term. Top Skills: • Java • Spring • Akka Description The mission of the Sr. Java Developer is to develop and deliver software in the Operational Data Store, APIs, and other development work within the Aircraft MX organization. This is a large development team that will do some paired programming in an open SAFe development environment. This team also partners with the development teams within the train to deploy these applications into AWS. What you will do: • Analyze, design, and program applications, APIs, and integration code as it pertains to the MX applications and the ODS product • Implement functionality within a well-integrated application system in accordance with the partner requirements, organizational methodologies, and standards • Partner with the Product Owner and SCM to ensure the work package is delivered on time and on scope • Respond to production problems and implement immediate resolution efforts with the team and dev teams • Mentor less-experienced developers on this team and other teams • Explore and provide feedback for various technologies Must Haves: • 6+ years of designing, developing, and implementing Java-based software in an Agile team environment • Experience consuming and creating RESTful and micro web services • Experience developing and deploying code into AWS using cloud-native tools and processes • Experience with XML, XSLT, and messaging • Experience with Jenkins
    Mode & year: Telephonic
    Location: Dallas, TX
    Lead MS Dynamics 365 Developer
    1 position, Rhode Island
    Skills Required
    Need a very senior-level Lead MS Dynamics 365 Developer (10+ years' experience) for a long-term contract. MS Dynamics 365 senior LEAD developer with experience migrating from on-prem to 365. Experience with C# for plugin/workflow activity development, as well as JavaScript for Xrm (specific to CRM) development. Experience with SSIS and KingswaySoft; custom SSRS report design/development using FetchXML would be nice to have.
    Mode & year: Telephonic
    Location: Rhode Island
    API Architect
    1 position, Benton Harbor, MI 49022
    Skills Required
    Job Description Our client is seeking a smart, driven, and talented individual who is not only a thought leader in the API space but can also balance the diversity of skills a fast-paced consulting environment requires. This resource will act principally as an API architect with oversight for functional/technical and developer resources in our client's Go-To-Market space. Responsibilities include a combination of the following: API Functional and Technical Architect (70%) • Lead API strategy/roadmap initiatives while aligning to business objectives • Work with cross-functional partners to define the functional and technical API roadmap, strategy, and solution, evolving from SOA to APIs driven by business requirements • Design, implement, test, and deploy APIs using the latest technologies and best practices • Implement API management software (like Apigee or others) to include API proxies, mashups, rate limiting, security, analytics, monetization, and developer portals • Includes helping establish API organizations, API development methodology, and selection of API technology components including the API gateway • Manage the project, client and vendor relationships, the team, and finances • Lead/support engagement and interface directly with Trade customers • Agile project delivery • Communicate risks, issues, challenges, and status/progress of the project • Document requirements, design, architecture, and test scripts • Participate in thought leadership and vendor relationships • Lead or assist in responding to RFPs/proposals • Other duties as assigned API Developer (30%) • Assist technical developers and solution architects in understanding and adopting the API roadmap, strategy, and solutions; drive the evolution from SOA to APIs and help build APIs • Participate in and/or lead a team of developers in delivering large and complex solutions • Design, implement, test, and deploy APIs using the latest technologies and best practices • Implement API management using API management software (like Apigee or others) to include API proxies, mashups, rate limiting, security, analytics, monetization, and developer portals • Document requirements, design, architecture, and test scripts • Other duties as assigned Requirements • 3-5 years of experience with API management/gateway software such as WSO2, Apigee, Layer 7, Mashery, or equivalent, with knowledge of API mashups, analytics, developer networks, monetization models, and security • 7+ years of development experience using Java or .NET, XML, JSP or ASP, JavaScript or VBScript, and Node.js; WSDL/SOAP/XML and REST/JSON, Swagger or equivalent • 2-3 years of project management and technical architecture experience in designing integration architecture for Trade Partner/Retailer/Builder/Marketing/DCE platforms • Ability to work in a virtual team across distances, cultures, and time zones, in a matrix with multiple reporting lines, with on-shore and off-shore resources; the team may extend outside the Company organization including suppliers, partners, and trade customers • Experience with the HTTP/HTTPS protocol, web services, and use of web servers • Knowledge of API design and implementation and system integration using APIs • Experience working in the cloud (iPaaS/SaaS) • Experience managing cross-functional teams, business partners, vendors, and related finances • Reviews system test plans and can perform robust system tests before implementation • May require up to 10% travel and face-to-face interaction with Trade customers and their technical resources/teams • Willing to relocate to Benton Harbor, Michigan Preferred Qualifications: • Experience with or knowledge of the SAP PI/PO platform and SAP Cloud is a plus • Knowledge of designing relational databases and SQL scripting • Knowledge of Cassandra, Hadoop/MapReduce, and MongoDB is a plus • Knowledge of implementing security and identity protocols such as OpenID, TLS, OAuth, SAML, SSO, and PKI (encryption and key management) • Knowledge of EDI, ALE, and IDoc is helpful • Knowledge of SAP Payment Card Processing and SAP interfaces to the Paymetric Adaptor Server (PAS) • Experience with product development, manufacturing, and supply chain concepts (e.g. KANBAN, LEAN)
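One of the API-gateway concerns listed above, rate limiting, can be illustrated with a minimal token-bucket sketch. The limits below are hypothetical, and real gateways such as Apigee configure this via policies rather than hand-written code:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: allows `rate` requests per second,
    with bursts of up to `capacity` requests."""
    def __init__(self, rate, capacity):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=2)  # hypothetical limit: 5 req/s, burst of 2
results = [bucket.allow() for _ in range(3)]
print(results)  # first two requests pass, the third is throttled: [True, True, False]
```

A gateway applies the same idea per API key or per consumer app, typically returning HTTP 429 when `allow()` would be false.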
    Mode & year: Telephonic
    Location: Benton Harbor, MI 49022
    Test Lead
    1 position, Remote
    Skills Required
    Test Lead, Remote (some travel to Denver, CO on a need basis), long-term contract. For this role, telecom and wireless network roll-out experience are key, so generic test manager candidates will not be a good fit. If candidates do not have telecom and network experience, they will not be considered. The test lead must have the following skills: • 5+ years of experience in the telecommunications industry • Experience developing and designing test strategies and test plans; can drive overall testing execution with the client • Experience executing and driving Unit, Integration, and User Acceptance Testing, and experience assisting with and running demos for clients • Prior experience with a 4G/5G network roll-out in a cloud-native environment
    Mode & year: Telephonic
    Location: Remote
    IBM Streams Project
    1 position, Remote
    Skills Required
    Candidate needs these skills: - Strong Streams SPL programming skills - Strong Streams C++ native operator programming skills - Expert Linux/bash skills - Strong understanding of Streams application performance concerns, considerations, etc. - Strong understanding of C/C++/Linux/x86_64 performance concerns, considerations, etc. - Experience with the streamsx.network toolkit and/or DPDK libraries a definite plus
    Mode & year: Telephonic
    Location: Remote
    SDET
    1 position, Dallas, TX
    Skills Required
    Top Skills Details 1. Design and develop test automation scripts in C# (strong C# background) 2. Experience with report testing and/or a data warehouse environment (working with ETL jobs, SQL Server, SSRS) 3. SQL - understand what developers are doing and follow the logic Additional Skills & Qualifications A key question to ask when checking references is how they decided what to test on the front end versus what not to test. They want someone who can make this decision, not just be told what to do. The most important trait they're looking for is someone who is passionate about learning more about their trade. They want people who are curious and push for improvement. Communication skills will be important in this position as well. • Execute a comprehensive automated regression test strategy • Report errors/failures to internal and external parties when needed • Reduce the number of defects found by customers and business users in production • Develop and recommend tools to improve the quality of software and processes • Strong understanding of solid development principles – BDD, OOP • Experience in an Agile or SAFe scrum environment • API testing
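The automated regression testing this role centers on can be made concrete with a tiny assertion-style test. A Python sketch (the posting's actual stack is C#/SSRS, so this is only an illustration of the pattern, with a hypothetical report-totaling function standing in for real report logic):

```python
def report_total(line_items):
    """Hypothetical report logic under test: sum extended prices
    (quantity * unit price), ignoring voided lines."""
    return sum(qty * price for qty, price, voided in line_items if not voided)

def test_report_total():
    # Regression cases pinned down so an ETL or report change cannot silently break them.
    assert report_total([]) == 0
    assert report_total([(2, 3.0, False)]) == 6.0
    assert report_total([(2, 3.0, False), (5, 1.0, True)]) == 6.0  # voided line excluded

test_report_total()
print("all regression checks passed")
```

In a real SDET role these checks would run in a test framework on every build, and the judgment the posting asks about is choosing which behaviors are worth pinning down this way.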
    Mode & year: Telephonic
    Location: Dallas, TX

SAY HI TO US

USA

phone (425)533-9681

2018 156th Ave NE
Suite 100 Building F
Bellevue, WA 98007

India

phone +91-40-66638886

Satya Sadan
3rd Floor
Saifabad
Hyderabad, TS 500004

Denmark

phone (425)533-9681

Skolevej 7
2980 Kokkedal
Denmark