Name : Akashdeep
Technology: Data Scientist / Machine Learning Engineer
Ready to go onsite
Professional Summary:
• Data Scientist / Machine Learning Engineer with 6+ years of experience in handling structured and unstructured data, writing advanced SQL queries, Data Wrangling, Data Visualization, Data Acquisition, Predictive Modeling, Probabilistic Graphical Models, Inferential Statistics, and Data Validation.
• Experience in building robust Machine Learning, Deep Learning and Natural Language Processing Models.
• Expertise in Statistical analysis, Text mining, Supervised learning, Unsupervised Learning, and Reinforcement learning.
• Act as a liaison between client and technical solutions/support groups, using advanced communication skills to elicit, document, analyze and validate client requirements.
• Good knowledge of data analysis, visualization using Power BI and SQL.
• Working knowledge of different relational databases (MS SQL server and PostgreSQL).
• Experienced in Data Analysis, Data Validation, Data cleaning, Data Verification and Data Mismatch identification.
• Experience in gathering business requirements from business users, creating application design including database table design.
• Experience in writing SQL queries, Stored Procedures and Triggers using PostgreSQL and SQL Server.
• Expertise in data visualization, developing Dashboards, Charts, Graphs, and reports using Power BI and Tableau.
• Experience with supervised and unsupervised techniques such as Regression, Classification, and Clustering across Machine Learning (ML) and Deep Learning (DL).
• Experience with Python libraries Pandas, NumPy, Seaborn, Matplotlib, NLTK, spaCy, Scikit-learn, Keras and TensorFlow in developing end-to-end Analytics and ML/DNN models.
• Experience in plotting visuals, building dashboards and storytelling using Tableau, AWS QuickSight, Matplotlib, Seaborn, Plotly and Power BI.
• Strong mathematical background in Linear algebra, Probability, Statistics, Differentiation and Integration.
• Experience working in the AWS environment using S3, Athena, Lambda, Amazon SageMaker, AWS Lex, Amazon Aurora, QuickSight, CloudFormation, CloudWatch, IAM, Glacier, EC2, EMR, Rekognition and API Gateway.
• Proficient in Machine Learning techniques (Decision Trees, Linear/Logistic Regressors, Random Forest, SVM, Bayesian, XGBoost, K-Nearest Neighbors) and Statistical Modeling in Forecasting/ Predictive Analytics, Segmentation methodologies, Regression based models, Hypothesis testing, Factor analysis/ PCA, Ensembles.
• Strong experience in Text Mining: cleaning and manipulating text and performing sentiment analysis on text-mined data.
• Experience in designing Star schema, Snowflake schema for Data Warehouse, ODS architecture.
• Having a deep understanding of state-of-the-art machine learning and deep learning algorithms, techniques and best practices.
• Hands-on experience with Azure Cloud and serving ML models as APIs via Flask.
• Good industry knowledge, analytical & problem-solving skills and ability to work well within a team as well as an individual.
• Expertise in transforming business requirements into analytical models and designs.
• Provide support on production issues, including troubleshooting, coordination with IT, and end-user communication related to data issues.
• Excellent analytical and communication skills and the ability to work effectively in a team.
• Strategic planning to improve organizational efficiency.
• Self-starter, able to handle multiple tasks based on priorities.
TECHNICAL SKILLS:
• Languages: SQL, Python, R, C++
• Tools: Tableau, Power BI, MS Excel, Jupyter Notebook, Google Colab, RStudio, AWS, GCP
• Analytics: Data Cleansing, Data Mining, Data Wrangling, Data Analysis, EDA, PCA
• Python libraries: PyTorch, NLTK, OpenCV, TensorFlow, Selenium, Pandas, NumPy, Scikit-learn, bs4
• Statistical Modelling: Regression, Classification, A/B Testing, Clustering, Segmentation.
• Data Visualization: AWS QuickSight, Tableau, MS Power BI, Seaborn, Plotly, ggplot2, RShiny.
• ML algorithms: Reinforcement Learning, Decision Tree, Random Forest, Regression models, Naïve Bayes, KNN, K-means, SVM, Clustering algorithms, XGBoost
• Deep Learning: Neural Networks, RNN, LSTM, NLP, Recommender Systems
• IDE: PyCharm, Visual Studio, Microsoft SQL Server Management Studio
• Databases and Tools: SQL Server, PostgreSQL, Redshift, Teradata, Snowflake, Hive, Kafka.
• Operating Systems: Windows, Mac, UNIX, LINUX
PROFESSIONAL EXPERIENCE:
Client: Maximus, TX Mar 2020 – Present
Role: Data Scientist/ML Engineer
Responsibilities:
• Performed data manipulation, data preparation, normalization, and predictive modeling; improved efficiency and accuracy by evaluating models in Python and R.
• Analyzed and solved business problems and found patterns and insights within structured and unstructured data.
• Bulk-loaded data from the external stage (AWS S3) and the internal stage into Snowflake cloud using the COPY command.
• Loaded data into Snowflake tables from the internal stage using SnowSQL.
• Wrote complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
• Built Customer Lifetime Value prediction model using XGBoost gradient boosting algorithm on customer attributes like customer demographics, tenure, age, revenue, retirement plans etc.
• Predicted the probability of customer loan default by building a robust Artificial Neural Network classifier in the same ML pipeline as the LTV model, which helped detect churn in advance.
• Performed spatial analysis and spatial clustering using Python and visualized the findings as geographical clusters and heat maps using Plotly and Bokeh.
• Explored and analyzed customer-specific features using Matplotlib, Seaborn, and ggplot in Python and built dashboards in Tableau.
• Conducted Data blending, Data preparation using SQL for tableau consumption and publishing data sources to Tableau server.
• Used Snowpipe for continuous data ingestion from the S3 bucket.
• Performed data wrangling, data imputation and EDA using pandas, NumPy, Sklearn and Matplotlib in Python.
• Implemented classification algorithms such as Logistic Regression, K-Nearest Neighbors and Random Forests to predict customer churn.
• Experimented with and built predictive models, including ensemble methods such as Gradient Boosted Trees and Neural Networks in Keras, to predict insurance rates.
• Hands-on experience with Big Data tools like Hive, Apache Flume and Kafka.
• Addressed overfitting and underfitting by tuning the hyperparameters of the machine learning algorithms using Lasso and Ridge regularization.
• Experimented with Ensemble methods to increase the accuracy of the training model with different Bagging and Boosting methods.
• Developed data collection processes and maintained data management systems to preserve data integrity.
• Loaded data from Hadoop and made it available for modeling in Keras.
• Transformed raw data into MySQL with custom-made ETL application to prepare unruly data for machine learning.
Environment: Power BI, Python, SQL Server, EDA, pandas, NumPy, Sklearn, Matplotlib, Seaborn, GGplot, Keras, Snowflake.
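The churn-prediction bullets above can be sketched as follows. This is a minimal illustration on synthetic data (the actual customer attributes and dataset are proprietary and not in this document), showing a Logistic Regression baseline with L2 (ridge-style) regularization alongside a Random Forest ensemble in scikit-learn:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for customer features (tenure, age, revenue, ...);
# make_classification is used because the real data is not available here.
X, y = make_classification(n_samples=1000, n_features=10, n_informative=6,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# Logistic regression with L2 (ridge-style) regularization to curb
# overfitting; C is the inverse regularization strength.
logreg = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
logreg.fit(X_train, y_train)

# Random forest as a nonlinear ensemble baseline.
forest = RandomForestClassifier(n_estimators=100, random_state=42)
forest.fit(X_train, y_train)

print("logreg accuracy:", accuracy_score(y_test, logreg.predict(X_test)))
print("forest accuracy:", accuracy_score(y_test, forest.predict(X_test)))
```

In practice the regularization strength (C here, or alpha for Lasso/Ridge regressors) would be tuned via cross-validation rather than fixed.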
Name: Ram
Technology: DevOps Engineer
Ready to work hybrid
Professional Summary:
Overview:
DevOps Engineer with 6 years of overall hands-on experience in and understanding of DevOps methodology and workflow: Continuous Integration/Continuous Delivery (CI/CD)-oriented build and release of code, solution-based IT services for Linux systems, containerization, configuration management, cloud services, and system administration.
Expertise:
• Worked extensively with Hudson and Jenkins on build and deployment processes for end-to-end automation and Continuous Integration.
• In-depth Knowledge and best practices of Software Development Life Cycle (SDLC) & Software Configuration Management (SCM) in Agile, Scrum & Waterfall methodologies.
• Experienced in the installation, configuration, and upgrading of RHEL 5.x/6.x/7.x, CentOS 5.x/6.x/7.x, Ubuntu, and Debian using a Kickstart server.
• Experienced in implementing monitoring and automation solutions with Ansible, Terraform, Docker, and Jenkins.
• Experienced in using Windows and Linux environments for creating branches, rebasing, reverting, merging, tagging, and maintaining versions across environments using SCM tools like Git.
• Involved in designing and deploying a multitude of cloud services on AWS stack such as EC2, Route53, S3, RDS, DynamoDB, and IAM, while focusing on high availability, fault tolerance, and auto-scaling.
• Expertise in setting up Kubernetes (k8s) clusters for running microservices and deploying them into a Production environment.
• Expertise in Terraform for building, changing, and versioning infrastructure and collaborating on the automation of AWS Infrastructure via Terraform and Jenkins.
• Worked on VAULT, a Secret Manager tool for storing Secrets, Key-Value pairs, and other security parameters.
• Good knowledge of querying RDBMSs such as Oracle, MySQL, and SQL Server using SQL for data integrity.
• Provided expertise in Kafka brokers, ZooKeeper, KSQL, Kafka Streams (KStream), and Kafka Control Center.
• Experienced with log monitoring tools like Prometheus, Splunk, Nagios, and ELK (Elasticsearch, Logstash, and Kibana) to inspect log information and monitor health and security notifications from nodes.
• Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
• Implemented Kubernetes to deploy, load-balance, scale, and manage Docker containers with multiple namespaced versions.
• Good working experience with DevOps tools such as Vagrant, Virtual Box, Chef, Puppet, Ansible, Maven, ANT, SVN, GIT, Azure, Jenkins, and Docker.
• Expertise in Integration, Tuning, Backup, Disaster Recovery, Upgrades, Patching, Monitoring System Performance, System and Network Security, and troubleshooting of Linux/Unix Servers.
• Hands-on experience implementing, building, and deploying CI/CD pipelines; managed projects that often included tracking multiple deployments across multiple pipeline stages (Dev, Test/QA, Staging, and Production).
• Ability and experience to meld development and operations to ensure quick delivery of an effective and efficient end product.
• Adapt to new, evolving technologies and implement them in current projects. Good interpersonal skills, quick learning, problem resolving, and meeting technical and business needs.
Technical Skills:
Operating Systems
RHEL/CentOS 5.x/6.x/7.x, Ubuntu/Debian, Sun Solaris 7/8/9/10, Windows Server 2003/2008/2012
Cloud Technologies
AWS EC2, IAM, AMI, Elastic Load Balancer (ELB), DynamoDB, S3, SNS, CloudFormation, Route53, VPC, VPN, Security Groups, CloudWatch, EBS, Athena, EMR
Build Tools
ANT, MAVEN
Configuration Management Tools
Puppet, Chef, Ansible
CI Tools
Jenkins/Hudson, Anthill Pro, UDeploy
Scripting
Python, Bash
Application Servers
Apache Tomcat, WebLogic, WebSphere
Languages/Scripts
C, HTML, Shell, Bash, PHP, Python, Ruby, Perl, and PowerShell
Web Servers
Apache, Nginx
SDLC
Agile, Scrum, Waterfall.
RDBMS
Oracle, SQL Server, MS Access, MySQL 5.x, PostgreSQL
Education:
Master's in IT from Wilmington University
Work Experience:
Client: Dell, TX March 2021 – Present
Role: DevOps Engineer
Responsibilities:
• Extensively working with Kubernetes to orchestrate the deployment, scaling, and management of Docker containers.
• Developed and implemented Software Release Management strategies for various applications as per the agile process.
• Setup of Kubernetes services on AWS with Route53 and worked on encrypting TLS certificates.
• Worked on setting up CI/CD pipelines using Jenkins, GitHub, GitOps, Helm, and AWS.
• Performed AWS Cloud deployments for web applications running on AWS Elastic Beanstalk with monitoring using CloudWatch and VPC to manage network configurations.
• Provided Expertise and hands-on experience on custom connectors using the Kafka core concepts and API.
• Created S3 buckets in AWS and stored files; enabled versioning and security for stored files. Managed AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing, and Glacier for our QA and UAT environments as well as infrastructure servers for Git and Chef.
• Managed Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as Puppet and Chef, cloud-hosted solutions, and the AWS product suite.
• Troubleshoot and resolve issues with VMWare systems hosting Windows and Linux Operating systems.
• Worked with a complex environment on Red Hat Linux and Windows Servers while ensuring that these systems adhere to organizational standards and policies.
• Identify, troubleshoot, and resolve issues related to the build and deploy process.
• Owning critical infrastructure components or systems, and continuously working to improve them.
• Diving deep to resolve problems at their root, looking for failure patterns and driving resolution.
• Ensuring stability, reliability, and performance of AWS infrastructure.
• Deployed application which is containerized using Docker onto a Kubernetes cluster which is managed by Amazon Elastic Container Service for Kubernetes (EKS).
• Extensively worked with scheduling, deploying, and managing container replicas onto a node using Kubernetes; created Kubernetes clusters and worked with Helm charts running on the same cluster resources.
• Developed and supported the RHEL-based infrastructure in the cloud environment.
• Used Jenkins pipelines to drive all microservices builds out to the Docker registry and then deployed them to Kubernetes; created and managed Pods using Kubernetes.
• Monitored and managed Kibana logs on environments like Dev, QA, and Prod.
• Provided configuration services on multiple platforms in the test environment running on one or more Platforms like Maven, Client/server, Jenkins, MS Build, Microsoft Windows NT, and UNIX.
• Established a Chef best-practices approach to system deployment with tools such as Vagrant, managing Chef cookbooks as independently version-controlled units of software deployment.
• Worked on creating Puppet manifest files to install Tomcat instances and manage configuration files for multiple applications.
• Designed and developed the tools to allow efficient configuration management, build, and release of software developed in J2EE, XML, and DB2 databases.
• Developed procedures to unify, streamline, and automate application development and deployment with Linux container technology using Docker.
• Deployed the build artifacts into environments like QA, UAT, and production according to the build life cycle
• Coordinated the resources by working closely with Project Managers for the release and carried deployments and builds on various environments using a continuous integration tool
Environment: Jenkins, Terraform, AWS, EC2, Route53, S3, VPC, EBS, Auto scaling, Kubernetes, Helm, Elastic Search, Kibana, FluxCD/ArgoCD, GIT, AWS, Unix/ Linux environment, bash scripting, GitHub Actions.
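As a small illustration of the S3 provisioning and versioning work described above, the sketch below emits a minimal CloudFormation template for a versioned S3 bucket. The bucket name and logical resource ID are hypothetical placeholders, not taken from the engagement:

```python
import json

# Hypothetical bucket name, for illustration only.
BUCKET_NAME = "example-artifacts"

# Minimal CloudFormation template: one S3 bucket with versioning enabled,
# mirroring the "enabled versioning for stored files" step described above.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "ArtifactBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {
                "BucketName": BUCKET_NAME,
                "VersioningConfiguration": {"Status": "Enabled"},
            },
        }
    },
}

# Print the template as JSON, ready to pass to CloudFormation.
print(json.dumps(template, indent=2))
```

In a Terraform-based setup (as listed in the environment), the equivalent would be an aws_s3_bucket resource with a versioning block; the JSON form above is just one compact way to show the shape of the configuration.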
Thanks & Regards
Jyothi
Sales Recruiter
TechSmart Global Inc.
666 Plainsboro Rd, Suite 1116, Plainsboro, NJ 08536.