Transforming Business Through Data Science: Antony Akisetty’s Journey in AI and Analytics

Antony Akisetty is an AI and analytics consultant whose career spans over 13 years across healthcare, manufacturing, and public safety.
Antony Akisetty

Antony Akisetty is a distinguished Business and Healthcare Data Scientist with over 13 years of hands-on experience in AI solutions and analytics implementation. His experience ranges from overhauling healthcare data systems at Amgen to spearheading modern AI implementations in public safety and industrial manufacturing. Throughout his career he has translated big data challenges into business goals and streams of actionable insights, and the efficiency gains unlocked by mature analytics have been a hallmark of his work. Recently, he won the Outstanding Leadership Award at the Marketing 2.0 Conference for providing enterprise solutions to over 500 clients, helping them save over $2 million through process automation.

Q 1: How has your journey from healthcare data analysis at Amgen shaped your focus on data science for the manufacturing industry?

A: Amgen fine-tuned my skills in handling high-volume data streams and real-time data management, thanks to the accuracy-driven work that AI demands. With data pipelines operating on a 99.9% SLA and millions of rows processed per ETL cycle, Amgen proved instrumental in teaching me precision. The results were noticeable: patient readmission rates fell by 20% and medication adherence rose by 15%. When I entered the manufacturing sector, I embedded AI-driven solutions that decreased errors by 25%, increasing overall equipment efficiency by 12%. The same rigorous methodologies proved useful across real-time manufacturing scenarios.

Q 2: Could you elaborate on your experience with conversational AI and natural language processing?

A: During my career, I have spearheaded the creation and implementation of conversational AI agents that transformed contact centres, lowering service requests from 135 to 63, a 65% efficiency gain. This included building advanced NLP-to-SQL pipelines with pandas DataFrame agents and employing several RAG methodologies with question-answering chains. Most recently, at the New York City Fire Department (FDNY), I oversaw the rollout of chatbots that helped paramedics, firefighters, and EMS responders make quicker decisions in emergencies by surfacing useful information. It was not enough to merely deploy the technology; it had to grasp the actual user problems and give sensible, relevant responses.
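The retrieval step behind such a RAG pipeline can be sketched in plain Python. This is a hypothetical, minimal illustration that uses bag-of-words cosine similarity in place of the production embedding models, and the knowledge-base snippets are invented for the example.

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term-frequency vector (a stand-in for real embeddings)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=2):
    """Return the k documents most similar to the query."""
    q = vectorize(query)
    ranked = sorted(documents, key=lambda d: cosine(q, vectorize(d)), reverse=True)
    return ranked[:k]

# Hypothetical knowledge-base snippets an emergency-response chatbot might search.
docs = [
    "Protocol for cardiac arrest response and CPR steps",
    "Fire hydrant locations and water pressure data",
    "Medication dosage guidelines for pediatric patients",
]
top = retrieve("cardiac arrest CPR protocol", docs, k=1)
```

In a production pipeline the retrieved passages would then be fed to a language model as context for the question-answering chain.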

Q 3: How are you able to achieve effective real-time data processing in a manufacturing setting?

A: Combining speed and accuracy is critical to effective data processing. Working with Golang and gRPC, I constructed and deployed real-time monitoring systems that reduced equipment downtime by roughly 30%. At WME Group, I cut data processing time by 60% by building multiple data marts and real-time funnels, using Kafka and PySpark on up to 150 million source records. At Amgen, I also managed real-time data pipelines. In my experience, the chief task is constructing systems that can process substantial data volumes while remaining reliable.
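A tumbling-window aggregation of the kind such Kafka/PySpark jobs perform can be sketched in plain Python. The sensor events below are hypothetical, and a production job would run the same logic continuously over a distributed stream rather than an in-memory list.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, sensor_id) events into fixed windows and count per sensor.
    A plain-Python stand-in for the windowed aggregations a streaming job performs."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, sensor in events:
        bucket = ts - (ts % window_seconds)  # start of the tumbling window
        windows[bucket][sensor] += 1
    return {w: dict(counts) for w, counts in windows.items()}

# Hypothetical machine-sensor readings: (unix_timestamp, sensor_id)
events = [(0, "press_a"), (30, "press_a"), (45, "press_b"), (75, "press_a")]
result = tumbling_window_counts(events, window_seconds=60)
# events at t=0, 30, 45 fall in window 0; the event at t=75 falls in window 60
```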

Q 4: What contributions has machine learning made to your quality control work?

A: Yes, I have been able to understand data far more effectively through several modeling approaches, including Logistic Regression, Random Forest, and Gradient Boosted Trees. At WME Group, I also facilitated multiple Marketing Mix Modeling platforms that gave teams accurate, actionable predictions and expanded marketing ROI by 18 percent. The focus has always been on interpretable features rather than black-box models.
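As a minimal sketch of one of the named techniques, here is a logistic regression classifier trained from scratch with gradient descent on invented defect data; a real quality control model would be fit with a library such as scikit-learn on far richer features.

```python
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Fit logistic regression with gradient descent (toy illustration)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid probability of defect
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Hypothetical features: (vibration level, temperature deviation); label 1 = defect
X = [[0.1, 0.0], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y = [0, 0, 1, 1]
w, b = train_logistic(X, y)
preds = [predict(w, b, xi) for xi in X]
```

Interpretability here comes from the learned weights themselves: each coefficient says how strongly a feature pushes a unit toward the defect class.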

Q 5: In what ways are you able to optimize data pipelines?

A: Throughout my career, data pipeline optimization has been a core focus. While handling complex streams of healthcare data at Amgen, I maintained an average system uptime of 99.9%. Currently, I handle real-time data pipelines with GenAI that pull more than 150 million records from over 30 sources. Infrastructure optimization and cloud migration helped us achieve a 30% cost saving. Most important to me is designing scalable, efficient frameworks, so that as incoming data volumes increase, applications maintain good performance while remaining extremely reliable.

Q 6: Thanks to AI technologies, many organizations can now automate a broad spectrum of business processes. What are your overarching thoughts on MLOps and model deployment?

A: Model lifecycle management needs to be addressed broadly, as it pays off in many ways. To maintain the integrity of the model, the training process, and the data, I have integrated testing activities into CI/CD pipelines ahead of deployment. My last project with the FDNY involved implementing MLOps systems that improved EMS services using predictive models and analytics. As a qualified Scrum Master and Product Owner, I have built environments that support teams in innovating while upholding high quality, raising team productivity by 40% while shortening project delivery times by 20%.
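One simple form such a pre-deployment test in a CI/CD pipeline can take is a metric gate: block the release if a candidate model regresses against agreed baselines. The metric names and thresholds below are hypothetical.

```python
def model_passes_gate(metrics, baselines, tolerance=0.01):
    """Return True only if every baseline metric is met (within tolerance).
    A minimal sketch of a pre-deployment check wired into a CI/CD pipeline;
    a missing metric counts as a failure."""
    return all(metrics.get(name, 0.0) >= target - tolerance
               for name, target in baselines.items())

# Hypothetical evaluation metrics for a candidate model vs. required baselines.
baselines = {"accuracy": 0.90, "recall": 0.85}
candidate = {"accuracy": 0.93, "recall": 0.88}
regressed = {"accuracy": 0.93, "recall": 0.70}
ok = model_passes_gate(candidate, baselines)       # release proceeds
blocked = model_passes_gate(regressed, baselines)  # release is blocked
```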

Q 7: In what ways do you utilize cloud technologies in your data science work?

A: Cloud technologies have been pivotal in my work across organizations. I have used GCP services such as BigQuery, Vertex AI, and Cloud SQL extensively as data storage and processing solutions. At Amgen, I spearheaded the migration of the data warehouse from Oracle to Redshift, which resulted in $678,000 in savings per year. I have also successfully moved workloads to AWS and GCP with considerable infrastructure cost savings, tuning for performance and cost at the same time.

Q 8: How do you approach statistical hypothesis testing and experimentation in your data science work?

A: Statistical rigor is an integral part of any data science work. I have built and supervised several A/B tests, including their sample size and power analyses. At Kayak and the FDNY, I extracted relevant information from years' worth of emergency data to devise useful strategies for resource deployment. The goal is to draw valid, statistically grounded conclusions that relate to business needs and future operations, while checking for possible biases and variance in those conclusions.
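A standard power analysis for a two-proportion A/B test can be done with the normal-approximation formula. The sketch below hard-codes the z-values for a two-sided alpha of 0.05 and 80% power to stay dependency-free, and the conversion rates are purely illustrative.

```python
import math

def sample_size_two_proportions(p1, p2):
    """Per-arm sample size to detect a difference between two proportions,
    at two-sided alpha = 0.05 and power = 0.80 (z-values hard-coded below)."""
    z_alpha = 1.959964  # two-sided 95% critical value
    z_beta = 0.841621   # 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2) * variance / ((p2 - p1) ** 2)
    return math.ceil(n)

# Illustrative example: detecting a lift from a 10% to a 12% conversion rate.
n_per_arm = sample_size_two_proportions(0.10, 0.12)
```

Note how sharply the required sample size grows as the detectable effect shrinks, which is why the power analysis has to happen before the experiment, not after.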

Q 9: Any thoughts about IoT and Edge Computing in manufacturing?

A: The Internet of Things and edge processing pose real challenges in a manufacturing setup. Utilizing Golang and gRPC, I have built distributed backend systems for IoT-based stock management that increased stock accuracy by 95% while decreasing stock-outs by 30%. The main objective is to develop architectures that can execute tasks at the edge without compromising data integrity, while still delivering timely data.
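One common edge-computing pattern behind such architectures is summarizing raw sensor readings on-device and sending upstream only aggregates and out-of-range alerts, cutting bandwidth while preserving the signals that matter. The sketch below is a hypothetical illustration in Python, not the Golang service described above.

```python
def edge_summarize(readings, threshold=0.9):
    """Summarize raw sensor readings on the edge device, forwarding only
    aggregates and out-of-range alerts instead of every raw sample."""
    n = len(readings)
    return {
        "count": n,
        "mean": sum(readings) / n,
        "max": max(readings),
        "alerts": [r for r in readings if r > threshold],  # values needing attention
    }

# Hypothetical normalized vibration readings from one machine cycle.
readings = [0.2, 0.4, 0.95, 0.3]
payload = edge_summarize(readings)  # only this small payload leaves the device
```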

Q 10: How do you think Artificial Intelligence and Data Science will change the landscape of manufacturing?

A: In my view, the future of data science should focus more on integrated AI capabilities such as predictive maintenance and automated quality assurance. I also expect to see deeper integration of GenAI with traditional analytics, even more advanced IoT implementations, and an increased focus on automated decision-making across the manufacturing, healthcare, and public safety sectors. The key challenge will be deploying these technologies without losing reliability and explainability.

Excellence in AI and Analytics – Antony Akisetty

With over 13 years of experience, Antony specializes in applying AI and analytics solutions to the healthcare, manufacturing, and public safety industries. He is a graduate of Quantic School of Business and Technology’s EMBA program and holds an MS in Data Analytics from Southern New Hampshire University, pairing technical depth with business strategy. His work has delivered results such as a 25% reduction in operational errors through AI and a 65% improvement in service operations through conversational AI. The Marketing 2.0 Conference recently recognized him with the Outstanding Leadership Award. Akisetty continues to work across industries, applying the data science tools and techniques best suited to driving tangible business value.
