In our latest discussion with The Industry Times, Dr. Iván Del Valle offers a candid look into the data-driven backbone of a leading $113B medical technology company. As head of data & analytics architecture, Dr. Del Valle discusses his team’s push to integrate multifaceted data sources into a seamless Unified Data Platform (UDP). From tackling the everyday challenges of data accessibility and system integration to using advanced cloud-based solutions for real-time analytics, he shares firsthand accounts of overcoming obstacles and enhancing operational efficiency. His narrative not only outlines the strategic initiatives underway but also reflects a personal commitment to empowering users with robust analytical tools and data insights that drive informed decision-making across the global enterprise.
The Industry Times: Thanks for joining us today, Dr. Iván Del Valle. What is your current role in industry, and what are the main projects your team is working on?
Dr. Del Valle: I lead the data & analytics enterprise architecture for a leading $113B medical technology company, supporting global supply chain, customer experience, capital equipment, field services, the automation COE, manufacturing, quality, R&D, acquisition integration, finance, HR, and legal & compliance. We’re a data product organization focused on integrating data from various sources into our Unified Data Platform (UDP) and enabling analytics for our business stakeholders. Our work primarily involves non-operational reporting and multi-region, multi-source analytics. We have two primary data product teams: the global supply chain data product team and the quality data product team. We also handle critical site analytics for SAP, manufacturing, and supply chain.
The Industry Times: Can you share some insights about the tools and platforms your team uses for data management and analysis?
Dr. Del Valle: Absolutely! Our team uses a mix of on-prem and cloud-based platforms. Many of our users rely on flat files for scheduled or ad-hoc reports. On-prem SQL Server and SAP HANA are also popular. We use a variety of tools and technologies depending on the need and the source. These include AWS EMR, DBT, Glue, DMS, StreamSets, SQL, Snowflake functions and tasks, and some Oracle stored procedures. We also interact with some Azure tools like Power Automate and Power BI, and some on-premises tools like MDS. Additionally, Excel is the go-to analytical tool for almost everyone, but Tableau and Power BI are also heavily used. We’re gradually moving users from Alteryx to AWS native services or Snowflake.
The Industry Times: What kind of challenges do you and your team face when accessing or using data?
Dr. Del Valle: There are quite a few hurdles. Accessing data easily is a big one. Cleaning data is another major task; it often requires a lot of effort to get it into a usable format. Integrating data from different sources can be tricky, and slow system performance doesn’t help either. Some of us also feel that we don’t have the right tools or enough training to use data effectively. Additionally, maintaining a comprehensive data catalog is an ongoing challenge.
The Industry Times: How self-sufficient do you feel in meeting your data needs?
Dr. Del Valle: Many of us feel pretty confident. I’d say the majority are mostly self-sufficient, meaning we can handle our data needs without much outside help. However, there’s still a notable portion of the team that relies on others, especially for more complex tasks.
The Industry Times: Can you describe the different roles people have in your team when it comes to handling data?
Dr. Del Valle: Sure, we have a variety of roles. There are Data Consumers who review data within dashboards and reports. Data Explorers engage in high-level analysis, while Data Navigators are all about creating visualizations and reports. Insights Miners construct advanced analytical models, and Technical Users focus on building and maintaining data pipelines. We also have Project Managers who ensure the team has the necessary data and solve any issues that arise.
The Industry Times: How often do you need to update or access data?
Dr. Del Valle: Data timeliness is key for us. Many of us need data updated multiple times a day. Daily updates are the most common, but for some applications, near real-time updates are crucial. It’s all about having the right data at the right time to make informed decisions.
The Industry Times: What steps are being taken to improve the situation and address these challenges?
Dr. Del Valle: Our data and analytics team is working on several fronts. We’re focusing on making data more accessible, improving system performance, and providing better training and tools. The goal is to create a more efficient and user-friendly data environment.
The Industry Times: How do you decide which tools and platforms to use for different projects?
Dr. Del Valle: The choice of tools and platforms depends on several factors, including the specific requirements of the project, the data sources involved, and the expertise of the team members. For example, if we need to process large datasets quickly, we might use AWS EMR. For data transformation, DBT and Glue are our go-to tools. If the project requires integrating data from multiple sources, StreamSets and DMS are very useful. Ultimately, we select the tools that best fit the project’s needs and ensure efficient and effective data management.
The Industry Times: Can you explain the process of transitioning users from Alteryx to AWS native services or Snowflake?
Dr. Del Valle: Transitioning users from Alteryx to AWS native services or Snowflake involves several steps. First, we assess the existing workflows and identify the Alteryx processes that need to be migrated. Then, we provide training and support to users to familiarize them with the new tools. We also work on recreating the workflows using AWS services like Glue or Snowflake’s built-in functions. This process ensures that users can perform their tasks seamlessly and take advantage of the enhanced capabilities and performance of AWS and Snowflake.
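The migration pattern Dr. Del Valle describes can be sketched in miniature. The snippet below is a hedged, illustrative recreation of a typical Alteryx "join + filter + summarize" workflow in plain Python; the table names, column names, and threshold (`orders`, `regions`, `min_qty`) are hypothetical stand-ins, not the team’s actual schema, and a real migration would land this logic in Snowflake SQL or an AWS Glue job.

```python
# Illustrative sketch: an Alteryx-style join/filter/summarize workflow
# rewritten as plain Python. All names and values are hypothetical.

def migrate_workflow(orders, regions, min_qty=10):
    """Join orders to regions, filter small orders, sum qty per region."""
    region_by_site = {r["site"]: r["region"] for r in regions}
    totals = {}
    for o in orders:
        if o["qty"] < min_qty:                              # Filter tool
            continue
        region = region_by_site.get(o["site"], "UNKNOWN")   # Join tool
        totals[region] = totals.get(region, 0) + o["qty"]   # Summarize tool
    return totals

orders = [
    {"site": "A", "qty": 25},
    {"site": "A", "qty": 5},    # dropped by the filter step
    {"site": "B", "qty": 40},
]
regions = [{"site": "A", "region": "EMEA"}, {"site": "B", "region": "APAC"}]
print(migrate_workflow(orders, regions))  # {'EMEA': 25, 'APAC': 40}
```

Expressing each Alteryx tool as an explicit step makes it straightforward to validate the migrated output against the legacy workflow before cutting users over.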
The Industry Times: What benefits have you observed from using a combination of on-prem and cloud-based platforms?
Dr. Del Valle: Using a combination of on-prem and cloud-based platforms allows us to leverage the strengths of both environments. On-prem solutions provide control and security, which is crucial for certain sensitive data and compliance requirements. Cloud-based platforms, on the other hand, offer scalability, flexibility, and advanced analytics capabilities. This hybrid approach ensures that we can handle a wide range of data management and analytics needs effectively, providing the best of both worlds.
The Industry Times: How do you ensure data security and compliance when using such a diverse range of tools and platforms?
Dr. Del Valle: Data security and compliance are top priorities for us. We implement strict access controls and encryption protocols to protect data both in transit and at rest. Each tool and platform we use is configured to adhere to our security policies and compliance requirements. Regular audits and monitoring help us identify and address any potential vulnerabilities. Additionally, we provide ongoing training to our team members to ensure they are aware of best practices in data security and compliance.
The Industry Times: How do you handle data integration from multiple sources?
Dr. Del Valle: Data integration from multiple sources is a complex task that requires careful planning and execution. We use tools like DMS and StreamSets to facilitate the extraction, transformation, and loading (ETL) process. These tools help us automate data pipelines and ensure data consistency across different sources. We also employ data cataloging and metadata management to keep track of data lineage and ensure that integrated data is accurate and up-to-date.
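As a rough illustration of the extract-transform-load flow described above, the sketch below stitches records from two hypothetical sources into one keyed store. The source names (`erp`, `crm`) and fields are invented for the example; in practice this work is handled by DMS and StreamSets pipelines, not hand-written Python.

```python
# Hedged sketch of a minimal ETL pipeline. Sources and fields are
# illustrative; production flows run on DMS/StreamSets.

def extract(sources):
    """Pull raw rows from each source and tag them with their origin."""
    for name, rows in sources.items():
        for row in rows:
            yield {**row, "source": name}

def transform(rows):
    """Normalize fields and drop rows missing a primary key."""
    for row in rows:
        if not row.get("id"):
            continue                      # basic consistency rule
        yield {"id": row["id"],
               "value": float(row.get("value", 0)),
               "source": row["source"]}

def load(rows, target):
    """Upsert rows into the target keyed by id (last write wins)."""
    for row in rows:
        target[row["id"]] = row
    return target

sources = {
    "erp": [{"id": "1", "value": "10"}, {"id": "", "value": "3"}],
    "crm": [{"id": "2", "value": "7.5"}],
}
warehouse = load(transform(extract(sources)), {})
print(sorted(warehouse))  # ['1', '2']
```

Tagging every row with its source, as done in `extract`, is one simple way to preserve the data lineage that the cataloging and metadata work depends on.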
The Industry Times: What role does automation play in your data management and analytics processes?
Dr. Del Valle: Automation plays a crucial role in our data management and analytics processes. By automating repetitive tasks such as data extraction, transformation, and loading, we can improve efficiency and reduce the risk of human error. Tools like AWS Glue, DBT, and StreamSets allow us to create automated workflows that streamline data processing. Automation also enables us to scale our operations and handle larger volumes of data without compromising on speed or accuracy.
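One small piece of that automation story is resilience to transient failures. The sketch below shows a generic retry wrapper of the kind orchestration tools apply to scheduled jobs; the flaky step here is simulated, whereas a real pipeline would be invoking Glue, DBT, or StreamSets stages.

```python
# Hedged sketch: retrying a flaky pipeline step, the kind of guardrail
# schedulers add to automated ETL jobs. The failure here is simulated.
import time

def run_with_retries(step, max_attempts=3, delay_s=0.0):
    """Run `step`, retrying on failure up to max_attempts times."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except RuntimeError:
            if attempt == max_attempts:
                raise                      # give up after the last attempt
            time.sleep(delay_s)            # back off before retrying

calls = {"n": 0}
def flaky_step():
    """Fails twice with a simulated source outage, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source outage")
    return "loaded 1,024 rows"

print(run_with_retries(flaky_step))  # succeeds on the third attempt
```

Pushing this kind of error handling into the orchestration layer, rather than each individual job, is part of how automation reduces human error.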
The Industry Times: Can you share an example of a successful project where your team used these tools and technologies?
Dr. Del Valle: One successful project involved integrating data from multiple manufacturing sites to provide a unified view of production metrics. We used AWS EMR for large-scale data processing, Glue for ETL tasks, and Snowflake for data storage and analysis. By creating automated data pipelines, we were able to provide near real-time insights to the production managers. This project not only improved operational efficiency but also helped identify bottlenecks and optimize production processes.
The Industry Times: How do you manage the scalability of your data infrastructure as data volumes grow?
Dr. Del Valle: Scalability is a key concern for us, and we manage it by leveraging cloud-based solutions like AWS and Snowflake. These platforms offer elastic scaling, allowing us to increase or decrease resources based on demand. We also design our data architecture to be modular, so we can add new components or expand existing ones without significant disruptions. Automation tools help us scale our ETL processes efficiently, ensuring that we can handle larger data volumes without compromising performance.
The Industry Times: How do you ensure data quality across different sources and platforms?
Dr. Del Valle: Ensuring data quality involves implementing robust validation and cleansing processes at multiple stages. We use data profiling tools to identify and correct anomalies, and we enforce strict data governance policies. Automated data quality checks are integrated into our ETL pipelines, and we regularly audit our data to ensure consistency. Collaboration with data stewards across different departments also helps in maintaining high data quality standards.
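The automated quality checks mentioned above typically boil down to a few recurring rules. This hedged sketch shows three of them, null-key, duplicate-key, and range checks, run against illustrative rows; the field names and threshold are assumptions, not the team’s actual governance rules.

```python
# Hedged sketch of automated data quality checks embedded in an ETL
# pipeline: missing keys, duplicate keys, and out-of-range values.
# Field names and the max_value threshold are illustrative.

def quality_report(rows, key="id", numeric_field="qty", max_value=10_000):
    """Return a list of (row_index, issue) pairs found in the rows."""
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        if row.get(key) in (None, ""):
            issues.append((i, "missing key"))
        elif row[key] in seen:
            issues.append((i, "duplicate key"))
        else:
            seen.add(row[key])
        qty = row.get(numeric_field)
        if qty is None or not (0 <= qty <= max_value):
            issues.append((i, "value out of range"))
    return issues

rows = [
    {"id": "a", "qty": 5},
    {"id": "a", "qty": 7},    # duplicate key
    {"id": "b", "qty": -1},   # negative quantity
]
print(quality_report(rows))  # [(1, 'duplicate key'), (2, 'value out of range')]
```

Emitting an issue list rather than silently dropping rows gives data stewards the audit trail needed for the regular reviews Dr. Del Valle describes.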
The Industry Times: What role does data visualization play in your analytics strategy?
Dr. Del Valle: Data visualization is crucial for making complex data more accessible and understandable. It allows stakeholders to quickly grasp insights and trends without diving into raw data. We use tools like Tableau and Power BI to create interactive dashboards and reports, enabling users to explore data and uncover insights visually. Effective visualization supports better decision-making and enhances communication across the organization.
The Industry Times: Can you talk about any innovative projects your team is working on?
Dr. Del Valle: One of our innovative projects involves implementing machine learning models to predict supply chain disruptions. We’re using AWS SageMaker to build and train these models, which analyze historical data and identify patterns that could indicate potential issues. This project aims to improve our supply chain resilience by enabling proactive measures. It’s an exciting area that combines advanced analytics with practical business applications.
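To make the idea concrete, here is a deliberately simple stand-in for disruption prediction: flagging shipments whose lead time drifts well above a rolling baseline. The production models run in AWS SageMaker on far richer features; the window, threshold, and data below are toy assumptions.

```python
# Toy illustration of the disruption-prediction idea: flag lead times
# that spike above a rolling baseline. Real models run in SageMaker.

def flag_disruptions(lead_times, window=3, threshold=1.5):
    """Return indices where lead time exceeds threshold x rolling mean."""
    flags = []
    for i in range(window, len(lead_times)):
        baseline = sum(lead_times[i - window:i]) / window
        if lead_times[i] > threshold * baseline:
            flags.append(i)
    return flags

# Days from order to delivery; the spike at index 5 suggests a disruption.
lead_times = [4, 5, 4, 5, 4, 12, 5]
print(flag_disruptions(lead_times))  # [5]
```

Even this crude baseline shows the shape of the problem: learn what "normal" looks like from history, then surface deviations early enough to act on them.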
The Industry Times: How do you handle data governance and compliance in a multinational context?
Dr. Del Valle: In a multinational context, data governance and compliance become more complex due to varying regulations. We have a centralized data governance framework that ensures consistency across regions while allowing for local adjustments. Compliance tools and regular audits help us adhere to different legal requirements, such as GDPR in Europe and CCPA in California. Clear policies and training programs ensure that all team members understand and follow these regulations.
The Industry Times: How do you foster a data-driven culture within your organization?
Dr. Del Valle: Fostering a data-driven culture involves encouraging data literacy at all levels. We provide training and resources to help employees understand and use data effectively. Success stories and case studies showcasing the impact of data-driven decisions help in building buy-in. We also promote a mindset of continuous improvement and curiosity, where employees are encouraged to explore data and ask questions. Leadership support is crucial in setting the tone for a data-driven culture.
The Industry Times: What metrics do you use to measure the success of your data initiatives?
Dr. Del Valle: We use a variety of metrics to measure success, including data accuracy, timeliness, and user satisfaction. Key performance indicators (KPIs) such as reduced data processing time, improved decision-making speed, and increased usage of self-service analytics tools are tracked. Business impact metrics, like cost savings and revenue growth attributed to data initiatives, are also important. Regular feedback from users helps us refine and improve our efforts.
The Industry Times: How do you manage the integration of new tools and technologies into your existing data ecosystem?
Dr. Del Valle: Integrating new tools and technologies requires careful planning and execution. We start with a thorough assessment to ensure compatibility and alignment with our strategic goals. Pilot projects help us test the new tools in a controlled environment before full-scale implementation. We provide training and support to ease the transition for users. Continuous monitoring and feedback loops help us address any issues quickly and ensure a smooth integration process.
The Industry Times: What are the key skills you look for when hiring data professionals?
Dr. Del Valle: We look for a combination of technical and soft skills. Technical skills like proficiency in SQL, Python, and data visualization tools are essential. Experience with cloud platforms, ETL processes, and machine learning is also valuable. Soft skills such as problem-solving, critical thinking, and effective communication are crucial for collaborating with different stakeholders. We also value a passion for data and a willingness to continuously learn and adapt to new technologies.
The Industry Times: How do you ensure collaboration between data teams and business stakeholders?
Dr. Del Valle: Ensuring collaboration involves fostering open communication and building strong relationships between data teams and business stakeholders. Regular meetings and workshops help in understanding business needs and aligning data initiatives with strategic goals. We use collaborative tools and platforms to facilitate information sharing and project tracking. Encouraging a culture of mutual respect and understanding ensures that both sides appreciate the value each brings to the table, leading to more effective collaboration.
The Industry Times: How do you stay updated with the latest developments in data management and analytics technologies?
Dr. Del Valle: Staying updated with the latest developments is essential in our field. We regularly participate in industry conferences, webinars, and training sessions. We also encourage our team members to pursue certifications and advanced courses. Additionally, we collaborate with technology vendors and consultants to gain insights into new tools and best practices. Keeping abreast of technological advancements helps us continually improve our data management and analytics capabilities.
The Industry Times: What advice would you give to other organizations looking to enhance their data management and analytics capabilities?
Dr. Del Valle: My advice would be to start by understanding your specific data needs and challenges. Invest in the right tools and technologies that align with your goals. Focus on building a skilled team and provide continuous training to keep them updated with the latest trends. Implement robust data governance and security practices to protect your data assets. Lastly, embrace automation to improve efficiency and scalability. By taking these steps, organizations can significantly enhance their data management and analytics capabilities.
The Industry Times: Any final thoughts on the importance of empowering users with the right tools and data?
Dr. Del Valle: Empowering users is crucial for making informed decisions and driving business success. By addressing the challenges and providing the right support, we can enhance productivity and make data-driven insights more accessible to everyone on the team.
The Industry Times: Anything else that you would like to highlight?
Dr. Del Valle: Yes, my role within academia. As the Americas “Artificial Intelligence & Emerging Technologies Program Director & Head of Apsley Labs” at Apsley Business School London, I lead the worldwide development of cutting-edge applied AI curricula and certifications. At the helm of Apsley Labs, my aim is to shift the AI focus from tools to capabilities, ensuring tangible business value. I invite the audience to learn more at https://apsley.cloud/apsley-main/ .
The Industry Times: Thank you so much for your time today. Where can people learn more about what you do?
Dr. Del Valle: Thank you so much for the opportunity. I can be reached via my LinkedIn profile at https://www.linkedin.com/in/enterprise-solutions/ . My published works, spanning IT law and the applications of artificial intelligence in business, can be found on my Amazon author page at https://www.amazon.com/author/ivandelvalle .