
In the competitive world of modern finance, staying ahead means embracing innovation. For DenizBank, one of Türkiye's leading private banks, this means a strong commitment to integrating artificial intelligence (AI) and machine learning (ML) into the core of its operations. Driving this technological evolution within DenizBank is Intertech, DenizBank's IT subsidiary, which has played a key role in redefining what's possible in banking.

The challenge: Overcoming barriers to productivity and innovation

Before this modernization, DenizBank's data science teams faced significant hurdles that hindered their ability to innovate quickly. Their workflows were characterized by:

  • Manual and inflexible environments: Over 120 data scientists were using a Virtual Desktop Infrastructure (VDI) for model development. This VDI-based environment was slow and resource-intensive, and it made lifecycle management of continuously changing Python libraries incredibly difficult. Each new model required a complex, manual setup on these cumbersome virtual workstations.
  • A lack of standardization: Without a unified platform, consistency was a major issue. Data access methods, model environments, database integrations, and code repositories varied from one data scientist to another, making collaboration and management difficult.
  • Slow time-to-market and deployment: The manual processes created a development bottleneck. Deciding which model to deploy also took significant time, because business teams struggled to compare how candidate models performed. This indecision meant that critical business opportunities were being missed.
  • Inefficient resource sharing: The IT infrastructure struggled with efficient resource allocation, especially when it came to serving large, complex models.
  • Security and compliance risks: With code stored on local machines within the VDI environment, it was difficult to enforce version control, conduct proper code reviews, or verify that a full validation process had been followed, posing significant security and compliance risks.

The approach: Building a foundation for AI at scale

Intertech recognized that a piecemeal approach wouldn't work. They needed a comprehensive and standardized solution. Their goals were clear: improve time-to-market, reduce AI costs, automate the entire data science pipeline, and empower their teams with self-service capabilities.

The foundation was already in place. With over three years of experience running an enterprise application platform—Red Hat OpenShift—Intertech had deep expertise in robust, container-based architectures. The natural next step was to adopt Red Hat OpenShift AI, chosen for its Kubernetes-based scalability and its GitOps-friendly, code-based approach to infrastructure. Because OpenShift AI is fully integrated with OpenShift, the MLOps team adopted GitOps practices to manage the entire AI lifecycle, from experiments to production models, with full trackability.
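As a rough sketch of what this code-based, GitOps-managed approach can look like (all names and the repository URL below are hypothetical, not DenizBank's actual configuration), an OpenShift GitOps (ArgoCD) Application can point at a Git repository that declares the data science environments, so every change flows through version control before it reaches the cluster:

```yaml
# Hypothetical ArgoCD Application: keeps data science environment
# definitions in Git in sync with an OpenShift AI cluster.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: ds-environments            # illustrative name
  namespace: openshift-gitops      # default OpenShift GitOps namespace
spec:
  project: default
  source:
    repoURL: https://git.example.com/mlops/ds-environments.git  # placeholder repository
    targetRevision: main
    path: environments
  destination:
    server: https://kubernetes.default.svc
    namespace: data-science-projects
  syncPolicy:
    automated:
      prune: true      # remove resources that were deleted from Git
      selfHeal: true   # revert manual drift back to the state in Git
```

With this pattern, the Git history itself becomes the audit trail for every environment and model change.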

To ensure success, Intertech collaborated with Red Hat Consulting to design and architect the optimal solution, carefully adapting it to align with their existing DevOps and GitOps best practices.

OpenShift AI in action: What was accomplished

The implementation of OpenShift AI has changed how data scientists at DenizBank work:

  • GitOps-powered automation and governance: The team now has full visibility and control over the entire process. To request a new model development environment, a data scientist simply creates a new Git branch, configures their requirements (project type such as ML or GenAI, resources, access rights) in a JSON file, and commits. A pull request is then approved by a platform admin, which automatically triggers OpenShift GitOps (ArgoCD) to provision all the necessary components and configurations. This same auditable, automated process is used for model deployment. A sketch of what such a configuration file might look like follows this list.
  • Self-service flexibility: Data scientists can now spin up their own tailored AI model development environments in minutes using OpenShift AI workbenches. In this new container-based platform, they have the flexibility and speed to run different images with different sets of libraries for different use cases. They can choose from pre-built, standardized images or create their own, so every project starts from a consistent baseline.
  • Access to leading open source technologies: The platform provides seamless access to a wide array of open source AI technologies, including Kubeflow for notebook management and model serving, and vLLM for model inference.
  • Enhanced security and reliability: With code now housed in centralized repositories, every model undergoes rigorous review and validation, leading to more robust and secure deployments.
  • Optimized GPU usage: Through integration with NVIDIA dashboards and Multi-Instance GPU (MIG) technology, DenizBank can now slice and allocate GPU resources with incredible precision, optimizing usage for both training and serving, and scaling resources automatically as needed.
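As a minimal sketch of the kind of requirements file described in the first bullet above (the schema and field names are purely illustrative; the article does not publish DenizBank's actual format), a data scientist's request might capture the project type, compute resources, and access rights in a few lines of JSON. The GPU entry also hints at how a MIG slice, rather than a whole GPU, could be requested; actual MIG profile names depend on the GPU model and the cluster's MIG strategy:

```json
{
  "projectName": "fraud-detection-scoring",
  "projectType": "ml",
  "workbenchImage": "standard-datascience-py311",
  "resources": {
    "cpu": "4",
    "memory": "16Gi",
    "gpu": { "type": "nvidia.com/mig-1g.5gb", "count": 1 }
  },
  "accessRights": ["feature-store-read", "model-registry-write"]
}
```

Once the pull request containing a file like this is approved, ArgoCD reconciles the change and provisions the corresponding project, workbench, and permissions automatically.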

Measurable success: The compelling results

The impact of this transformation has been tremendous. DenizBank reports:

  • Drastic acceleration of the preparation phase: The time needed to prepare a model development environment has been slashed from one week to just 10 minutes. The time required to deploy new models as microservices has seen a similar reduction, from several days to minutes.
  • Empowered and productive teams: Over 120 DenizBank data scientists now have the autonomy and standardized tools they need, boosting productivity and allowing them to focus on high-value model creation.
  • Improved risk management: The organization’s enhanced models and quicker deployment cycles have had a direct impact on the bank's bottom line: delinquent loan risk dropped significantly, from 7.51% to 5.79%.
  • Streamlined operations: Automation and a GitOps approach have significantly reduced manual operations and overhead.
  • Efficient resource usage: GPU slicing and distributed workloads have led to optimized resource utilization and cost savings.
  • Better, safer models: DenizBank’s new process has produced more robust, more secure models, directly leading to concrete business outcomes such as more accurate customer credit assessments and faster identification of fraudulent financial activity.

Pioneering the future of banking

DenizBank's journey is far from over. The focus now is on migrating all remaining data scientists and models to the OpenShift AI platform. Looking ahead, the bank is prioritizing even more efficient compute resource utilization and enhanced model inferencing capabilities.

By strategically investing in a powerful and flexible AI platform, DenizBank and Intertech have not only solved their immediate challenges but positioned themselves for future AI innovation.

Ready to start your own transformation?

Resource

Getting started with AI for the enterprise: A beginner's guide

In this beginner's guide, learn how Red Hat OpenShift AI and Red Hat Enterprise Linux AI can help you accelerate your AI adoption journey.

About the authors

Erkan Ercan is a seasoned IT professional with over 20 years of experience spanning telecommunications, financial services, and cloud computing. His expertise centers on open-source technologies, OpenShift, MLOps, and Generative AI. In recent years, he has led multiple AI/ML initiatives, many of which have successfully transitioned into real-world OpenShift AI deployments across diverse industries.

Beginning his career as a Software Engineer, Erkan has played a key role in mission-critical projects and digital transformation efforts. Since joining Red Hat Turkey as a Solution Architect in 2019, he has helped organizations across sectors—including telecom, banking, and e-commerce—adopt cloud-native, scalable solutions tailored to their needs.

Erkan is deeply passionate about the intersection of AI, cloud, and open-source innovation. He actively collaborates with development teams, delivers hands-on workshops, and shares his insights at conferences and industry events. His mission is to make cutting-edge technologies not only accessible but also practical and impactful for the organizations he supports.


Will McGrath is a Senior Principal Product Marketing Manager at Red Hat. He is responsible for marketing strategy, developing content, and driving marketing initiatives for Red Hat OpenShift AI. He has more than 30 years of experience in the IT industry. Before Red Hat, Will worked for 12 years as strategic alliances manager for media and entertainment technology partners.

