AI Banking App is an interactive web application designed to simulate a modern banking dashboard. Its primary purpose is to serve as an educational tool, demonstrating how SQL-based databases are leveraged across different types of workloads: OLTP, OLAP, and AI-driven analysis.
Through a hands-on interface, users can see the practical difference between writing a new transaction to the database, running complex analytical queries on historical data, and using natural language to ask an AI to query the database for them.
- The Banking Dashboard: A central hub to view account balance and navigate the application.
- Transactions (OLTP): View a real-time list of all past transactions. This demonstrates a typical high-volume, read-heavy OLTP workload.
- Money Transfer (OLTP): Perform transfers between accounts. This showcases a classic atomic, write-heavy OLTP operation that must be fast and reliable.
- Financial Analytics (OLAP): Explore an analytics dashboard with charts and summaries of spending habits. This represents an OLAP workload, running complex, aggregate queries over a large dataset.
- AI Agent (AI Workload):
- Ask questions about your finances in plain English (e.g., "How much did I spend on groceries last month?"). An AI agent translates your question into a SQL query, executes it, and returns the answer.
- Get customer support answers via retrieval-augmented generation (RAG) over documents
- Open a new account or transfer money in plain English through an action-oriented data agent
- Operational Data Capture: A separate service that captures, in real time, the operational data generated by the agentic solution and stores it in a Fabric SQL database using a prescribed data model.
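The money-transfer feature above is the canonical OLTP pattern: both account updates must commit together or not at all. A minimal sketch of that atomicity, using Python's built-in sqlite3 as a stand-in for Fabric SQL (the `accounts` table and column names here are illustrative, not the app's actual schema):

```python
import sqlite3

def transfer(conn: sqlite3.Connection, src: int, dst: int, amount: float) -> None:
    """Move `amount` from account `src` to `dst` in one atomic transaction."""
    with conn:  # commits on success, rolls back on any exception
        cur = conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE id = ? AND balance >= ?",
            (amount, src, amount),
        )
        if cur.rowcount != 1:
            raise ValueError("insufficient funds or unknown source account")
        conn.execute(
            "UPDATE accounts SET balance = balance + ? WHERE id = ?",
            (amount, dst),
        )

# Demo with an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 50.0)])
transfer(conn, 1, 2, 30.0)
print(list(conn.execute("SELECT id, balance FROM accounts ORDER BY id")))
# [(1, 70.0), (2, 80.0)]
```

The `with conn:` block is what makes the operation atomic: if the debit succeeds but the credit raises, the debit is rolled back too.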
| Layer | Technology |
|---|---|
| Frontend | React, Vite, TypeScript, Tailwind CSS |
| Backend | Python, Flask, LangChain, LangGraph |
| Database | Fabric SQL Database |
| AI | Azure OpenAI API |
- Node.js (v18 or later)
- Python (v3.9 or later)
- Two Fabric SQL databases: one for the customer banking data, another for the agentic operational data
- An Azure OpenAI API Key
- ODBC Driver 18 for SQL Server
- VS Code is recommended; the demo has been tested only in VS Code
- This demo currently runs on Windows only, as it uses ActiveDirectoryInteractive authentication
git clone https://github.com/mehrsa/Banking_App_SQL_with_DataModel
cd Banking_App_SQL_with_DataModel
Before running the application, you need to configure your environment variables. This file stores all the secret keys and connection strings your application needs to connect to Azure and Microsoft Fabric.
Rename the sample file: In the backend directory, find the file named .env.sample and rename it to .env.
Edit the variables: Open the new .env file and fill in the values for the following variables:
You can use the same endpoint to store both your application's logs and its operational data.
FABRIC_SQL_CONNECTION_URL: The connection string for the Fabric SQL database that will store the application's operational data (e.g., chat history). You can find this in your Fabric workspace by navigating to the SQL endpoint, clicking the "..." menu, and selecting "Copy SQL connection string."
FABRIC_SQL_CONNECTION_URL_BANK_DATA: This is the connection string for the database containing the sample customer banking data.
AZURE_ENTRA_ID: Your Azure Entra ID (also known as the Directory ID).
AZURE_TENANT_ID: Your Azure Tenant ID.
You can find both of these values in the Azure Portal by navigating to Microsoft Entra ID and looking at the Overview page.
AZURE_OPENAI_KEY: Your API key for the Azure OpenAI service. You can find this in the Azure Portal by navigating to your Azure OpenAI resource and selecting Keys and Endpoint.
AZURE_OPENAI_ENDPOINT: The endpoint URL for your Azure OpenAI service. This is found on the same Keys and Endpoint page in the Azure Portal.
AZURE_OPENAI_DEPLOYMENT: The name of your chat model deployment (e.g., "gpt-5-mini"). This is the custom name you gave the model when you deployed it in Azure OpenAI Studio.
AZURE_OPENAI_EMBEDDING_DEPLOYMENT: The name of your embedding model deployment (e.g., "text-embedding-ada-002").
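Once the .env file is populated, the backend presumably assembles an ODBC connection string from these variables before connecting. A hedged sketch of what that might look like (the helper name and keyword set are illustrative, matching ODBC Driver 18 for SQL Server; the actual backend code may differ):

```python
import os

def fabric_odbc_connection_string(server: str, database: str) -> str:
    """Build an ODBC connection string for Fabric SQL using interactive
    Microsoft Entra ID authentication (hence the Windows-only note above)."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;"
    )

# Variable name mirrors the .env key described above; the fallback is a placeholder.
server = os.environ.get("FABRIC_SQL_CONNECTION_URL_BANK_DATA", "<your-fabric-sql-endpoint>")
cs = fabric_odbc_connection_string(server, "customer_banking_data")
# import pyodbc; conn = pyodbc.connect(cs)  # opens a browser login prompt
print(cs)
```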
a. Create a database called customer_banking_data. The schema.sql file in the backend directory contains all the necessary T-SQL commands to create the required tables (users, accounts, transactions) and populate them with sample data.
b. Create another SQL database in Fabric called banking_app for storing the agentic operational data. Use agent_data_model.sql to initialize the tables required by the agentic data model.
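If you run these scripts programmatically instead of pasting them into the Fabric query editor, keep in mind that T-SQL scripts are commonly split into batches by GO separators, which an ODBC driver will not accept in a single execute call. A small sketch of the splitting step (the sample script below is illustrative, not the repo's actual schema; the simplification assumes GO appears alone on its own line):

```python
import re

def split_batches(tsql_script: str) -> list[str]:
    """Split a T-SQL script into batches on lines containing only GO."""
    parts = re.split(r"(?im)^\s*GO\s*$", tsql_script)
    return [p.strip() for p in parts if p.strip()]

script = """
CREATE TABLE users (id INT PRIMARY KEY, name NVARCHAR(100));
GO
CREATE TABLE accounts (id INT PRIMARY KEY, user_id INT, balance DECIMAL(18,2));
GO
"""
for batch in split_batches(script):
    # with pyodbc you would run: cursor.execute(batch)
    print(batch.splitlines()[0])
```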
cd backend
python3 -m venv venv
.\venv\Scripts\activate  # on Windows
pip install -r requirements.txt
From the root project directory:
npm install
You need to ingest embeddings from the PDF into the SQL database.
a. Copy the .env file into the Data_Ingest folder.
b. From the Data_Ingest folder, create a new Python virtual environment:
python3 -m venv new_env
source new_env/bin/activate  # or .\new_env\Scripts\activate on Windows
pip install -r requirements.txt
c. Open the Jupyter notebook at Backend/Documentation/ingestion_pdf_Bank_App.ipynb
d. Run all the cells in the notebook.
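At a high level, the notebook presumably (a) extracts text from the PDF, (b) splits it into overlapping chunks, (c) calls the AZURE_OPENAI_EMBEDDING_DEPLOYMENT model on each chunk, and (d) inserts each chunk and its vector into the SQL database. Step (b) can be sketched self-containedly; the chunk sizes are illustrative, and the embedding and insert calls appear only as comments since they require live credentials and are not the notebook's actual code:

```python
def chunk_text(text: str, size: int = 500, overlap: int = 100) -> list[str]:
    """Split text into fixed-size character chunks with overlap, so a sentence
    cut at one chunk boundary still appears whole in the neighbouring chunk."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

# For each chunk, the notebook would then do something like (illustrative only):
#   vector = client.embeddings.create(model=embedding_deployment, input=chunk)
#   cursor.execute("INSERT INTO doc_chunks (content, embedding) VALUES (?, ?)", ...)
sample = "x" * 1200
print([len(c) for c in chunk_text(sample)])  # [500, 500, 400]
```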
First, ensure you have run az login to sign in to your Azure account.
cd backend
python launcher.py
This will launch two services:
- Banking service on: http://127.0.0.1:5001
- Analytics service on: http://127.0.0.1:5001
(You will be prompted for credentials during this step.)
From the project root directory:
npm run dev
The frontend will run on: http://localhost:5173
Contributions are welcome!
If you have suggestions for improvements or find any bugs, feel free to open an issue or submit a pull request.