
Wednesday, January 29, 2025

Looking into Azure AI Foundry: Building a Smart Response System

Over the past few weeks, I had the opportunity to dive deep into Azure AI Foundry and build a proof of concept (PoC) that combined prompt flow, large language models (LLMs), and database indexing. The idea was to build a system that evaluates different LLMs, retrieves relevant information from a database, selects the right email template, and generates a personalized email response.

Azure AI Foundry

Azure AI Foundry is a powerful platform designed to simplify the development of AI applications by orchestrating flows involving LLMs, prompts, and Python tools. The best things about AI Foundry are its visual graph-based interface for creating flows and its ability to test and debug seamlessly. After setting up my project in the Azure AI Foundry portal, the bulk of the work was setting up the prompt flows.

Designing the Prompt Flow

The first step was to create a Prompt Flow that could evaluate multiple LLMs based on specific input parameters. Here’s how I structured it:

Input Parameters: The flow began by taking user inputs such as query type, historical data context, and additional metadata.

LLM Evaluation: Using Azure OpenAI GPT models, I evaluated several LLMs for their performance on these inputs. This step involved crafting multiple prompt variants using Jinja templating and comparing their outputs.

Index Lookup: Once the best-performing model was selected, I integrated an Index Lookup tool to query a vector database for relevant historical data.

Template Selection: Based on the retrieved data, the system dynamically chose one of several pre-uploaded email templates.
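The evaluation step above can be sketched in plain Python. This is only an illustration of the idea, not the actual Foundry flow (which is authored in the visual graph editor); the variant names, `score_output` heuristic, and `call_llm` callable are all hypothetical, and plain `str.format` stands in for Jinja templating to keep the sketch dependency-free.

```python
# Hypothetical sketch of the prompt-variant evaluation step.
# PROMPT_VARIANTS, score_output, and call_llm are illustrative stand-ins.

PROMPT_VARIANTS = {
    "concise": "Answer briefly: {query}",
    "contextual": "Context: {context}\nAnswer the question: {query}",
}

def score_output(output: str) -> float:
    # Toy heuristic standing in for a real evaluation metric.
    return min(len(output) / 100.0, 1.0)

def pick_best_variant(query: str, context: str, call_llm) -> str:
    """Render each prompt variant, call the model, and return the best-scoring one."""
    scores = {}
    for name, template in PROMPT_VARIANTS.items():
        prompt = template.format(query=query, context=context)
        scores[name] = score_output(call_llm(prompt))
    return max(scores, key=scores.get)
```

In the real flow, each variant is a separate Jinja node and the scoring is done by comparing model outputs, but the shape of the loop is the same.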

Database Indexing & Retrieval

The Index Lookup tool in Azure AI Foundry made it easy to search through my vector database for relevant results. This tool uses embeddings generated by LLMs to find the most contextually appropriate matches for a given query.

For example:

If the input query was related to customer feedback, the system would retrieve historical feedback records.

For support-related queries, it would fetch relevant support ticket summaries.

This indexing mechanism ensured that every email response was grounded in accurate and relevant data.
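Conceptually, the lookup ranks stored records by embedding similarity to the query. A minimal sketch using cosine similarity, assuming toy two-dimensional embeddings and an in-memory list standing in for the real vector index:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def index_lookup(query_embedding, index, top_k=2):
    """Return the top_k records whose embeddings are closest to the query."""
    ranked = sorted(
        index,
        key=lambda record: cosine_similarity(query_embedding, record["embedding"]),
        reverse=True,
    )
    return ranked[:top_k]
```

The Index Lookup tool does this at scale against a managed vector index, with embeddings of much higher dimension, but the ranking principle is the same.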

Generating the Response

Once the right template was selected, I used the chosen LLM to fill in placeholders in the template with dynamic content. The final email response was not only accurate but also personalized based on historical interactions.

For instance:

A customer asking about delayed shipping would receive an email referencing their previous order details.

A user requesting technical support would get an email tailored to their issue history.

This seamless integration of templates with real-time data retrieval made the system highly effective.
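The fill-in step amounts to substituting retrieved and model-generated values into the chosen template. A sketch using the standard library's `string.Template`; the template text and field names here are made up for illustration:

```python
from string import Template

# Hypothetical pre-uploaded templates keyed by scenario.
EMAIL_TEMPLATES = {
    "shipping_delay": Template(
        "Hi $name,\n\n"
        "We're sorry that order $order_id is running late. "
        "$llm_update\n\nBest regards,\nSupport Team"
    ),
}

def render_email(template_key: str, **fields) -> str:
    """Fill the chosen template; substitute() raises KeyError if a field is missing."""
    return EMAIL_TEMPLATES[template_key].substitute(**fields)
```

In the PoC, the `$llm_update` slot is where the selected LLM's generated content goes, while fields like the customer name and order ID come from the index lookup.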

Three main areas need attention when turning the PoC into a production-ready use case:

1. LLM Evaluation: Comparing multiple LLMs required careful prompt tuning and iterative testing. What was frustrating at times was that different LLMs gave different results for the same query.

2. Data Integration: Ensuring that database indexing worked smoothly with diverse datasets took some effort.

3. Template Management: Designing flexible templates that could adapt to various contexts required creativity. Since I had limited data, this constrained the quality of the output.

Final Thoughts

I think Azure AI Foundry can transform workflows by combining retrieval-augmented generation (RAG) techniques with powerful LLMs. The ability to evaluate models dynamically, retrieve relevant data efficiently, and generate personalized outputs has immense potential across industries, from customer support to marketing automation.


Wednesday, March 15, 2023

Choosing between Azure Function App and Azure App Service for your application

This week, to build a client-facing mobile application, I had to choose between a PaaS and a serverless implementation on Microsoft Azure Cloud.


One of the development teams wanted to use an Azure Function, and the other wanted to go with a simple Azure App Service plan. It was not the first use case in which I had to choose between the two services. Serverless technology has evolved and is no longer used just for independent worker-based functionality. I have used it for several different use cases, including building part of an e-commerce site running purely on serverless technology.


I had to weigh all the pros and cons and take a logical view to choose between the two.


Step 1


I started by assessing the capabilities needed, based on the granularity of the business requirements.


What are the requirements?

a. Business logic - Does the application require full-fledged middleware with data and business logic?

b. Lightweight system - Is the application more of a lightweight web API system?

c. Expected traffic - What will be the dedicated traffic towards the application?

d. Code complexity - Does the application require a small piece of code or function for every request?

e. Performance - Does the application require quick responses?

f. Scalability - What are the scalability requirements?

g. Ease of development/deployment - How easy must development and deployment be?

h. Security - What are the security requirements?

i. Cost - What are the budget or cost requirements?

j. Application design - How complex is the application design?

k. Governance - Is governance an issue? Are there multiple teams owning and deploying modules to production regularly?

l. Maintenance - Who will maintain the application, and what are the SLAs?

m. Redundancy - What are the application redundancy requirements?

n. Technology - Is there a technology restriction?

o. Learning curve - What are the technology adoption and learning-curve requirements?



Step 2: 


Once the requirements are understood, the next step is to map the capabilities and challenges and compare the two services in the context of the development teams.


Azure Functions offers three hosting plans: 1. Consumption, 2. Premium, and 3. App Service plan.


a. Plan fitment: The Consumption plan is the cheapest but has cold-start issues and weaker network security, so consider this before choosing it for a production-ready product. Neither cold starts nor security is an issue when building applications with App Service.

b. Cost: The Premium and App Service plans avoid cold starts and provide better security (with VNet integration), but at a higher cost; the Premium plan costs roughly the same as an App Service plan.

c. Complexity: With Azure Functions, if several moving parts are created and only a few people maintain them, the system can get complex.

d. Governance: If we build independent, decentralized functions, standardization can become an issue in the future.

e. Latency: If a function later needs to wait for another function to execute, both latency and complexity become issues.

f. Future business logic: If writing the business logic requires building several functions or classes, an App Service integration will be the way to go.

g. Learning curve and support: Maintaining and supporting an application with multiple functions can be a learning-curve issue for someone new.


In conclusion, both services offer many similar features. In our case, the application was simple and had some future business requirements involving background processing with no end-user impact, so App Service seemed the apt fit. However, the choice could be entirely different for another set of requirements.



Thursday, December 8, 2022

Demystifying the hidden costs after moving to the Cloud

The web application at a client was hosted using a combination of Azure services. The architecture was quite simple: Front Door, API Management, App Service, SQL Database, Service Bus, Redis Cache, and Azure Functions. As the application matured, we realized how little thought we had given to the hidden costs of the cloud at the start of the project.

Azure Front Door was used for efficient load balancing, WAF, Content Delivery Network, and DNS. However, routing requests globally through Microsoft's network incurred data transfer and routing costs. What started as a seamless solution for an enhanced user experience turned into a realization that global accessibility comes at a price. Also, the complexity of configuring backend pools, health probes, and routing rules can lead to unintended expenses if not optimized.

App Service had a modest cost to begin with on low-scale Premium servers. But as the application garnered more hits, the number of users grew, and so did the resources consumed. The auto-scaling needed to handle increased traffic, plus custom domains, brought unforeseen expenses, turning the initially reasonable hosting costs into a growing concern. So keep an eye on the server configuration and the frequency of scaling events.

Azure SQL Database brought both power and complexity. Scaling to meet performance demands led to increased DTU consumption and storage requirements. The once manageable monthly expenses now reflected the intricate dance between database size, transaction units, and backup storage. Not scaling down the backups also incurred costs, especially for databases with high transaction rates. Inefficient queries and suboptimal indexing can increase resource consumption, impacting DTU usage and costs.

Azure Service Bus, the messenger between the application's distributed components, began with reasonable costs for message ingress and egress. Yet, as the communication patterns grew, the charges for additional features like transactions and dead-lettering added expenses to the budget. Also, long message TTLs can lead to increased storage costs. 

Azure Cache for Redis, used for in-memory data storage, initially provided high-performance benefits. However, as the application scaled to accommodate larger datasets, the costs of caching capacity and data transfer began to rise, challenging the notion that performance comes without a price. Evicting data from the cache may increase data transfer costs, especially if the cache is frequently repopulated from the data source. Also, fine-tuning cache expiration policies is crucial to avoid paying to store stale or rarely accessed data.

Lastly, Azure Functions, with its pay-as-you-go model, was expected to be the cheapest of all the services, since it invokes functions only as needed. But the cumulative charges for executions, execution time, and additional resources reminded me that serverless, too, has hidden costs. Including unnecessary dependencies in a function can inflate execution times and costs.

Demystifying the expenses after moving to Azure required a keen understanding of its pricing models and a strategic approach to balancing innovation with fiscal responsibility.
