Implementation of Generative AI for the Solvedio Skill Management Platform

The integration of a Generative AI solution into the Solvedio platform has transformed traditional skills management into an intelligent, automated, and context-aware system. The solution, built on Amazon Bedrock, has streamlined document processing, skill matrix creation, and team assembly, delivering measurable business benefits for HR and project management.

03 April 2025 ┃ 6 min read


Client

Solvedio is a Digitalization as a Service (DaaS) platform focused on comprehensive business digitalization through a human-centered, no-code approach. Solvedio's solutions are tailored for the manufacturing, HR, and public sectors, aiming to enable fast and accessible digital transformation. Thanks to its speed, flexibility, and low cost, Solvedio has become a key partner for companies looking to modernize their processes and boost productivity.


Context and Challenges

In recent years, the market for digitalization and HR technologies has undergone a significant transformation. Organizations are increasingly shifting their focus toward intelligent systems that can automatically assess capabilities, optimize work teams, and predict training needs. This trend is driving the massive adoption of cloud and AI solutions.

The main business challenge for the Solvedio platform was its limited ability to automatically process and understand unstructured data when populating and maintaining its skill matrices. Customers needed a solution that could go beyond manual data entry and basic skill tracking. The platform required a significant technological upgrade in order to:

  1. Automatically generate a comprehensive skill matrix directly from various employee documents such as resumes and certificates. The system had to process multiple formats, including PDF, DOCX, TXT, and various types of images.
  2. Intelligently provide recommendations for new skills based on the user's existing profile and documents.
  3. Automate the suggestion of qualified users for specific skill groups or projects, streamlining team formation.
  4. Enable conversational interaction with the system for searching skills and knowledge using an AI-powered chat model.

 

Without enhancing the platform with generative artificial intelligence, Solvedio faced several long-term business risks:

  1. Stagnating product offering: The platform would lack the competitive differentiation provided by AI-powered features, risking the loss of market share to more innovative solutions.
  2. Limited customer value: The burden of manual data entry and analysis would remain on the customer, reducing the platform's value proposition and potentially leading to customer churn.
  3. Scalability limitations: The inability to automate skill extraction would prevent customers from efficiently managing the skills of a large and dynamic workforce, limiting Solvedio’s appeal at the enterprise level.
  4. Operational inefficiency for end users: Customers would continue to struggle with inefficient project staffing, slow identification of expertise, and a lack of strategic insight into their collective skill base.

Objective

Business Objectives:

  • Automate the generation and updating of skill matrices directly from employee documents.

  • Optimize team assembly using AI recommendations based on skills and availability.

Technical Objectives:

  • Build an intelligent, API-driven platform with generative AI for processing various document formats.

  • Implement a set of secure, asynchronous APIs for long-running task processing.
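As an illustration of the asynchronous API objective, the sketch below shows one possible shape of a submit endpoint: an API-facing Lambda registers a job and immediately returns 202 Accepted, so long-running AI work never blocks the HTTP request. The table name, attribute names, and handler signature are assumptions for illustration, not the platform's actual implementation.

```python
import json
import os
import time
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
jobs_table = dynamodb.Table(os.environ.get("JOBS_TABLE", "ai-jobs"))  # hypothetical table name


def submit_handler(event, context):
    """API Gateway (Lambda proxy) handler: register the job, return 202 immediately."""
    job_id = str(uuid.uuid4())

    # Store the job with an initial "pending" status; a downstream worker
    # performs the actual long-running AI processing.
    jobs_table.put_item(
        Item={
            "jobId": job_id,
            "status": "pending",
            "request": event.get("body") or "{}",
            "createdAt": int(time.time()),
        }
    )

    return {
        "statusCode": 202,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"jobId": job_id, "status": "pending"}),
    }
```

The caller receives a job ID it can use to poll for the result, which is the asynchronous contract described throughout the solution section.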


Solution

The solution is built on a serverless architecture, allowing AWS to automatically manage scaling and availability. The architecture is designed to be resilient and efficient in handling long-running tasks that are decoupled from the user's initial request. This approach ensures that the failure of a single process does not result in the failure of the entire API request.

Architectural Overview

  • Amazon SQS decouples data ingestion from processing, allowing the system to handle spikes in incoming requests (see the sketch after this list).

  • Aurora Serverless (PostgreSQL) and S3 serve as the data storage layer, with Aurora functioning as a vector database managed through the Bedrock Knowledge Base, supporting auto-scaling and multi-AZ deployment.

  • Thanks to the fully serverless approach, single points of failure have been eliminated — components such as API Gateway, Lambda, and DynamoDB operate and scale independently.

  • The asynchronous job system ensures that even if a partial task fails, the API remains available and stable.
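To make the SQS-based decoupling above concrete, here is a minimal sketch; the queue URL, message shape, and function names are illustrative assumptions. The API-facing code only enqueues a message, and an SQS-triggered worker Lambda processes documents in batches.

```python
import json
import os

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = os.environ.get("INGESTION_QUEUE_URL", "")  # hypothetical queue URL


def enqueue_document(bucket: str, key: str, user_id: str) -> str:
    """Publish a document-ingestion task instead of processing it inline."""
    response = sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({"bucket": bucket, "key": key, "userId": user_id}),
    )
    return response["MessageId"]


def ingestion_worker(event, context):
    """SQS-triggered Lambda: each invocation receives a batch of messages."""
    for record in event["Records"]:
        task = json.loads(record["body"])
        # Here the real worker would fetch the document from S3, extract text,
        # and hand it to the skill-extraction and vectorization pipeline.
        print(f"Processing s3://{task['bucket']}/{task['key']} for user {task['userId']}")
```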

Task Processing and Management

A separate job is created for each request and recorded in DynamoDB. The table stores processing statuses (pending, streaming, failed, completed). Upon completion, the result is saved back to the record. This approach enables efficient handling of long-running AI requests without overloading the system.
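The worker side of this job table can be sketched as follows. This is an illustrative outline only: it assumes a DynamoDB stream trigger, a jobId partition key, and a placeholder run_ai_request function standing in for the actual Bedrock call.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
jobs_table = dynamodb.Table("ai-jobs")  # hypothetical table name


def run_ai_request(job_id: str) -> str:
    # Placeholder: the real system would call Amazon Bedrock here
    # (e.g. a knowledge base query) and return the generated output.
    return f"result for {job_id}"


def _set_status(job_id: str, status: str) -> None:
    jobs_table.update_item(
        Key={"jobId": job_id},
        UpdateExpression="SET #s = :s",
        ExpressionAttributeNames={"#s": "status"},  # "status" is a reserved word
        ExpressionAttributeValues={":s": status},
    )


def stream_handler(event, context):
    """DynamoDB stream trigger: process newly inserted jobs and record the outcome."""
    for record in event["Records"]:
        if record["eventName"] != "INSERT":
            continue  # only react to newly created jobs
        job_id = record["dynamodb"]["Keys"]["jobId"]["S"]

        _set_status(job_id, "streaming")
        try:
            result = run_ai_request(job_id)
        except Exception:
            _set_status(job_id, "failed")
            raise
        # Save the result back to the record and mark the job completed.
        jobs_table.update_item(
            Key={"jobId": job_id},
            UpdateExpression="SET #s = :s, #r = :r",
            ExpressionAttributeNames={"#s": "status", "#r": "result"},
            ExpressionAttributeValues={":s": "completed", ":r": result},
        )
```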

Components and Their Roles

  • Amazon Bedrock Knowledge Base provides a fully managed Retrieval-Augmented Generation (RAG) pipeline, which includes vectorizing source documents using Titan Text Embeddings v2, storing them in Aurora Serverless, and managing queries. A custom RAG implementation would be significantly more time- and resource-intensive (a query sketch follows this list).

  • Aurora Serverless (PostgreSQL) was chosen for its automatic scaling and cost optimization compared to OpenSearch Serverless. It is also used for a broader set of functionalities planned for the future.

  • DynamoDB ensures asynchronous processing of AI requests, as a relational database would introduce unnecessary overhead.

  • SQS supports batch processing of new records that need to be ingested, pre-processed, and vectorized — enhancing the system’s resilience. Alternatives like RabbitMQ on EC2 were rejected due to higher operational overhead.

  • AWS Lambda serves as the compute layer for all business logic — its event-driven nature enables it to respond to both API calls and queue messages.

  • EventBridge ensures periodic triggering of Lambda functions for knowledge base synchronization.

  • CloudWatch monitors system performance, error rates, and latency for optimization purposes.
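As a rough illustration of how the managed RAG pipeline is consumed, a single retrieve_and_generate call retrieves the relevant document chunks from the vector store and returns a grounded answer. The knowledge base ID is a placeholder and the model ARN is an example, not necessarily the model the platform uses.

```python
import boto3

bedrock_agent_runtime = boto3.client("bedrock-agent-runtime")

KNOWLEDGE_BASE_ID = "XXXXXXXXXX"  # placeholder knowledge base id
MODEL_ARN = (  # example foundation model ARN, shown only for illustration
    "arn:aws:bedrock:eu-central-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0"
)


def ask_knowledge_base(question: str) -> str:
    """Retrieve relevant document chunks and generate a grounded answer in one call."""
    response = bedrock_agent_runtime.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": KNOWLEDGE_BASE_ID,
                "modelArn": MODEL_ARN,
            },
        },
    )
    return response["output"]["text"]


if __name__ == "__main__":
    print(ask_knowledge_base("Which employees hold a certified welding qualification?"))
```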

Integrations and Data Flows

  1. SQS → Lambda triggers – for asynchronous batch processing of records.

  2. DynamoDB Streams (INSERT events) → Lambda triggers – for processing newly created AI requests.

  3. Bedrock Knowledge Base ↔ Aurora Serverless – storing vectors generated using Titan Text Embeddings v2.

  4. EventBridge → Lambda – periodic triggering of knowledge base synchronization.
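For the scheduled synchronization in step 4, a minimal sketch could look like the following, with the knowledge base and data source IDs left as placeholder environment variables: an EventBridge rule invokes the Lambda on a schedule, and the Lambda starts a Bedrock Knowledge Base ingestion job so newly added documents get vectorized.

```python
import os

import boto3

bedrock_agent = boto3.client("bedrock-agent")


def sync_handler(event, context):
    """Invoked periodically by an EventBridge rule to re-sync the knowledge base."""
    job = bedrock_agent.start_ingestion_job(
        knowledgeBaseId=os.environ["KNOWLEDGE_BASE_ID"],  # placeholder env vars
        dataSourceId=os.environ["DATA_SOURCE_ID"],
    )
    return {"ingestionJobId": job["ingestionJob"]["ingestionJobId"]}
```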

This design ensures robust, scalable, and cost-effective operation without the need for server management, with all components being decoupled and independent.

Each AWS service was selected based on an analysis of alternatives:

  • Bedrock Knowledge Base – provides a fully managed RAG process; a custom implementation would increase complexity and delivery time.

  • Aurora Serverless (PostgreSQL) – chosen for automatic scaling and lower costs compared to OpenSearch Serverless, leveraging the team's existing expertise.

  • DynamoDB – used to manage long-running AI tasks and track their statuses instead of a relational database, due to its performance and simplicity.

  • SQS – for decoupling batch processing; alternatives like RabbitMQ were rejected due to higher operational overhead.

  • Lambda – event-driven compute layer without the need for server management.

  • EventBridge – schedules periodic tasks for knowledge base synchronization.

Generating skill matrix recommendations in Solvedio

Results and Benefits

The solution delivered measurable business and technological benefits while balancing value, cost, and complexity. It stands out in three main areas:

  • Maximized Business Value: By leveraging managed services such as Amazon Bedrock Knowledge Base, the need to build a custom RAG pipeline from scratch was eliminated, significantly accelerating the development and deployment of new AI features for customers.

  • Minimized Costs: A serverless architecture built on AWS Lambda, Amazon SQS, and Amazon DynamoDB enables payment only for actual resource usage, with no costs for idle servers.

  • Reduced complexity: The event-driven architecture simplifies the entire system, making development, debugging, and maintenance easier, while the Bedrock Knowledge Base minimizes the complexity of implementing, debugging, and managing a secure and efficient RAG solution.

The solution delivered measurable value in two key indicators that directly impact operational stability and team effectiveness:

  1. Improved project execution through skill-based assignments: Before implementation, approximately 23% of projects were delayed due to misaligned staffing. After deploying the Solvedio platform with AI-based recommendations, the delay rate dropped to 10% within 6 months, representing a more than 50% improvement. The measurement was based on project completion logs and tracking the share of on-time project deliveries before and after solution deployment.

  2. Balanced workload and increased employee satisfaction: Prior to implementation, workload satisfaction was rated at 60 out of 100. After six months of using tools for capacity and skill analysis, it rose to 80 points, exceeding the target of 75. Data was gathered through repeated employee surveys and workload analytics (e.g., percentage of employees at optimal utilization). This led to better task distribution, higher team morale, and a positive impact on retention.


Conclusion

This case study describes how Generative AI and cloud technologies can fundamentally change the way organizations approach digitalization, automation, and data utilization. The integration of advanced AI services into existing processes has created a solution that not only streamlines data processing but also enables faster decision-making and strategic resource management.

The value of the platform extends beyond HR. It demonstrates how Generative AI can accelerate digital transformation in various segments, from manufacturing and public administration to knowledge management and internal innovation. The Solvedio platform confirms that combining cloud infrastructure, asynchronous processing, and intelligent learning can create a flexible solution ready for future expansion in data analytics, predictive planning, and autonomous decision-making.

The platform proves that Generative AI is becoming a key tool for the digital transformation of organizations, providing the speed, accuracy, and scalability needed to manage complex data ecosystems.