GROW YOUR STARTUP IN INDIA
Copilot-generated image by The Tech Panda


The Tech Panda takes a look at recent tech launches.

Data streaming: OEM Program to help partners grow their business with data streaming

Confluent, Inc., the data streaming company, announced the Confluent OEM Program for managed service providers (MSPs), cloud service providers (CSPs), and independent software vendors (ISVs), making it easy to launch and enhance customer offerings with a complete data streaming platform for Apache Kafka® and Apache Flink®. With a license to globally redistribute or embed Confluent’s enterprise-grade platform, partners can bring real-time products and Kafka offerings to market faster and monetize customer demand for data streaming with limited risk. The program makes data streaming a high-margin part of the business with expert implementation guidance and certification to help partners launch enterprise-ready offerings; flexible commercial terms that match the ways partners sell; and ongoing technical support to ensure long-term customer success.

“As data-driven technologies like GenAI become essential to enterprise operations, conversation has shifted from ‘if’ or ‘when’ a business will need data streaming to ‘what’s the fastest, most cost-effective way to get started?’” said Kamal Brar, Senior Vice President, Worldwide ISV and APAC, Confluent. “We help our partners unlock new revenue streams by meeting the growing demand for real-time data within every region they serve. Confluent offers the fastest route to delivering enterprise-grade data streaming, enabling partners to accelerate service delivery, reduce support costs, and minimize overall complexity and risk.”

The Why

The need for real-time data has cemented data streaming as a critical business requirement. According to ISG Software Research, “by 2026, more than three-quarters of enterprises’ standard information architectures will include streaming data and event processing.” To meet this need, teams often turn to popular open source technologies like Kafka and Flink. However, building and maintaining open source software, especially at scale, quickly becomes prohibitively expensive and time-consuming. On average, self-managing Kafka takes businesses more than two years to reach production scale, with ongoing platform development and operational costs exceeding millions of dollars per year. Over time, solutions built with open source Kafka and Flink consume more and more engineering resources, which impacts a business’s ability to focus on differentiation and maintain a competitive advantage.

The Confluent OEM Program alleviates the burdens of self-managing open source technologies while going far beyond just Kafka and Flink. MSPs and CSPs can easily deliver a complete data streaming platform through Confluent, providing a hassle-free solution for unlocking more customer projects across AI, real-time analytics, application modernization and more. ISVs can embed Confluent within their products or applications to cost-effectively power modern customer experiences fueled by real-time data. Confluent simplifies data streaming by eliminating the operational complexities of open source deployments, accelerating delivery times, and ensuring customer success through ongoing expert support. Secure, governed data streams can be available wherever needed—on premises, at the edge, and in the cloud.

Features

  • Design review and development support – Build your data streaming offering with architectural guidance and hands-on development support from Confluent’s team with over 1 million Kafka development hours logged.
  • Speed to market – Accelerate time to value with a complete, ready-to-use data streaming platform including 120+ Kafka connectors, Flink stream processing, enterprise-grade security and data quality controls, and cloud-based monitoring.
  • Confluent certification – Launch confidently with proof that your product or data streaming offering is approved and backed by the industry leader.
  • Flexible commercial terms – Package customer-facing offerings easily with commercial terms that match the way you sell.
  • Expert technical support – Bring committer-led Kafka and Flink support to your business and easily handle any customer question or issue.

Cloud: A try & buy cloud portal for businesses

Rapyder Cloud Solutions, a cloud consulting and services company, launched Rapyder Tech Studio, a cutting-edge platform that lets customers ‘Try & Buy’ Cloud and Generative AI solutions online. The service allows customers to experiment in real time, seamlessly book a proof of concept (POC), and drive innovation faster and smarter. It also simplifies customer interactions by providing an easy, efficient, and intuitive way to explore and acquire the solutions they need. With a comprehensive range of cloud and Generative AI products and services suited to different industries, Tech Studio empowers customers to make informed, business-impacting decisions in just a few clicks.

“We’re thrilled to launch Rapyder Tech Studio, a platform that empowers our customers to easily try and buy cutting-edge Cloud and Gen AI solutions. This initiative reflects our commitment to enhancing their cloud experience. With Tech Studio, customers can experiment in real-time, tailor solutions to their specific needs, and drive innovation to scale efficiently in today’s competitive market,” – Amit Gupta, Founder & CEO, Rapyder Cloud Solutions.

“Rapyder Tech Studio represents a significant leap for us in offering simplified industry-wise Cloud & GenAI solutions. With our try-and-buy portal, customers can now test, customize, and deploy scalable solutions on demand; making cloud adoption seamless and accelerating the deployment of AI-driven applications across industries,” – Athreya Ramadas, Co-founder & CTO, Rapyder Cloud Solutions.

Features

  • Comprehensive product catalogue with ready-to-deploy use cases
  • Industry-wise segregated solutions
  • Do-it-yourself trial studios
  • Streamlined online purchasing of cloud solutions
  • A user-friendly interface designed to enhance customer experience

IT: NGINX One to simplify app security & delivery for dev, ops, & platform teams

F5 (NASDAQ: FFIV) announced the general availability of F5 NGINX One, combining advanced load balancing, web and application server capabilities, API gateway functionalities, and security features in a dedicated package. Customers can now easily manage and secure F5 NGINX instances and NGINX Open Source from a single cloud management interface. End-to-end visibility speeds apps to market and enables advanced features like AI more efficiently than a traditional siloed approach.

“Successful application deployment is a team sport,” said Shawn Wormke, Vice President and General Manager for NGINX at F5. “App delivery and security functions—and corresponding visibility—are often sequestered among individual groups. NGINX One is ideal for modern, ephemeral, and cloud-native app components such as containers and Kubernetes, providing a solution that cost-effectively optimizes, scales, and secures complicated application and API environments across multiple teams.”

The Why

Today’s application teams in the enterprise face the unprecedented difficulty of delivering apps across a wide variety of contexts—from high-performance “bare metal” servers to virtual machines and sprawling Kubernetes clusters, across data centers and in the public cloud. Applying uniform policies for security, compliance, and app delivery configuration has become a challenge in these widely distributed application architectures. For many organizations, maintaining hybrid and multicloud environments adds considerable operational overhead.

NGINX One improves app security and delivery for development, operations, and platform teams by making it easier to own, optimize, and govern NGINX components in any context. With the NGINX One Console, teams can broadly and easily enforce security policies across the application ecosystem, receive and implement configuration guidance, and automate version and patch updates—all helping to ensure compliance.
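Under the hood, the policies and configurations a console like NGINX One manages are ordinary NGINX directives. As a rough illustration only (server names, ports, and certificate paths below are placeholders, not an NGINX One artifact), a minimal TLS-terminating, load-balanced reverse proxy looks like:

```nginx
# Upstream pool of app servers (placeholder hostnames)
upstream app_backend {
    least_conn;                      # route each request to the least-busy server
    server app1.internal:8080;
    server app2.internal:8080;
}

server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/nginx/certs/example.crt;
    ssl_certificate_key /etc/nginx/certs/example.key;

    location /api/ {
        proxy_pass http://app_backend;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Keeping fragments like this consistent across fleets of instances is exactly the drift problem a centralized console is meant to remove.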

Features

  • This new offering makes NGINX technology easier to deploy, unlocking capabilities unmatched by competitors.
  • It consolidates formerly individual offerings such as NGINX Plus into a unified solution, leading to cost savings and simplified deployments.
  • Via the new NGINX One Console, customers can ensure global policy compliance and establish a comprehensive view of NGINX, making it easier for organizations to do the right thing for their teams and their business.
  • Unified visibility across today’s hybrid multicloud environments: Many organizations run both NGINX Plus and NGINX Open Source but have historically not had cross-team views when making changes or adding NGINX instances. NGINX One extends visibility and observability across multiple functions, enabling unified policy enforcement and ensuring that dev teams can scale apps in line with business demands. NGINX One also provides updated visualization tools to clearly monitor and present data to each team in the application delivery workflow, further enhancing compelling use cases such as zero trust initiatives and AI inference solutions.
  • Centralized configuration and management of traffic optimization, security, and scale: As delivering applications becomes more complex, NGINX One provides simplified and centralized management that enables enterprises to innovate more quickly without compromising security or observability. Running alongside F5 Distributed Cloud Services’ growing feature set, the NGINX One Console provides SaaS-based visibility and management so customers can add security and optimization functionality with just a few clicks in an updated GUI. Previously, platform and network operations were often required to perform a series of manual tasks to configure and update instances running in different infrastructure environments.
  • Simplified integration for enhanced app and API performance within the F5 ecosystem: Customers can now benefit from performance advantages across the F5 solution portfolio, putting observability, licensing, and configuration all in one place. This approach enables new app and API security and optimization capabilities to be deployed across both NGINX and Distributed Cloud Services. In addition, NGINX One provides new sets of telemetry and AI capabilities for additional insight into app performance, security, and scaling needs—including surfacing areas for improvement and providing specific recommendations.

Data streaming: Supercharging Apache Flink offering with new developer tools & enterprise-ready security

Confluent, Inc., the data streaming company, introduced new capabilities to Confluent Cloud to make stream processing and data streaming more accessible and secure. Confluent’s new support of Table API makes Apache Flink® available to Java and Python developers; Confluent’s private networking for Flink provides enterprise-level protection for use cases with sensitive data; Confluent Extension for Visual Studio Code accelerates the development of real-time use cases; and Client-Side Field Level Encryption encrypts sensitive data for stronger security and privacy.

“The true strength of using Apache Flink for stream processing empowers developers to create applications that instantly analyze and respond to real-time data, significantly enhancing responsiveness and user experience,” said Stewart Bond, Research Vice President at IDC. “Managed Apache Flink solutions can eliminate the complexities of infrastructure management while saving time and resources. Businesses must look for a Flink solution that seamlessly integrates with the tools, programming languages, and data formats they’re already using for easy implementation into business workflows.”

The Why

More businesses are relying on stream processing to build real-time applications and pipelines for various use cases spanning machine learning, predictive maintenance, personalized recommendations and fraud detection. Stream processing lets organizations blend and enrich their data with information across their business. Apache Flink is the de facto standard for stream processing. However, many teams hit roadblocks with Flink because it’s operationally complex, difficult to secure, and has expensive infrastructure and management costs.

Features

  • Support for Table API enables companies to enhance language flexibility by enabling developers to use their preferred programming languages, taking advantage of language-specific features and custom operations; streamline the coding process by leveraging customers’ integrated development environment (IDE) of choice featuring auto-completion, refactoring tools, and compile-time checks to ensure higher code quality and minimize runtime issues; make debugging easier with an iterative approach to data processing and streamlined CI/CD integration.
  • By enabling private networking for Flink, Confluent users can boost data security and privacy between Flink and Kafka by safeguarding in-transit data and ensuring secure connections between clients and Flink within a private network; simplify secure network configuration, making it easier to set up private connections without requiring extensive networking expertise; facilitate flexible and secure stream processing by seamlessly joining and processing data across different Kafka clusters, ensuring data accessibility while adhering to strict security protocols.
  • Confluent Extension for Visual Studio Code enables teams to streamline topic management to easily create, edit, and browse Kafka topics with intuitive tools that simplify debugging and boost efficiency; code and debug in one place by writing, executing, and debugging Kafka clients, Flink queries, and streaming applications directly in VS Code with enhanced productivity features like code completion; seamlessly manage cloud resources to provision and control Confluent Cloud clusters within VS Code, reducing complexity and streamlining cloud operations.
  • With Client-Side Field Level Encryption teams can improve security of sensitive data and adhere to strict compliance requirements; maintain flexible and granular access control of which specific fields to encrypt; lower total cost of ownership and operational complexity by reducing the need for topic duplication.
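Conceptually, client-side field-level encryption means the producer encrypts only the sensitive fields of a record before it leaves the client, so brokers and downstream consumers without the key see ciphertext for those fields while the rest of the record stays usable. The sketch below shows the pattern in plain Python; the XOR “cipher” and field names are illustrative stand-ins (Confluent’s actual feature uses real ciphers and managed keys), not Confluent’s implementation:

```python
import base64
import secrets

# Toy stand-in for a real cipher: XOR with a random pad. For illustration
# only -- a production system would use an AEAD cipher with KMS-managed keys.
def toy_encrypt(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))  # XOR twice round-trips

def encrypt_fields(record: dict, sensitive: set, key: bytes) -> dict:
    """Encrypt only the listed fields; everything else stays readable."""
    out = {}
    for name, value in record.items():
        if name in sensitive:
            out[name] = base64.b64encode(toy_encrypt(value.encode(), key)).decode()
        else:
            out[name] = value
    return out

key = secrets.token_bytes(64)
record = {"order_id": "A-1001", "card_number": "4111111111111111"}
protected = encrypt_fields(record, {"card_number"}, key)
# "order_id" remains usable for routing and joins; "card_number" is opaque.
```

Because only chosen fields are encrypted, the same topic can serve consumers with and without decryption rights, which is what removes the need for topic duplication.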

Gaming: Launch & publishing of two games: Warbound & Laser Tanks

DeVC-led Felicity Games, a casual game developer and publisher, has tied up with AbhiTech Games, a game development studio, to launch and publish two games: Warbound and Laser Tanks. Through this partnership, AbhiTech Games will gain access to Felicity’s 1 million users, while Felicity aims to grow 3x and almost double its monthly unique users by Q2 2025.

The studio aims to scale the games with the help of ‘Pokhran’, a proprietary framework that rapidly prototypes and tests casual games for commercial viability in partnership with Indian game developers.

Anurag Choudhary, Founder & CEO of Felicity Games stated, “Through this partnership combining our expertise in casual game publishing with AbhiTech Games’s award-winning game development, we aim to not only expand our audience but also carry forward a shared vision. The innovative Pokhran framework will surely enhance our growth and create unforgettable experiences for players everywhere.”

Abhishek Singh Rana, Indie Developer and Founder at AbhiTech Games, also stated, “The Pokhran framework will be a game-changer for us, helping us reach more players, achieve more downloads, and most importantly deliver memorable gaming experiences. Partnering with Felicity Games will allow us to take our games to the next level and grow alongside a company that shares our passion for innovation.”

AI: Tools to stop AI bots from plundering creators’ original content

Cloudflare, Inc. (NYSE: NET), the connectivity cloud company, announced AI Audit, a set of tools to help websites of any size analyze and control how their content is used by artificial intelligence (AI) models. For the first time, website and content creators will be able to quickly and easily understand how AI model providers are using their content, and then take control of whether and how the models are able to access it. Additionally, Cloudflare is developing a new feature where content creators can reliably set a fair price for their content that is used by AI companies for model training and retrieval augmented generation (RAG).

“AI will dramatically change content online, and we must all decide together what its future will look like,” said Matthew Prince, co-founder and CEO, Cloudflare. “Content creators and website owners of all sizes deserve to own and have control over their content. If they don’t, the quality of online information will deteriorate or be locked exclusively behind paywalls. With Cloudflare’s scale and global infrastructure, we believe we can provide the tools and set the standards to give websites, publishers, and content creators control and fair compensation for their contribution to the Internet, while still enabling AI model providers to innovate.”

The Why

Website owners, whether for-profit companies, media and news publications, or small personal sites, may be surprised to learn that AI bots of all types are scanning their content thousands of times every day without the content creator knowing or being compensated, destroying significant value for businesses large and small. Even when website owners are aware of how AI bots are using their content, they lack a sophisticated way to determine what scanning to allow and a simple way to take action. For society to continue to benefit from the depth and diversity of content on the Internet, content creators need the tools to take back control.

With AI Audit, Cloudflare aims to give content creators the information they need to take back control, enabling a transparent exchange between websites that want greater control over their content and AI model providers in need of fresh data sources, so that everyone benefits.

Features

  • Automatically control AI bots, for free: Cloudflare allows creators and website owners to block all AI bots in one click – putting them back in control. 
  • Tap into analytics to see how AI bots access their content: Every site using Cloudflare now has access to analytics to understand why, when, and how often AI models access their website. (This is often happening thousands of times a day, even on personal blogs).
  • Better protect their rights when negotiating with model providers: For sites signing agreements with model providers to license the training and retrieval of content, Cloudflare’s analytics help site owners audit and understand metrics that are common in these contracts, such as the rate of crawling.
  • Fair Compensation: Over the next year, Cloudflare will help website owners determine the compensation they believe they should receive from AI model providers for the right to scan their content.
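At its simplest, the one-click block behaves like a User-Agent screen: requests whose User-Agent matches a known AI crawler are refused. The sketch below illustrates that idea; the token list is a small representative sample of publicly documented crawler names, and Cloudflare’s real enforcement also draws on its managed bot signatures and behavioural signals rather than User-Agent strings alone:

```python
# Representative AI-crawler User-Agent tokens (publicly documented names;
# a real bot list is larger and maintained automatically).
AI_BOT_TOKENS = ("GPTBot", "ClaudeBot", "CCBot", "Google-Extended", "Bytespider")

def is_ai_bot(user_agent: str) -> bool:
    """Return True if the request's User-Agent matches a known AI crawler."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in AI_BOT_TOKENS)

def handle_request(user_agent: str, block_ai: bool = True) -> int:
    # Mimics the one-click block: 403 for AI crawlers, 200 for everyone else.
    if block_ai and is_ai_bot(user_agent):
        return 403
    return 200
```

Toggling `block_ai` corresponds to the site owner’s single switch; everything else is handled for them.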

AI: Zubin, an AI-powered self-service data management software

Data Dynamics, an enterprise data management solutions provider, expanded into India with the establishment of its Centre of Excellence in Pune, Maharashtra. In tandem, the company unveiled Zubin, a groundbreaking AI-powered self-service data management software set to redefine how organizations approach risk management, privacy, sovereignty, optimization, and sustainability. Zubin pioneers an industry-first DIY (Do It Yourself) approach to managing data that puts data ownership, control, and action directly into the hands of data creators.

“In a world consumed by AI use cases and implementation, providing transparency in data management is critical for establishing digital trust between enterprises and their customers,” said Piyush Mehta, CEO of Data Dynamics. “At Data Dynamics, we approach data with the highest level of respect—ensuring that every byte is managed responsibly and that ownership is returned to those who create it. Zubin embodies this philosophy, it fosters a culture of ownership and accountability, placing the power of data management directly in the hands of users. Zubin is THE strategic enabler of digital trust, data sovereignty, and data democracy, guiding organizations through the complexities of this AI age with confidence and clarity. The Pune Center of Excellence brings our data management solutions closer to the demographic that is generating data at a rapid pace, has already tabled a data protection policy, and is at the forefront of IT development and innovation.”

The Why

This launch comes at a critical juncture, perfectly aligned with the worldwide focus on data sovereignty, ethical AI, and data privacy, championing the core principles of digital trust and data democracy. As organizations grapple with the growing influence of AI, projected to add $15.7 trillion to the global economy by 2030, the demand for data privacy is reshaping the landscape, creating both opportunities and risks. Businesses now face the strategic challenge of balancing customer expectations, regulatory obligations, and the need for innovation. The challenge is intensified as 80% of enterprise data is unstructured and unmanaged, making data accuracy a monumental task for AI modeling, particularly given LLMs’ reliance on unstructured data.

Zubin addresses these challenges by empowering organizations to centralize data governance and decentralize data control, enabling central IT to set tailored data policies while simultaneously giving stakeholders at all levels ownership of, and control over, the data they create.

IDC’s Spotlight Report on Rethinking Data Security further validates this approach, stating that “Self-service data management tools designed to integrate security capabilities and prioritize privacy and protection will be invaluable for organizations seeking to maximize data value without compromising security.”

Features

  • Zubin pioneers an industry-first ‘data democracy by design’ approach using a DIY (Do it Yourself) self-service framework that places data ownership, control, and action directly in the hands of data and application owners. This approach embeds transparency, accountability, and responsible stewardship at the core of an organization’s data management operations.
  • From the C-suite to data owners, the software provides every user with the ability to discover, define, act on, transform, and audit data through an intuitive, self-service, low-code interface. This innovative approach redefines traditional data management, which is complex and siloed. Zubin brings consistency, coherence, and standardization across the organization, delivering granular insights, recommending workflows, automating actions, and ensuring security and governance through role-based access control (RBAC) processes. This creates a unified, seamless data management process and paves the way for a larger vision: empowering Citizen Data Rights.
  • Zubin’s six key capabilities make it indispensable for modern data management: Data Privacy in AI, Risk and Access Management, Data Sovereignty, Data Owner Empowerment, Role Based Views of Your Data, and Data Management in a Hybrid Cloud. These collectively transform how organizations handle their data, ensuring it is secure, compliant, and optimized for the diverse demands of the digital age.

Gen AI: GenAI Gateway platform, powered by AWS, accelerating the company’s safe adoption of generative AI at scale

BT Group’s Digital Unit launched an innovative internal platform to help the company tap into the power of large language models (LLMs) from providers such as Anthropic, Meta, Cohere, and Amazon. The GenAI Gateway, built in collaboration with AWS and using Amazon Bedrock, Amazon SageMaker and AWS Professional Services capabilities, provides secure, private access to a range of natural-language processing and large language models, a critical tool BT Group will use as it embeds AI into the way it runs the business.

Fabio Cerone, GM EMEA Telco at AWS, said: “The BT Group GenAI Gateway is showcasing how enterprises can effectively deploy generative AI at scale and speed. It’s been a brilliant, pioneering opportunity to collaborate and work backwards from the customer to provide a way to accelerate deployment of generative AI use cases into production with embedded security and compliance. The GenAI Gateway will trigger the flywheel effect in the adoption of generative AI, delivering quicker results for BT Group and its customers.”

Deepika Adusumilli, Managing Director, Data & AI, BT Group’s Digital Unit said: “AI is helping us reimagine the future of our company. We believe that where our data is a constant, we need flexibility with our LLMs. GenAI Gateway allows us to tap into this powerful new set of technologies at scale, in a way that is safe, responsible, flexible and scalable, delivering the ambition we have for AI to unlock the human potential within BT Group, today and in the future.”

The Why

Ad-hoc use of LLMs, whilst appropriate for test and development work, is not well suited to large scale use; cost control, security and privacy need more careful management. LLM performance also needs to be monitored, for unexpected errors (e.g. “hallucinations”) and model decay over time (where LLMs stop behaving as expected).

Features

  • The platform supports prompt security, central privacy controls, per-use-case billing, enterprise search, and the use of multiple corporate data sources, enabling the company to be flexible and responsible in its use of different AI models
  • Ethics and performance are enabled through embedded guardrails that limit the risk of exploitation of the models (“jailbreak risk”)
  • A single consolidated Group platform reduces duplication of effort and resources as BT Group scales and accelerates the adoption of generative AI as APIs, security configuration, infrastructure management etc., can be managed centrally, reducing the risk of error along the way
  • The GenAI Gateway also gives BT Group protection against ‘lock-in’ to any given LLM should issues emerge. The platform will encourage BT Group engineers to use the right model for the right use case, at the right price, as it supports per-use-case budget tracking.
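Per-use-case budget tracking of the kind described above can be pictured as a small ledger inside the gateway that prices each call and refuses requests that would push a use case over its budget. Everything in the sketch below, including the model names and per-token prices, is hypothetical and illustrative, not BT Group’s or AWS’s actual implementation or rates:

```python
from dataclasses import dataclass, field

# Hypothetical per-1K-token prices for two illustrative models.
PRICE_PER_1K_TOKENS = {"model-small": 0.002, "model-large": 0.02}

@dataclass
class GatewayBudget:
    limits: dict                          # use case name -> budget in dollars
    spent: dict = field(default_factory=dict)

    def authorize(self, use_case: str, model: str, tokens: int) -> bool:
        """Refuse the call if it would push the use case over its budget."""
        cost = PRICE_PER_1K_TOKENS[model] * tokens / 1000
        already = self.spent.get(use_case, 0.0)
        if already + cost > self.limits[use_case]:
            return False                  # over budget: block before the LLM call
        self.spent[use_case] = already + cost
        return True

budget = GatewayBudget(limits={"support-chat": 1.00})
budget.authorize("support-chat", "model-large", 40_000)   # within budget
budget.authorize("support-chat", "model-large", 20_000)   # would exceed the cap
```

Centralizing this check in one gateway is what makes per-use-case billing and model swapping possible without touching each consuming application.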

Web3: A software development kit to connect Telegram Mini-Apps to Bitget Wallet

Bitget Wallet, a Web3 non-custodial wallet, launched OmniConnect, a software development kit that enables developers to seamlessly connect Telegram Mini-Apps to multichain ecosystems across a multitude of blockchains, including major networks like Solana, TON, and all EVM-compatible chains. The integration allows Telegram Mini-Apps to use Bitget Wallet for signing and conducting transactions across multiple blockchain networks.

Alvin Kan, COO of Bitget Wallet, highlighted the importance of this development, stating, “Previously, Telegram Mini-Apps could only interact with the TON network, making it difficult to engage with other public chains. Bitget Wallet’s OmniConnect aims to bridge this gap, enabling seamless multi-chain interaction via Bitget Wallet. We’re excited for more developers and blockchain ecosystems to join us in building a more open and thriving Web3 environment on Telegram.”

Features

  • This release signifies a major leap in the integration of Web3 ecosystems with Telegram, offering over a billion Telegram users and developers a simplified, efficient way to interact with multiple blockchains.
  • By integrating with Bitget Wallet, Telegram transforms into a comprehensive gateway to Web3, facilitating a smoother transition from Web2.
  • The Telegram Mini-Apps play a crucial role in onboarding new users to Web3, offering an accessible entry point for individuals who have not previously interacted with decentralized technologies. This aligns with Bitget Wallet’s vision to connect a billion users from social platforms to the entire Web3 world, forming a core part of the broader Bitget Onchain Layer strategy.

Cleantech: India’s first indirect potable water reuse project through managed aquifer recharge

Boson Whitewater, a water utility company that converts STP-treated water into high-quality potable water, has collaborated with Biome Environmental Trust for India’s first indirect potable water reuse project through managed aquifer recharge at Devanahalli, Karnataka. The project produces 640,000 litres of potable drinking water per day, adhering to BIS-10500 drinking water standards. The clean water now directly benefits thousands of residents in the Devanahalli municipality.

Vishwanath S, Advisor, Biome Environmental Trust, said, “Devanahalli town relies heavily on deep borewells for its water supply. Through this project, we aim to revive the local lake, recharge groundwater, and explore how a town can become self-sufficient by using both local water sources and treated wastewater. The project has the capacity to meet Devanahalli’s 5.4 MLD (Million litres per day) water demand. In Phase 1, a water treatment plant was installed to provide 240 KL (Kilo litres) of water daily. In Phase 2, the project expanded with the addition of four more filter borewells, a reconstructed 60 KL sump, and a new 400 KLD water treatment plant. The system now delivers 640 KL of water daily, benefiting the Devanahalli residents”.

The Why

The project is part of a broader effort to rejuvenate 65 lakes in Bengaluru by using treated wastewater and rainwater. It involves reviving an old well and digging borewells to access the aquifer, along with the installation of water treatment plants in two phases. Now, the system provides 640 KL of water daily, helping supplement the domestic water requirement of Devanahalli town and its 45,000 residents. This was made possible through the collaboration of several organisations, including Carl Zeiss, Rotary South Parade Bangalore, and the Wipro Foundation.

Features

  • The project is energy-efficient, using just 0.25 units of electricity per 1,000 litres, which is among the lowest in India.
  • It follows the AMRUT 2.0 guidelines and serves as a model for future urban water management.
  • As part of this project, treated wastewater from the sewage treatment plant is first pumped into Bagalur Lake, where it is diluted with rainwater. It is then directed to Devanahalli’s Sihineerukere Lake, further diluted with rainwater, and subsequently filtered through the earth to recharge the aquifer. The water is then picked up from the aquifer through a dug well and shallow filter borewells, treated, and then supplied to the town.
  • Indirect potable reuse involves using an environmental buffer, such as a lake for dilution with rainwater and/ or groundwater aquifer for earth filtration, before the water undergoes final treatment at a drinking water facility.
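The article’s figures can be sanity-checked with simple arithmetic, assuming a “unit” of electricity means one kWh, as is usual in Indian utility billing: 640 KL/day at 0.25 kWh per 1,000 litres works out to 160 kWh per day, and the plant covers roughly 12% of the town’s 5.4 MLD demand.

```python
# Back-of-the-envelope check using the figures quoted in the article.
daily_output_litres = 640_000          # 640 KL per day
energy_per_1000_l = 0.25               # units (assumed kWh) per 1,000 litres
town_demand_litres = 5_400_000         # 5.4 MLD stated demand

daily_energy_kwh = daily_output_litres / 1000 * energy_per_1000_l
demand_share = daily_output_litres / town_demand_litres
```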
