Docker enters GenAI development fray with new tools
Docker threw its hat into the generative AI development ring this week with a set of pre-configured developer tools and a new Docker AI tool that automatically offers developers guidance on container configurations.
Docker is banking on the wide use of its containers among generative AI model makers and hosting providers such as Hugging Face and OpenAI, as well as on its installed base of millions of active developers, said Scott Johnston, CEO of Docker.
“We hear from the developer community, ‘I was excited about [generative] AI … you can’t miss it today in the headlines,'” Johnston said. “But we hear again and again … ‘How do I get started? What do I need to do? Are these safe technologies? I don’t want to be dependent on an external service. I just want to iterate locally on my laptop.'”
Docker’s GenAI Stack, which shipped this week in the Learning Center of Docker Desktop and became available in a GitHub repository, prepackages a set of tools for application developers to do exactly that, Johnston said.
The GenAI Stack uses Docker Compose to bring together a set of basic development tools, including pre-configured large language models (LLMs) supported by Ollama for local development, a graph database by Neo4j, and the LangChain generative AI development framework. It also includes a pre-built set of reference applications – one a generative AI assistant, and the other a retrieval augmented generation app.
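For developers who want a concrete picture of how those pieces fit together, the following is a minimal, illustrative sketch of such a retrieval-augmented generation chain in Python. It is not Docker's reference code; it assumes LangChain's Ollama and Neo4jVector integrations, and the model name, credentials and sample text are placeholder assumptions.

```python
# Illustrative RAG sketch in the spirit of the GenAI Stack: a local Ollama LLM,
# a Neo4j vector index, and LangChain wiring them together.
from langchain.llms import Ollama
from langchain.embeddings import OllamaEmbeddings
from langchain.vectorstores import Neo4jVector
from langchain.chains import RetrievalQA

# Local model served by the Ollama container (placeholder model name).
llm = Ollama(base_url="http://localhost:11434", model="llama2")
embeddings = OllamaEmbeddings(base_url="http://localhost:11434", model="llama2")

# Index a few documents as vectors in Neo4j (placeholder connection details).
vector_store = Neo4jVector.from_texts(
    ["Containers package an application with its dependencies.",
     "Docker Compose describes multi-container applications in YAML."],
    embedding=embeddings,
    url="bolt://localhost:7687",
    username="neo4j",
    password="password",
)

# Retrieval-augmented QA: fetch relevant chunks from Neo4j, then ask the local LLM.
qa_chain = RetrievalQA.from_chain_type(llm=llm, retriever=vector_store.as_retriever())
print(qa_chain.run("What does Docker Compose do?"))
```

Running a sketch like this presumes the Ollama and Neo4j containers from the stack, or equivalents, are reachable on their default ports.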
“The intent was, ‘let’s just get the developer to an ‘aha moment’ as quickly as possible that gives them confidence to keep going,'” Johnston said.
Docker also added its own virtual assistant to its suite of developer productivity tools with Docker AI, which automatically generates best-practice recommendations for developers as they configure and debug container-based applications. The tool ties into a new Docker Debug tool, also released this week, which brings container and application debugging tools for various programming languages together into a centralized workflow.
There are plenty of virtual assistants on the market, Johnston acknowledged, but relatively few cover container files and libraries outside the application itself. Docker AI aims to fill that gap, drawing on data gathered from some 20 million daily active Docker developers to recommend best practices, he said.
It is early days for generative AI development and enterprise adoption of the technology, although experimentation is growing and interest is strong, according to recent research from TechTarget's Enterprise Strategy Group. However, concerns about security and governance are also on the minds of IT pros as they evaluate tools, and while Docker's GenAI Stack plays to those concerns, one analyst wondered about the data used to feed Docker AI.
“It does make a lot of sense that they’re collecting all of that data, but I wonder what happens if the customers now see that you’re monetizing their data, and they go and turn it off, and they’re not sending it home anymore?” said Rob Strechay, lead analyst at enterprise tech media company TheCube. “How does that impact that service?”
Competition in the emerging area of MLSecOps is not yet widespread, but at least one other vendor, JFrog, is also working to bring LLMs and generative AI into its existing DevSecOps tools. Docker also partners with JFrog for its newly generally available Docker Scout product, but there is some overlap between the two vendors’ tools, said Katie Norton, an analyst at IDC.
“JFrog is a little more focused on bringing model development and security close to DevOps processes. … Docker is a bit further left in the development stage before the build,” she said. “JFrog’s capabilities are bringing the binaries into Artifactory and helping them move through the same DevSecOps processes as any other binary.”
As with Docker’s other developer productivity product releases this week, the GenAI Stack potentially saves developers time, which could be valuable for enterprises that want to develop their own generative AI applications, according to Andy Thurai, an analyst at Constellation Research.
“Given that data scientists are not well versed on those [tools], they have to depend on machine learning engineers to set this all up, which can consume a lot of time,” Thurai said. “I see this as an opportunity to produce models from a pre-configured baseline quickly.”
However, Neo4j being the only vector database included in the GenAI Stack represents a potential limitation for some early adopters, according to both Thurai and Strechay. Making it easy to swap in other databases, as well as LLMs beyond those supported by Ollama, would broaden the GenAI Stack's appeal, they said.
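Because LangChain puts vector stores behind a common interface, such a swap is, in principle, a small change in application code. A hedged illustration, reusing the objects from the earlier sketch and assuming LangChain's Chroma integration:

```python
# Hypothetical swap: use an in-process Chroma store instead of the Neo4j
# vector index; the retrieval chain around it stays the same.
from langchain.vectorstores import Chroma

vector_store = Chroma.from_texts(
    ["Containers package an application with its dependencies."],
    embedding=embeddings,  # same OllamaEmbeddings instance as before
)
qa_chain = RetrievalQA.from_chain_type(llm=llm, retriever=vector_store.as_retriever())
```

The analysts' point is about packaging rather than application code: how easily alternatives fit into the pre-configured stack itself.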
For now, it’s not clear where Docker will go with the GenAI Stack or how it might address the company’s need to maintain its growth long term, Strechay said.
“I think it’s a great marketing tool,” he said. “But I don’t know how far it gets from being a proof of concept … [or] how far into the CI/CD cycle can you get with Docker alone, versus moving to one of the other tool chains for that?”
Beth Pariseau, senior news writer at TechTarget, is an award-winning veteran of IT journalism. She can be reached at [email protected] or on Twitter @PariseauTT.