Posted by Richy George on 3 October, 2018. This post was originally published on this site.
Palo Alto Networks launched in 2005 in the age of firewalls. As we all know by now, the enterprise expanded beyond the cozy confines of the firewall long ago, and vendors like Palo Alto have moved to securing data in the cloud as well. To that end, the company today announced its intent to pay $173 million for RedLock, an early-stage startup that helps companies make sure their cloud instances are locked down and secure.
The cloud vendors take responsibility for securing their own infrastructure, and for the most part the major vendors have done a decent job. What they can’t do is save their customers from themselves, and that’s where a company like RedLock comes in.
As we’ve seen time and again, data has been exposed in cloud storage services like Amazon S3, not through any fault of Amazon itself, but because a faulty configuration has left the data exposed to the open internet. RedLock watches configurations like this and warns companies when something looks amiss.
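To make the idea concrete, here is a minimal sketch of the kind of configuration check a tool like RedLock automates. The ACL structure mirrors what the AWS S3 `GetBucketAcl` API returns; the function name and example data are illustrative assumptions, not RedLock's actual code.

```python
# Grantee group URIs that make an S3 bucket readable by anyone on the internet.
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def is_publicly_readable(acl: dict) -> bool:
    """Return True if any ACL grant exposes the bucket to the open internet."""
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if (grantee.get("Type") == "Group"
                and grantee.get("URI") in PUBLIC_GRANTEES
                and grant.get("Permission") in {"READ", "FULL_CONTROL"}):
            return True
    return False

# Example: a bucket accidentally granted READ to AllUsers (hypothetical data).
leaky_acl = {
    "Grants": [
        {"Grantee": {"Type": "Group",
                     "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
         "Permission": "READ"},
    ]
}
private_acl = {
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "owner-id"},
         "Permission": "FULL_CONTROL"},
    ]
}
```

A monitoring service runs checks like this continuously across every account it watches and raises an alert when a bucket flips from private to public.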
When the company emerged from stealth just a year ago, Varun Badhwar, the company’s founder and CEO, told TechCrunch that this is part of Amazon’s shared responsibility model. “They have diagrams where they have responsibility to secure physical infrastructure, but ultimately it’s the customer’s responsibility to secure the content, applications and firewall settings,” Badhwar told TechCrunch last year.
Speaking in a video interview about the acquisition, Badhwar said the company has been focused on helping developers build cloud applications safely and securely, whether on Amazon Web Services, Microsoft Azure or Google Cloud Platform. “We think about [RedLock] as guardrails or as bumper lanes in a bowling alley and just not letting somebody get that gutter ball and from a security standpoint, just making sure we don’t deviate from the best practices,” he explained.
“We built a technology platform that’s entirely cloud-based and very quick time to value since customers can just turn it on through API’s, and we love to shine the light and show our customers how to safely move into public cloud,” he added.
He believes that customers will benefit from RedLock’s compliance capabilities being combined with Palo Alto’s analytics capabilities to provide a more complete cloud security solution. It will also fit nicely with Evident.io, a cloud infrastructure security startup the company acquired in March for $300 million.
RedLock launched in 2015 and has raised $12 million. The $173 million purchase would appear to be a great return for the investors who put their faith in the startup.
Posted by Richy George on 2 October, 2018.
Foursquare today announced the partial close of a $33 million Series F financing, with $25 million already closed and another $8 million inbound, according to a company blog post.
The round was co-led by Simon Ventures and Naver Corp, with participation from Union Square Ventures, an existing investor.
Over the past four years, Foursquare has pivoted from a consumer-facing social application to an enterprise platform, giving brands, retailers and ad platforms a way to get accurate, location-based data about their customers and their conversion rates.
Foursquare CEO Jeff Glueck told TechCrunch that more than 90 percent of Foursquare’s revenue comes from the enterprise side of the business. Two of the company’s most popular products are Attribution and the Pilgrim SDK.
With Attribution, Foursquare allows retailers and publishers to effectively track the impact their media has on conversion at offline locations. Using a panel of 25 million non-incentivized users, these brands and retailers can track their own impact, as well as make more informed campaign decisions using insights around foot traffic and visit history of certain demographics.
The Pilgrim SDK, on the other hand, allows brands and partners to deliver highly relevant notifications and other experiences to their own users by leveraging Foursquare’s troves of location data.
Foursquare customers include Tinder, AccuWeather, Spotify, Hilton and iHeartMedia, and that doesn’t include the long list of brands — Uber, Apple, Microsoft, Samsung and Twitter — whose platforms are powered by Foursquare location.
According to Glueck, one of Foursquare’s greatest advantages is that it can offer the same high-level capabilities as competitors such as Facebook and Google, while focusing solely on the value it delivers to partners.
“The success of Google or Facebook or Amazon makes them great companies but unreliable partners,” said Glueck. “The truth about these walled gardens is that they can change their terms and conditions on a whim. They’re not partner-oriented. They’re seeking domination. It’s important for an independent developer community to be able to partner with a company that has the same capabilities.”
Foursquare currently includes more than 100 million places in more than 150 countries on their platform, which powers apps that collectively serve more than 1 billion consumers.
This latest round, which increased the company’s valuation, brings Foursquare’s total funding to $240 million.
Posted by Richy George on 2 October, 2018.
Apple Business Chat launched earlier this year as a way for consumers to communicate directly with businesses on Apple’s messaging platform. Today the company announced it was expanding the program to add new businesses and support for additional countries.
When it launched in January, business partners included Discover, Hilton, Lowe’s and Wells Fargo. Today’s announcement includes the likes of Burberry, West Elm, Kimpton Hotels, and Vodafone Germany.
The program, which remains in beta, added 15 new companies today in the US and 15 internationally, including in the UK, Japan, Hong Kong, Singapore, Canada, Italy, Australia and France.
Since the launch, companies have been coming up with creative ways to interact directly with customers in a chat setting that many users prefer over telephone trees and staticky wait music (I know I do).
For instance, Four Seasons, which launched Business Chat in July, is expanding usage to 88 properties across the globe with the ability to chat in more than 100 languages with reported average response times of around 90 seconds.
Apple previously added features like Apple Pay to iMessage to make it easy for consumers to transact directly with businesses in a fully digital way. If, for instance, your customer service rep helps you find the perfect item, you can purchase it right then and there with Apple Pay without having to supply a credit card in the chat interface.
What’s more, the CSR could share a link, photo or video to let you see more information on the item you’re interested in or to help you fix a problem with an item you already own. All of this can take place in iMessage, a tool millions of iPhone and iPad owners are comfortable using with friends and family.
To interact with Business Chat, customers are given messaging as a choice in contact information. If they touch this option, the interaction opens in iMessage and customers can conduct a conversation with the brand’s CSR, just as they would with friends.
This link to customer service and sales through a chat interface also fits well with the partnership with Salesforce announced last week and with the company’s overall push to the enterprise. Salesforce president and chief product officer, Bret Taylor described how Apple Business Chat could integrate with Salesforce’s Service Bot platform, which was introduced in 2017 to allow companies to build integrated automated and human response systems.
The bots could provide a first level of service and if the customer required more personal support, there could be an option to switch to Apple Business Chat.
Apple Business Chat requires iOS 11.3 or higher.
Posted by Richy George on 2 October, 2018.
Empires rise and fall, and none more so than business empires. Whole industries that once dominated the planet are now just a distant memory, while new industries quietly grow into massive behemoths.
New York City has certainly seen its share of empires. Today, the city is a global center of finance, real estate, legal services, technology, and many, many more industries. It hosts the headquarters of roughly 10% of the Fortune 500, and the metro’s GDP is roughly equivalent to that of Canada.
So much wealth and power, and all under constant attack. The value of technology and data has skyrocketed, and so has the value of stealing and disrupting the services that rely upon it. Cyber crime and cyber wars are adding up: according to a report published jointly by McAfee and the Center for Strategic and International Studies, the costs of these operations run into the hundreds of billions of dollars – and New York’s top industries, such as financial services, bear the brunt of the losses.
Yet, New York City has hardly been a bastion for the cybersecurity industry. Boston and Washington DC are far stronger today on the Acela corridor, and San Francisco and Israel have both made huge impacts on the space. Now, NYC’s leaders are looking to build a whole new local empire that might just act as a bulwark for its other leading ecosystems.
Today, the New York City Economic Development Corporation (NYCEDC) announced the launch of Cyber NYC, a $30 million “catalyzing” investment designed to rapidly grow the city’s ecosystem and infrastructure for cybersecurity.
James Patchett, CEO of NYCEDC, explained in an interview with TechCrunch that cybersecurity is “both an incredible opportunity and also a huge threat.” He noted that “the financial industry has been the lifeblood of this city for our entire history,” and the costs of cybercrime are rising quickly. “It’s a lose-lose if we fail to invest in the innovation that keeps the city strong” but “it’s a win if we can create all of that innovation here and the corresponding jobs,” he said.
The Cyber NYC program is made up of a constellation of initiatives, including the Global Cyber Center, startup investment through Hub.NYC, and a set of educational partnerships.
In addition to Facebook, other companies have made commitments to the program, including Goldman Sachs, MasterCard, PricewaterhouseCoopers, and edX.org. Two Goldman execs, Chief Operational Risk Officer Phil Venables and Chief Information Security Officer Andy Ozment, have joined the initiative’s advisory boards.
The NYCEDC estimates that there are roughly 6,000 cybersecurity professionals currently employed in New York City. Through these programs, it estimates that the number could increase by another 10,000. Patchett said that “it is as close to a no-brainer in economic development because of the opportunity and the risk.”
To tackle its ambitious cybersecurity goals, the NYCEDC is partnering with two venture firms, Jerusalem Venture Partners (JVP) and SOSA, with significant experience investing, operating, and growing companies in the sector.
Jerusalem-based JVP is an established investor that should help founders at Hub.NYC get access to smart capital, sector expertise, and the entrepreneurial experience needed to help their startups scale. JVP invests in early-, late-, and growth-stage companies focused on cybersecurity, big data, media, and enterprise software.
Erel Margalit, who founded the firm in 1993, said that “If you look at what JVP has done … we create ecosystems.” Working with Jerusalem’s metro government, Margalit and the firm pioneered a number of institutions such as accelerators that turned Israel into an economic powerhouse in the cybersecurity industry. His social and economic work eventually led him to the Knesset, Israel’s unicameral legislature, where he served as an MP from 2015 to 2017 with the Labor Party.
Israel is a very small country with a relative dearth of large companies though, a huge challenge for startups looking to scale up. “Today if you want to build the next-generation leading companies, you have to be not only where the ideas are being brewed, but also where the solutions are being [purchased],” Margalit explained. “You need to be working with the biggest customers in the world.”
That place, in his mind, is New York City. It’s a city he has known since his youth – he worked at Moshe’s Moving in NYC while attending Columbia as a grad student, where he got his PhD in philosophy. Now, he can pack up his own success from Israel and scale it up to an even larger ecosystem.
Since its founding, JVP has successfully raised $1.1 billion across eight funds, including a $60 million fund specifically focused on the cybersecurity space. Over the same period, the firm has seen 32 successful exits, including cybersecurity companies CyberArk (IPO in 2014) and CyActive (acquired by PayPal in 2015).
JVP’s efforts in the cybersecurity space also go beyond the investment process, with the firm recently establishing an incubator, known as JVP Cyber Labs, specifically focused on identifying, nurturing and building the next wave of Israeli cybersecurity and big data companies.
On average, the firm has focused on deals in the $5-$10 million range, with a general proclivity for earlier-stage companies where the firm can take a more hands-on mentorship role. Some of JVP’s notable active portfolio companies include Source Defense, which uses automation to protect against website supply chain attacks, ThetaRay, which uses big data to analyze threats, and Morphisec, which sells endpoint security solutions.
A self-described “open-innovation platform,” SOSA is a global network of corporations, investors, and entrepreneurs that connects major institutions with innovative startups tackling core needs.
SOSA works closely with its partner startups, providing investor sourcing, hands-on mentorship and the physical resources needed to achieve growth. The group’s areas of expertise include cybersecurity, fintech, automation, energy, mobility, and logistics. Though headquartered in Tel Aviv, SOSA recently opened an innovation lab in New York, backed by major partners including HP, RBC, and Jefferies.
With the eight-floor Global Cyber Center located in Chelsea, it is turning its attention to an even more ambitious agenda. Uzi Scheffer, CEO of SOSA, said to TechCrunch in a statement that “The Global Cyber Center will serve as a center of gravity for the entire cybersecurity industry where they can meet, interact and connect to the finest talent from New York, the States, Israel and our entire global network.”
With an already established presence in New York, SOSA’s local network could help spur the local corporate participation key to the EDC’s plan, while SOSA’s broader global network can help achieve aspirations of turning New York City into a global cybersecurity leader.
It is no coincidence that both of the EDC’s venture partners are familiar with the Israeli cybersecurity ecosystem. Israel has long been viewed as a leader in cybersecurity innovation and policy, and has benefited from the same successful public-private sector coordination New York hopes to replicate.
Furthermore, while New York hopes to create organic growth within its own local ecosystem, the partnerships could also benefit the city if leading Israeli cybersecurity companies look to relocate due to the limited size of the Israeli market.
While we spent comparatively less time discussing them, the NYCEDC’s educational programs are particularly interesting. Students will be able to take classes at any university in the five-member consortium, and transfer credits freely, a concept that the NYCEDC bills as “stackable certificates.”
Meanwhile, Facebook has partnered with the City University of New York to create a professional master’s degree program to train up a new class of cybersecurity leaders. The idea is to provide a pathway to a widely respected credential without having to take too much time off work. NYCEDC CEO Patchett said, “You probably don’t have the time to take two years off to do a master’s program,” and so the program’s flexibility should provide better access to more professionals.
Together, all of these disparate programs add up to a bold attempt to put New York City on the map for cybersecurity. Talent development, founder development, customer development – all have been addressed with capital and new initiatives.
Yet, despite the time that NYCEDC has spent to put all of these partners together cohesively under one initiative, the real challenge starts with getting the community to participate and build upon these nascent institutions. “What we hear from folks a lot of time,” Patchett said to us, is that “there is no community for cyber professionals in New York City.” Now the buildings have been placed, but the people need to walk through the front doors.
The city wants these programs to be self-sustaining as soon as possible. “In all cases, we don’t want to support these ecosystems forever,” Patchett said. “If we don’t think they’re financially sustainable, we haven’t done our job right.” He believes that “there should be a natural incentive to invest once the ecosystem is off the ground.”
As the world encounters an ever-increasing array of cyber threats, old empires can falter – and new empires can grow. Cybersecurity may well be one of the next great industries, and it may just provide the needed defenses to ensure that New York City’s other empires can live another day.
Posted by Richy George on 29 September, 2018.
The Pentagon is going to make one cloud vendor exceedingly happy when it chooses the winner of the $10 billion, ten-year enterprise cloud project dubbed the Joint Enterprise Defense Infrastructure (or JEDI for short). The contract is designed to establish the cloud technology strategy for the military over the next 10 years as it begins to take advantage of current trends like the Internet of Things, artificial intelligence and big data.
Ten billion dollars spread out over ten years may not entirely alter a market that’s expected to reach $100 billion a year very soon, but it is substantial enough to give a lesser vendor much greater visibility, and possibly a deeper entrée into other government and private-sector business. The cloud companies certainly recognize that.
That could explain why they are tripping over themselves to change the contract dynamics, insisting, maybe rightly, that a multi-vendor approach would make more sense.
One look at the Request for Proposal (RFP) itself, which has dozens of documents outlining various criteria from security to training to the specification of the single award itself, shows the sheer complexity of this proposal. At the heart of it is a package of classified and unclassified infrastructure, platform and support services with other components around portability. Each of the main cloud vendors we’ll explore here offers these services. They are not unusual in themselves, but they do each bring a different set of skills and experiences to bear on a project like this.
It’s worth noting that the DOD isn’t just interested in technical chops; it is also looking closely at pricing, and has explicitly asked for specific discounts to be applied to each component. The RFP process closes on October 12th and the winner is expected to be chosen next April.
What can you say about Amazon? They are by far the dominant cloud infrastructure vendor. They have the advantage of having scored a large government contract in the past when they built the CIA’s private cloud in 2013, earning $600 million for their troubles. Amazon offers GovCloud, the product that came out of that project, designed to host sensitive data.
Many of the other vendors worry that this gives Amazon a leg up on this deal. While five years is a long time, especially in technology terms, if anything, Amazon has tightened control of the market. Heck, most of the other players were just beginning to establish their cloud businesses in 2013. Amazon, which launched in 2006, has a maturity the others lack, and it is still innovating, introducing dozens of new features every year. That makes it difficult to compete with, but even the biggest player can be taken down with the right game plan.
If anyone can take Amazon on, it’s Microsoft. While they were somewhat late to the cloud, they have more than made up for it over the last several years. They are growing fast, yet are still far behind Amazon in terms of pure market share. Still, they have a lot to offer the Pentagon, including a combination of Azure, their cloud platform, and Office 365, the popular business suite that includes Word, PowerPoint, Excel and Outlook email. What’s more, they have a fat contract with the DOD for $900 million, signed in 2016 for Windows and related hardware.
Azure Stack is particularly well suited to a military scenario. It’s a private cloud that lets you stand up a mini version of the Azure public cloud in your own data center, fully compatible with the public cloud in terms of APIs and tools. The company also has Azure Government Cloud, which is certified for use by many of the U.S. government’s branches, including at DOD Impact Level 5. Microsoft brings years of experience working with large enterprises and government clients, meaning it knows how to manage a large contract like this.
When we talk about the cloud, we tend to think of the Big Three. The third member of that group is Google. They have been working hard to establish their enterprise cloud business since 2015 when they brought in Diane Greene to reorganize the cloud unit and give them some enterprise cred. They still have a relatively small share of the market, but they are taking the long view, knowing that there is plenty of market left to conquer.
They have taken an approach of open sourcing a lot of the tools they used in-house, then offering cloud versions of those same services, arguing that no one knows better how to manage large-scale operations than they do. They have a point, and that could play well in a bid for this contract, but they also stepped away from an artificial intelligence contract with the DOD, called Project Maven, when a group of their employees objected. It’s not clear whether that would be held against them in the bidding process here.
IBM has been using its checkbook to build a broad platform of cloud services since 2013 when it bought Softlayer to give it infrastructure services, while adding software and development tools over the years, and emphasizing AI, big data, security, blockchain and other services. All the while, it has been trying to take full advantage of their artificial intelligence engine, Watson.
As one of the primary technology brands of the 20th century, the company has vast experience working with contracts of this scope and with large enterprise clients and governments. It’s not clear if this translates to its more recently developed cloud services, or if it has the cloud maturity of the others, especially Microsoft and Amazon. In that light, it would have its work cut out for it to win a contract like this.
Oracle has been complaining since last spring to anyone who will listen, including reportedly the president, that the JEDI RFP is unfairly written to favor Amazon, a charge that DOD firmly denies. They have even filed a formal protest against the process itself.
That could be a smokescreen, because the company was late to the cloud, took years to take it seriously as a concept, and barely registers today in terms of market share. What it does bring to the table is decades of broad enterprise experience and one of the most popular enterprise databases of the last 40 years.
It recently began offering a self-repairing database in the cloud that could prove attractive to the DOD, but whether its other offerings are enough to help it win this contract remains to be seen.
Posted by Richy George on 27 September, 2018.
VirusTotal, the virus and malware scanning service owned by Alphabet’s Chronicle, launched an enterprise-grade version of its service today. VirusTotal Enterprise offers significantly faster and more customizable malware search, as well as a new feature called Private Graph, which allows enterprises to create their own private visualizations of their infrastructure and the malware that affects their machines.
The Private Graph makes it easier for enterprises to create an inventory of their internal infrastructure and users to help security teams investigate incidents (and where they started). In the process of building this graph, VirusTotal also looks at commonalities between different nodes to detect changes that could signal potential issues.
The company stresses that these graphs are obviously kept private. That’s worth noting because VirusTotal already offered a similar tool for its premium users — the VirusTotal Graph. All of the information there, however, was public.
As for the faster and more advanced search tools, VirusTotal notes that its service benefits from Alphabet’s massive infrastructure and search expertise. This allows VirusTotal Enterprise to offer a 100x speed increase, as well as better search accuracy. Using the advanced search, the company notes, a security team could extract the icon from a fake application, for example, and then return all malware samples that share that same icon.
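The underlying pattern is hash-based pivoting: index every sample by the hash of an embedded artifact (here, an extracted icon), then look up every sample carrying the same artifact. The sketch below is a toy illustration of that idea; all names and data are assumptions, not VirusTotal's actual implementation.

```python
import hashlib

def artifact_hash(data: bytes) -> str:
    """Fingerprint an extracted artifact (e.g. an app icon) by its SHA-256."""
    return hashlib.sha256(data).hexdigest()

class SampleIndex:
    """Toy index mapping artifact hashes to the malware samples containing them."""
    def __init__(self):
        self._by_icon = {}  # icon hash -> list of sample IDs

    def add(self, sample_id: str, icon_bytes: bytes):
        self._by_icon.setdefault(artifact_hash(icon_bytes), []).append(sample_id)

    def samples_with_icon(self, icon_bytes: bytes):
        # Pivot: every sample sharing this exact icon, regardless of file hash.
        return self._by_icon.get(artifact_hash(icon_bytes), [])

index = SampleIndex()
index.add("sample-1", b"fake-bank-icon")
index.add("sample-2", b"fake-bank-icon")   # same icon, different malware build
index.add("sample-3", b"unrelated-icon")
```

At VirusTotal's scale the same lookup runs over billions of samples, which is where Alphabet's search infrastructure comes in.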
VirusTotal says that it plans to “continue to leverage the power of Google infrastructure” and expand this enterprise service over time.
Google acquired VirusTotal back in 2012. For the longest time, the service didn’t see too many changes, but earlier this year, Google’s parent company Alphabet moved VirusTotal under the Chronicle brand and the development pace seems to have picked up since.
Posted by Richy George on 27 September, 2018.
Over the last several months, Dropbox has been undertaking an overhaul of its internal search engine for the first time since 2015. Today, the company announced that the new version, dubbed Nautilus, is ready for the world. The latest search tool takes advantage of a new architecture powered by machine learning to help pinpoint the exact piece of content a user is looking for.
While an individual user may have a much smaller body of documents to search across than the World Wide Web, the paradox of enterprise search says that the fewer documents you have, the harder it is to locate the correct one. Yet Dropbox faces a host of additional challenges when it comes to search. It has more than 500 million users and hundreds of billions of documents, making finding the correct piece for a particular user even more difficult. The company had to take all of this into consideration when it was rebuilding its internal search engine.
One way for the search team to attack a problem of this scale was to bring machine learning to bear on it, but it required more than an underlying level of intelligence to make this work. It also required completely rethinking the entire search tool at an architectural level.
That meant separating the two main pieces of the system: indexing and serving. The indexing piece is, of course, crucial in any search engine. A system of this size and scope needs a fast indexing engine to cover this number of documents amid constantly changing content. This is the piece that’s hidden behind the scenes. The serving side of the equation is what end users see when they query the search engine and the system generates a set of results.
Dropbox described the indexing system in a blog post announcing the new search engine: “The role of the indexing pipeline is to process file and user activity, extract content and metadata out of it, and create a search index.” They added that the easiest way to index a corpus of documents would be to just keep checking and iterating, but that couldn’t keep up with a system this large and complex, especially one that is focused on a unique set of content for each user (or group of users in the business tool).
They account for that in a couple of ways. They create offline builds every few days, but they also watch as users interact with their content and try to learn from that. As that happens, Dropbox creates what it calls “index mutations,” which they merge with the running indexes from the offline builds to help provide ever more accurate results.
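The merge described above can be sketched in a few lines: a slow offline build supplies a base index, a small mutation log captures recent edits and deletions, and queries consult the merged view so fresh activity wins over stale builds. All names here are illustrative, not Dropbox's actual code.

```python
class MergedIndex:
    """Toy model of an offline base index overlaid with live 'index mutations'."""
    def __init__(self, base: dict):
        self.base = base          # doc_id -> tokens, from the offline build
        self.mutations = {}       # doc_id -> tokens, or None for a deletion

    def apply_mutation(self, doc_id, tokens):
        # Record a live change without rebuilding the (expensive) base index.
        self.mutations[doc_id] = tokens

    def lookup(self, doc_id):
        # Fresh data wins over the base; deletions mask base entries.
        if doc_id in self.mutations:
            return self.mutations[doc_id]
        return self.base.get(doc_id)

idx = MergedIndex(base={"doc1": ["tax", "2017"], "doc2": ["trip", "photos"]})
idx.apply_mutation("doc1", ["tax", "2018"])  # user edited doc1 after the build
idx.apply_mutation("doc2", None)             # user deleted doc2
```

Every few days the offline build catches up, absorbs the accumulated mutations, and the log starts over.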
The indexing process has to take into account the textual content, assuming it’s a document, but it also has to look at the underlying metadata as a clue to the content. They use this information to feed a retrieval engine, whose job is to find as many documents as it can, as fast as it can, and worry about accuracy later.
It has to make sure it checks all of the repositories. For instance, Dropbox Paper is a separate repository, so the answer could be found there. It also has to take into account the access-level security, only displaying content that the person querying has the right to access.
Once it has a set of possible results, it uses machine learning to pinpoint the correct content. “The ranking engine is powered by a [machine learning] model that outputs a score for each document based on a variety of signals. Some signals measure the relevance of the document to the query (e.g., BM25), while others measure the relevance of the document to the user at the current moment in time,” they explained in the blog post.
After the system has a list of potential candidates, it ranks them and displays the results for the end user in the search interface, but a lot of work goes into that from the moment the user types the query until it displays a set of potential files. This new system is designed to make that process as fast and accurate as possible.
Posted by Richy George on 26 September, 2018.
Sometimes $10 billion isn’t as much as you think.
It’s true that when you look at the bottom line number of the $10 billion Joint Enterprise Defense Infrastructure (JEDI) cloud contract, it’s easy to get lost in the sheer size of it, and the fact that it’s a one-vendor deal. The key thing to remember as you think about this deal is that while it’s obviously a really big number, it’s spread out over a long period of time and involves a huge and growing market.
It’s also important to remember that the Pentagon has given itself lots of out clauses in the way the contract is structured. This could be important for those who are worried about one vendor having too much power in a deal like this. “This is a two-year contract, with three option periods: one for three years, another for three years, and a final one for two years,” Heather Babb, Pentagon spokeswoman told TechCrunch.
The contract itself has been set up to define the department’s cloud strategy for the next decade. The thinking is that by establishing a relationship with a single vendor, it will improve security and simplify overall management of the system. It’s also part of a broader view of setting technology policy for the next decade and preparing the military for more modern requirements like Internet of Things and artificial intelligence applications.
Many vendors have publicly expressed unhappiness at the winner-take-all, single vendor approach, which they believe might be unfairly tilted toward market leader Amazon. Still, the DOD, which has stated that the process is open and fair, seems determined to take this path, much to the chagrin of most vendors, who believe that a multi-vendor strategy makes more sense.
John Dinsdale, chief analyst at Synergy Research Group, a firm that keeps close tabs on the cloud market, says it’s also important to keep the figure in perspective compared to the potential size of the overall market.
“The current worldwide market run rate is equivalent to approximately $60 billion per year and that will double in less than three years. So in very short order you’re going to see a market that is valued at greater than $100 billion per year – and is continuing to grow rapidly,” he said.
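As a quick sanity check on that claim, a market doubling from roughly $60 billion to $120 billion in three years implies annual growth of about 26 percent. This is a back-of-the-envelope sketch of the arithmetic, not Synergy’s own model:

```python
# Implied compound annual growth rate (CAGR) if a ~$60B/year market
# doubles to ~$120B/year within three years.
start, end, years = 60e9, 120e9, 3
cagr = (end / start) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 26%
```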
Put in those terms, $10 billion over a decade, while surely a significant figure, isn’t quite market-altering if the market-size numbers are right. “If the contract is truly worth $10 billion that is clearly a very big number. It would presumably be spread over many years which then puts it at only a very small share of the total market,” he said.
He also acknowledges that it would be a big feather in the cap of whichever company wins the business, and it could open the door for other business in the government and private sector. After all, if you can handle the DOD, chances are you can handle just about any business where a high level of security and governance would be required.
Final RFPs are now due on October 12th with a projected award date of April 2019, but even at $10 billion, an astronomical sum of money to be sure, it ultimately might not shift the market in the way you think.
Posted by Richy George on 26 September, 2018. This post was originally published on this site.
Instana, an application performance monitoring (APM) service with a focus on modern containerized services, today announced that it has raised a $30 million Series C funding round. The round was led by Meritech Capital, with participation from existing investor Accel. This brings Instana’s total funding to $57 million.
The company, which counts the likes of Audi, Edmunds.com, Yahoo Japan and Franklin American Mortgage as its customers, considers itself an APM 3.0 player. It argues that its solution is far lighter than those of older players like New Relic and AppDynamics (which sold to Cisco hours before it was supposed to go public). Those solutions, the company says, weren’t built for modern software organizations (though I’m sure they would dispute that).
What really makes Instana stand out is its ability to automatically discover and monitor the ever-changing infrastructure that makes up a modern application, especially when it comes to running containerized microservices. The service automatically catalogs all of the endpoints that make up a service’s infrastructure, and then monitors them. It’s also worth noting that the company says it can offer far more granular metrics than its competitors.
Instana says that its annual sales grew 600 percent over the course of the last year, something that surely attracted this new investment.
“Monitoring containerized microservice applications has become a critical requirement for today’s digital enterprises,” said Meritech Capital’s Alex Kurland. “Instana is packed with industry veterans who understand the APM industry, as well as the paradigm shifts now occurring in agile software development. Meritech is excited to partner with Instana as they continue to disrupt one of the largest and most important markets with their automated APM experience.”
The company plans to use the new funding to fulfill the demand for its service and expand its product line.
Posted by Richy George on 25 September, 2018. This post was originally published on this site.
When Salesforce bought Mulesoft last spring for the tidy sum of $6.5 billion, it looked like money well spent for the CRM giant. After all, it was providing a bridge between the cloud and the on-prem data center and that was a huge missing link for a company with big ambitions like Salesforce.
When you want to rule the enterprise, you can’t be limited by where data lives and you need to be able to share information across disparate systems. Partly that’s a simple story of enterprise integration, but on another level it’s purely about data. Salesforce introduced its intelligence layer, dubbed Einstein, at Dreamforce in 2016.
With Mulesoft in the fold, it’s got access to data across systems wherever it lives, in the cloud or on-prem. Data is the fuel of artificial intelligence, and Salesforce has been trying desperately to get more data for Einstein since its inception.
It lost out on LinkedIn to Microsoft, which flexed its financial muscles and reeled in the business social network for $26.5 billion a couple of years ago. That was undoubtedly a rich source of data that the company longed for. Next, it set its sights on Twitter, but after board and stockholder concerns, the company walked away (and Twitter was ultimately never sold, of course).
Each of these forays was about the data, and after those frustrations Salesforce went back to the drawing board. While Mulesoft did not supply the direct cache of data that a social network would have, it did provide a neat way for the company to get at backend data sources, the very type of data that matters most to its enterprise customers.
Today, Salesforce has extended that notion beyond pure data access to a graph. You can probably see where this is going: the idea of a graph, the connections between, say, a buyer and the things they tend to buy, or a person on a social network and the people they tend to interact with, can be extended even to the network/API level. That is precisely the story Salesforce is trying to tell this week at its Dreamforce customer conference in San Francisco.
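To make the graph idea concrete, here is a minimal, purely illustrative sketch in Python. The node names and structure are hypothetical and not Salesforce’s actual data model:

```python
from collections import defaultdict

# An undirected adjacency-list graph linking buyers, products and the
# backend APIs that serve them (all names are illustrative).
graph = defaultdict(set)

def connect(a, b):
    """Record a bidirectional relationship between two nodes."""
    graph[a].add(b)
    graph[b].add(a)

connect("buyer:alice", "product:laptop")            # purchase history
connect("product:laptop", "api:inventory-service")  # network/API level
connect("buyer:alice", "buyer:bob")                 # social connection

# Everything directly connected to one buyer:
print(sorted(graph["buyer:alice"]))  # ['buyer:bob', 'product:laptop']
```

The point of the structure is that the same traversal works whether an edge represents a purchase, a social tie, or an API dependency, which is the extension the article describes.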
Maureen Fleming, program vice president for integration and process automation research at IDC says that it is imperative that organizations view data as a strategic asset and act accordingly. “Very few companies are getting all the value from their data as they should be, as it is locked up in various applications and systems that aren’t designed to talk to each other. Companies who are truly digitally capable will be able to connect these disparate data sources, pull critical business-level data from these connections, and make informed business decisions in a way that delivers competitive advantage,” Fleming explained in a statement.
It’s hard to overstate how valuable this type of data is to Salesforce, which has already put Mulesoft to work internally to help build the new Customer 360 product announced today. It can point to how it’s providing the very type of data integration to which Fleming is referring on its own product set.
Bret Taylor, president and chief product officer at Salesforce, says that for his company all of this is ultimately about enhancing the customer experience. You need to be able to stitch together these different computing environments and data silos to make that happen.
“In the short term, [customer] infrastructure is often fragmented. They often have some legacy applications on premise, they’ll have some cloud applications like Salesforce, but some infrastructure on Amazon or Google and Azure, and to actually transform the customer experience, they need to bring all this data together. And so it’s really a unique time for integration technologies like Mulesoft, because it enables you to create a seamless customer experience, no matter where that data lives, and that means you don’t need to wait for infrastructure to be perfect before you can transform your customer experience.”