Monthly Archives: October 2018

Anaplan hits the ground running with strong stock market debut up over 42 percent

Posted on 12 October, 2018

You might think that Anaplan CEO Frank Calderoni would have had a few sleepless nights this week. His company picked a bad week to go public, as market instability rocked tech stocks. Still, he wasn’t worried, and today the company had by any measure a successful debut, with the stock soaring more than 42 percent. As of 4 pm ET, it hit $24.18, up from the IPO price of $17. Not a bad way to launch your company.

Stock Chart: Yahoo Finance

“I feel good because it really shows the quality of the company, the business model that we have and how we’ve been able to build a growing successful business, and I think it provides us with a tremendous amount of opportunity going forward,” Calderoni told TechCrunch.

Calderoni joined the company a couple of years ago, and seemed to emerge from Silicon Valley central casting: a former CFO at Red Hat and Cisco, with stints at IBM and SanDisk along the way. He said he often wished that a tool like Anaplan had been around when he was in charge of a several-thousand-person planning operation at Cisco. While that operation was successful, he indicated, it could have been even more so with such a tool.

“The planning phase has not had much change in several decades. I’ve been part of it and I’ve dealt with a lot of the pain. And so having something like Anaplan, I see it really being a disrupter in the planning space because of the breadth of the platform that we have. And then it goes across organizations to sales, supply chain, HR and finance, and as we say, really connects the data, the people and the plan to make for better decision making as a result of all that,” he said.

Calderoni describes Anaplan as a planning and data analysis tool. In his previous jobs, he says, he spent a ton of time just gathering data and making sure it was the right data, but precious little time on analysis. In his view, Anaplan lets companies concentrate more on the crucial analysis phase.

“Anaplan allows customers to really spend their time on what I call forward planning where they can start to run different scenarios and be much more predictive, and hopefully be able to, as we’ve seen a lot of our customers do, forecast more accurately,” he said.

Anaplan was founded in 2006 and raised almost $300 million along the way. It achieved a lofty valuation of $1.5 billion in its last round, a $60 million raise in 2017. The company has just under 1,000 customers, including Del Monte, VMware, Box and United.

Calderoni says that although the company already does 40 percent of its business outside the US, there are plenty of markets left to conquer, and it hopes to use today’s cash infusion in part to continue expanding into a worldwide company.

Posted Under: Tech News
IBM files formal JEDI protest a day before bidding process closes

Posted on 12 October, 2018

IBM announced yesterday that it has filed a formal protest with the U.S. Government Accountability Office over the structure of the Pentagon’s winner-take-all $10 billion, 10-year JEDI cloud contract. The protest came just a day before the bidding process is scheduled to close. As IBM put it in a blog post, it takes issue with the single-vendor approach. It is certainly not alone.

Just about every vendor short of Amazon, which has remained mostly quiet, has been complaining about this strategy. IBM certainly faces a tough fight going up against Amazon and Microsoft.

IBM doesn’t disguise the fact that it thinks the contract has been written for Amazon to win, and it believes the one-vendor approach simply doesn’t make sense. “No business in the world would build a cloud the way JEDI would and then lock in to it for a decade. JEDI turns its back on the preferences of Congress and the administration, is a bad use of taxpayer dollars and was written with just one company in mind,” IBM wrote in the blog post explaining why it was protesting the deal before a decision was made or the bidding had even closed.

For the record, DOD spokesperson Heather Babb told TechCrunch last month that the bidding is open and no vendor is favored. “The JEDI Cloud final RFP reflects the unique and critical needs of DOD, employing the best practices of competitive pricing and security. No vendors have been pre-selected,” she said.

Much like Oracle, which filed a protest of its own back in August, IBM is a traditional vendor that was late to the cloud. It began a journey to build a cloud business in 2013 when it purchased Infrastructure as a Service vendor SoftLayer and has been using its checkbook to buy software services to add on top of SoftLayer ever since. IBM has concentrated on building cloud services around AI, security, big data, blockchain and other emerging technologies.

Both IBM and Oracle have a problem with the one-vendor approach, especially one that locks in the government for a 10-year period. It’s worth pointing out that the contract is actually an initial two-year deal with two additional three-year options and a final two-year option (2 + 3 + 3 + 2 = 10 years in all). The DOD has left open the possibility that this might not go the entire 10 years.

It’s also worth putting the contract in perspective. While 10 years and $10 billion is nothing to sneeze at, neither is it as market-altering as it might appear, not when some are predicting the cloud will be a $100 billion-a-year market very soon.

IBM uses the blog post as a kind of sales pitch for why it’s a good choice, while at the same time pointing out the flaws in the single-vendor approach and complaining that the contract is geared toward a single unnamed vendor that we all know is Amazon.

The bidding process closes today, and unless something changes as a result of these protests, the winner will be selected next April.

Posted Under: Tech News
New Relic acquires Belgium’s CoScale to expand its monitoring of Kubernetes containers and microservices

Posted on 11 October, 2018

New Relic, a provider of analytics and monitoring around a company’s internal- and external-facing apps and services to help optimise their performance, is making an acquisition today as it continues to expand a newer area of its business: containers and microservices. The company has announced that it has purchased CoScale, a provider of monitoring for containers and microservices, with a specific focus on Kubernetes.

Terms of the deal — which will include the team and technology — are not being disclosed, as it will not have a material impact on New Relic’s earnings. The larger company is traded on the NYSE (ticker: NEWR), has been on a strong upswing in the last two years, and its current market cap is around $4.6 billion.

Originally founded in Belgium, CoScale had raised $6.4 million and was last valued at $7.9 million, according to PitchBook. Investors included Microsoft (via its ScaleUp accelerator), PMV and the Qbic Fund, two Belgian investors.

“We are thrilled to bring CoScale’s knowledge and deeply technical team into the New Relic fold,” noted Ramon Guiu, senior director of product management at New Relic. “The CoScale team members joining New Relic will focus on incorporating CoScale’s capabilities and experience into continuing innovations for the New Relic platform.”

The deal underscores how New Relic has had to shift in the last couple of years: when the company was founded years ago, application monitoring was a relatively easy task, with the web and a specified number of services the limit of what needed attention. But services, apps and functions have become increasingly complex and now tap data stored across a range of locations and devices, and processing everything generates a lot of computing demand.

New Relic first added container and microservices monitoring to its stack in 2016. That was a somewhat late arrival to the area, but New Relic CEO Lew Cirne believes it came at just the right time, dovetailing New Relic’s changes with wider shifts in the market.

“We think those changes have actually been an opportunity for us to further differentiate and further strengthen our thesis that the New Relic way is really the most logical way to address this,” he told my colleague Ron Miller last month. As Ron wrote, Cirne’s take is that New Relic has always been centered on the code, as opposed to the infrastructure where it’s delivered, and that has helped it make adjustments as the delivery mechanisms have changed.

New Relic already provides monitoring for Kubernetes, Google Kubernetes Engine (GKE), Amazon Elastic Container Service for Kubernetes (EKS), Microsoft Azure Kubernetes Service (AKS) and Red Hat OpenShift, and the idea is that CoScale will help it ramp up across that range, while also adding Docker and OpenShift to the mix, as well as offering new services down the line to serve the DevOps community.
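
For a sense of what this kind of container monitoring looks like from the outside, here is a minimal sketch that pulls Kubernetes container metrics out of New Relic through its Insights query API. It assumes the Kubernetes integration is reporting the K8sContainerSample event type; the account ID and query key below are placeholders.

```python
import requests

ACCOUNT_ID = "1234567"        # placeholder New Relic account ID
QUERY_KEY = "YOUR-QUERY-KEY"  # placeholder Insights query key

# NRQL: average CPU used per Kubernetes deployment over the last 30 minutes.
nrql = ("SELECT average(cpuUsedCores) FROM K8sContainerSample "
        "FACET deploymentName SINCE 30 minutes ago")

resp = requests.get(
    f"https://insights-api.newrelic.com/v1/accounts/{ACCOUNT_ID}/query",
    headers={"X-Query-Key": QUERY_KEY, "Accept": "application/json"},
    params={"nrql": nrql},
    timeout=10,
)
resp.raise_for_status()

# Faceted NRQL results come back as one entry per deployment.
for facet in resp.json().get("facets", []):
    print(facet["name"], facet["results"][0]["average"])
```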

“The visions of New Relic and CoScale are remarkably well aligned, so our team is excited that we get to join New Relic and continue on our journey of helping companies innovate faster by providing them visibility into the performance of their modern architectures,” said CoScale CEO Stijn Polfliet, in a statement. “[Co-founder] Fred [Ryckbosch] and I feel like this is such an exciting space and time to be in this market, and we’re thrilled to be teaming up with the amazing team at New Relic, the leader in monitoring modern applications and infrastructure.”

Posted Under: Tech News
Zuora partners with Amazon Pay to expand subscription billing options

Posted on 11 October, 2018

Zuora, the SaaS company helping organizations manage payments for subscription businesses, announced today that it had been selected as a Premier Partner in the Amazon Pay Global Partner Program. 

The “Premier Partner” distinction means businesses using Zuora’s billing platform can now easily integrate Amazon’s digital payment system as an option during checkout or recurring payment processes. 

The strategic rationale for Zuora is clear, as the partnership expands the company’s product offering to prospective and existing customers.  The ability to support a wide array of payment methodologies is a key value proposition for subscription businesses that enables them to service a larger customer base and provide a more seamless customer experience.

It also doesn’t hurt to have a deep-pocketed ally like Amazon in a fairly early-stage industry.  With omnipotent tech titans waging war over digital payment dominance, Amazon has reportedly doubled down on efforts to spread Amazon Pay usage, cutting into its own margins and offering incentives to retailers.

As adoption of Amazon Pay spreads, subscription businesses will be compelled to offer the service as an available payment option and Zuora should benefit from supporting early billing integration.

For Amazon Pay, teaming up with Zuora provides direct access to Zuora’s customer base, which caters to tens of millions of subscribers. 

With Zuora minimizing the complexity of adding additional payment options, which can often disrupt an otherwise unobtrusive subscription purchase experience, the partnership should help spur Amazon Pay adoption and reduce potential friction.

“By extending the trust and convenience of the Amazon experience to Zuora, merchants around the world can now streamline the subscription checkout experience for their customers,” said Vice President of Amazon Pay, Patrick Gauthier.  “We are excited to be working with Zuora to accelerate the Amazon Pay integration process for their merchants and provide a fast, simple and secure payment solution that helps grow their business.”

The world subscribed

The collaboration with Amazon Pay represents another milestone for Zuora, which completed its IPO in April of this year and is now looking to further differentiate its offering from competing in-house systems or large incumbents in the Enterprise Resource Planning (ERP) space, such as Oracle or SAP.   

Going forward, Zuora hopes to play a central role in ushering in a broader shift toward a subscription-based economy.

Tien Tzuo, founder and CEO of Zuora, told TechCrunch he wants the company to help businesses first realize they should be in the subscription economy and then provide them with the resources necessary to flourish within it.

“Our vision is the world subscribed,” said Tzuo. “We want to be the leading company that has the right technology platform to get companies to be successful in the subscription economy.”

The partnership will launch with publishers “The Seattle Times” and “The Telegraph,” both of which now offer Amazon Pay as a payment method while running on the Zuora platform.

Posted Under: Tech News
Snowflake scoops up another blizzard of cash with $450 million round

Posted on 11 October, 2018

When Snowflake, the cloud data warehouse company, landed a $263 million investment earlier this year, CEO Bob Muglia speculated that it would be the last money his company would need before an eventual IPO. But just nine months after that statement, the company has announced a second, even larger round. This time it’s getting $450 million, as an unexpected level of growth led it to seek additional cash.

Sequoia Capital led the round, joined by new investor Meritech Capital and existing investors Altimeter Capital, Capital One Growth Ventures, Madrona Venture Group, Redpoint Ventures, Sutter Hill Ventures and Wing Ventures. Today’s round brings the total raised to over $928 million with $713 million coming just this year. That’s a lot of dough.

Oh, and the valuation has skyrocketed too, from $1.5 billion in January to $3.5 billion with today’s investment. “We are increasing the valuation from the prior round substantially, and it’s driven by the growth numbers of almost quadrupling the revenue, and tripling the customer base,” company CFO Thomas Tuchscherer told TechCrunch.

At the time of the $263 million round, Muglia was convinced the company had enough funds and that the next fundraise would be an IPO. “We have put ourselves on the path to IPO. That’s our mid- to long-term plan. This funding allows us to go directly to IPO and gives us sufficient capital, that if we choose, IPO would be our next funding step,” he said in January.

Tuchscherer said that was in fact the plan at the time of the first batch of funding. He joined the company partly because of his experience bringing Talend public in 2016, but he said the growth has been so phenomenal that the company felt it was necessary to change course.

“When we raised $263 million earlier in the year, we raised based on a plan that was ambitious in terms of growth and investment. We are exceeding and beating that, and it prompted us to explore how do we accelerate investment to continue driving the company’s growth,” he said.

Running on both Amazon Web Services and Microsoft Azure, which the company added as a supported platform earlier this year, certainly contributed to the increased sales, and forced it to rethink the amount of money it would take to fuel its growth spurt.

“I think it’s very important as a distinction that we view the funding as being customer driven, in the sense that in order to meet the demand that we’re seeing in the market for Snowflake, we have to invest in our infrastructure, as well as in our R&D capacity. So the funding that we’re raising now is meant to finance those two core investments,” he stressed.

The number of employees is skyrocketing as the company adds customers. Just eight months ago the company had around 350 employees. Today it has close to 650. Tuchscherer expects that to grow to between 900 and 1,000 by the end of January, which is not that far off.

As for that IPO, surely that is still a goal, but the growth simply got in the way. “We are building the company to be autonomous and to be a large independent company. It’s definitely on the horizon,” he said.

While Tuchscherer wouldn’t definitively say that the company is looking to support at least one more cloud platform in addition to Amazon and Microsoft, he strongly hinted that such a prospect could happen.

The company also plans to plunge a lot of money into the sales team, building out new sales offices in the US and doubling its presence around the world, while also enhancing the engineering and R&D teams to expand its product offerings.

Just this year alone the company has added Netflix, Office Depot, DoorDash, Netgear, Ebates and Yamaha as customers. Other customers include Capital One, Lions Gate and Hubspot.

Posted Under: Tech News
Google+ for G Suite lives on and gets new features

Posted on 11 October, 2018

You thought Google+ was dead, didn’t you? And it is — if you’re a consumer. But the business version of Google’s social network will live on for the foreseeable future — and it’s getting a bunch of new features today.

Google+ for G Suite isn’t all that different from Google+ for consumers, but its focus is very much on allowing users inside a company to easily share information. Current users include the likes of Nielsen and French retailer Auchan.

The new features that Google is announcing today give admins more tools for managing and reviewing posts, allow employees to tag content and provide better engagement metrics to posters.

Google recently introduced the ability for admins to bulk-add groups of users to a Google+ community, for example. Soon, those admins will also be able to better review and moderate posts made by their employees, and to define custom streams, so that employees could get access to a stream with all of the posts from a company’s leadership team, for instance.

But what’s maybe more important in this context is that tags now make it easy for employees to route content to everybody in the company, no matter which group they work in. “Even if you don’t know all employees across an organization, tags makes it easier to route content to the right folks,” the company explains in today’s blog post. “Soon you’ll be able to draft posts and see suggested tags, like #research or #customer-insights when posting customer survey results.”

As far as the new metrics go, there’s nothing all that exciting going on here, but G Suite customers who keep their reporting structure in the service will be able to provide analytics to employees so they can see how their posts are being viewed across the company and which teams engage most with them.

At the end of the day, none of these are revolutionary features. But the timing of today’s announcement surely isn’t a coincidence, given that Google announced the death of the consumer version of Google+ — and the data breach that went along with that — only a few days ago. Today’s announcement is clearly meant to be a reminder that Google+ for the enterprise isn’t going away and remains in active development. I don’t think all that many businesses currently use Google+, though, and with Hangouts Chat and other tools, they now have plenty of options for sharing content across groups.

Posted Under: Tech News
Google’s Apigee officially launches its API monitoring service

Posted on 11 October, 2018

It’s been about two years since Google acquired API management service Apigee. Today, the company is announcing new extensions that make it easier to integrate the service with a number of Google Cloud services, as well as the general availability of the company’s API monitoring solution.

Apigee API monitoring allows operations teams to get more insight into how their APIs are performing. The idea here is to make it easy for these teams to figure out when there’s an issue and what its root cause is by giving them very granular data. “APIs are now part of how a lot of companies are doing business,” Ed Anuff, Apigee’s former SVP of product strategy and now Google’s product and strategy lead for the service, told me. “So that tees up the need for API monitoring.”

Anuff also told me that he believes that it’s still early days for enterprise API adoption — but that also means that Apigee is currently growing fast as enterprise developers now start adopting modern development techniques. “I think we’re actually still pretty early in enterprise adoption of APIs,” he said. “So what we’re seeing is a lot more customers going into full production usage of their APIs. A lot of what we had seen before was people using it for maybe an experiment or something that they were doing with a couple of partners.” He also attributed part of the recent growth to customers launching more mobile applications where APIs obviously form the backbone of much of the logic that drives those apps.

API Monitoring was already available in beta, but it’s now generally available to all Apigee customers.

Given that it’s now owned by Google, it’s no surprise that Apigee is also launching deeper integrations with Google’s cloud services — specifically services like BigQuery, Cloud Firestore, Pub/Sub, Cloud Storage and Spanner. Some Apigee customers are already using this to store every message passed through their APIs to create extensive logs, often for compliance reasons. Others use Cloud Firestore to personalize content delivery for their web users or to collect data from their APIs and then send that to BigQuery for analysis.
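
As a rough illustration of that last pattern, here is a minimal sketch using the google-cloud-bigquery Python client, assuming API traffic has already been exported into a BigQuery table; the my-project.api_logs.requests table and its columns are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table of API request logs exported from Apigee.
query = """
    SELECT api_proxy, COUNT(*) AS requests, AVG(latency_ms) AS avg_latency_ms
    FROM `my-project.api_logs.requests`
    WHERE DATE(timestamp) = CURRENT_DATE()
    GROUP BY api_proxy
    ORDER BY requests DESC
"""

# Run the query and print per-proxy traffic and latency.
for row in client.query(query).result():
    print(row.api_proxy, row.requests, row.avg_latency_ms)
```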

Anuff stressed that Apigee remains just as open to third-party integrations as it always was. That is part of the core promise of APIs, after all.

Posted Under: Tech News
Google introduces dual-region storage buckets to simplify data redundancy

Posted on 11 October, 2018

Google is playing catch-up in the cloud, and as such it wants to provide flexibility to differentiate itself from AWS and Microsoft. Today, the company announced a couple of new options to help separate it from the cloud storage pack.

Storage may seem stodgy, but it’s a primary building block for many cloud applications. Before you can build an application you need the data that will drive it, and that’s where the storage component comes into play.

One of the issues companies have as they move data to the cloud is making sure it stays close to the application when it’s needed, to reduce latency. Customers also require redundancy in the event of a catastrophic failure, but still need access with low latency. Combining the two has been a hard problem to solve — until today, when Google introduced a new dual-regional storage option.

As Google described it in the blog post announcing the new feature, “With this new option, you write to a single dual-regional bucket without having to manually copy data between primary and secondary locations. No replication tool is needed to do this and there are no network charges associated with replicating the data, which means less overhead for you storage administrators out there. In the event of a region failure, we transparently handle the failover and ensure continuity for your users and applications accessing data in Cloud Storage.”

This allows companies to have redundancy with low latency, while controlling where the data goes, without having to move it manually should the need arise.

Knowing what you’re paying

Companies don’t always require instant access to data, and Google (and other cloud vendors) offer a variety of storage options, making it cheaper to store and retrieve archived data. As of today, Google is offering a clear way to determine costs, based on the storage types customers choose. While it might not seem revolutionary to let customers know what they are paying, Dominic Preuss, Google’s director of product management, says it hasn’t always been a simple matter to calculate these kinds of costs in the cloud. Google decided to simplify it by clearly outlining the costs for medium-term (Nearline) and long-term (Coldline) storage across multiple regions.

As Google describes it, “With multi-regional Nearline and Coldline storage, you can access your data with millisecond latency, it’s distributed redundantly across a multi-region (U.S., EU or Asia), and you pay archival prices. This is helpful when you have data that won’t be accessed very often, but still needs to be protected with geographically dispersed copies, like media archives or regulated content. It also simplifies management.”

Under the new plan, you can select the type of storage you need and the kind of regional coverage you want, and you can see exactly what you are paying.
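
For a concrete sense of how these options look from code, here is a minimal sketch using the google-cloud-storage Python client; the bucket names are placeholders, and “NAM4” is assumed as one of the new dual-region location codes.

```python
from google.cloud import storage

client = storage.Client()

# Dual-regional bucket: a single write target that is stored redundantly
# across two specific regions, with no replication tooling to manage.
client.create_bucket("my-dual-region-bucket", location="NAM4")

# Multi-regional Coldline bucket: archival pricing with geographically
# dispersed copies across the EU multi-region.
archive = storage.Bucket(client, name="my-archive-bucket")
archive.storage_class = "COLDLINE"
client.create_bucket(archive, location="EU")
```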

Google Cloud storage pricing options. Chart: Google

Each of these new storage services has been designed to provide additional options for Google Cloud customers, giving them more transparency around pricing and flexibility and control over storage types, regions and the way they deal with redundancy across data stores.

Posted Under: Tech News
Google expands its identity management portfolio for businesses and developers

Posted on 11 October, 2018

Over the course of the last year, Google has launched a number of services that bring to other companies the same BeyondCorp model for managing access to a company’s apps and data without a VPN that it uses internally. Google’s flagship product for this is Cloud Identity, which is essentially Google’s BeyondCorp, but packaged for other businesses.

Today, at its Cloud Next event in London, it’s expanding this portfolio of Cloud Identity services with three new products and features that enable developers to adopt this way of thinking about identity and access for their own apps and that make it easier for enterprises to adopt Cloud Identity and make it work with their existing solutions.

The highlight of today’s announcements, though, is Cloud Identity for Customers and Partners, which is now in beta. While Cloud Identity is very much meant for employees at a larger company, this new product allows developers to build into their own applications the same kind of identity and access management services.

“Cloud Identity is how we protect our employees and you protect your workforce,” Karthik Lakshminarayanan, Google’s product management director for Cloud Identity, said in a press briefing ahead of the announcement. “But what we’re increasingly finding is that developers are building applications and are also having to deal with identity and access management. So if you’re building an application, you might be thinking about accepting usernames and passwords, or you might be thinking about accepting social media as an authentication mechanism.”

This new service allows developers to build in multiple ways of authenticating the user, including through email and password, Twitter, Facebook, their phones, SAML, OIDC and others. Google then handles all of that authentication work. Google will offer both client-side (web, iOS and Android) and server-side SDKs (with support for Node.js, Java, Python and other languages).

“They no longer have to worry about getting hacked and their passwords and their user credentials getting compromised,” added Lakshminarayanan. “They can now leave that to Google, and the exact same scale that we have, the security that we have, the reliability that we have — that we are using to protect employees in the cloud — can now be used to protect that developer’s applications.”
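
Google hasn’t published full SDK details here, but the model appears to closely mirror Google’s existing Firebase Authentication. A minimal server-side sketch, assuming the firebase-admin Python SDK’s token-verification flow carries over; the service account path is a placeholder.

```python
import firebase_admin
from firebase_admin import auth, credentials

# Initialize the SDK with a service account key file (placeholder path).
firebase_admin.initialize_app(credentials.Certificate("service-account.json"))

def user_id_from_token(id_token: str):
    """Verify an ID token minted by a client-side sign-in SDK.

    Returns the user's uid, or None if the token is invalid.
    """
    try:
        decoded = auth.verify_id_token(id_token)
        return decoded["uid"]
    except ValueError:  # raised for malformed or unverifiable tokens
        return None
```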

In addition to Cloud Identity for Customers and Partners, Google is also launching a new feature for the existing Cloud Identity service, which brings support for traditional LDAP-based applications and IT services like VPNs to Cloud Identity. This feature is, in many ways, an acknowledgment that most enterprises can’t simply turn on a new security paradigm like BeyondCorp/Cloud Identity. With support for secure LDAP, these companies can still make it easy for their employees to connect to these legacy applications while still using Cloud Identity.

“As much as Google loves the cloud, a mantra that Google has is ‘let’s meet customers where they are.’ We know that customers are embracing the cloud, but we also know that they have a massive, massive footprint of traditional applications,” Lakshminarayanan explained. He noted that most enterprises today run two solutions: one that provides access to their on-premise applications and another that provides the same services for their cloud applications. Cloud Identity now natively supports access to many of these legacy applications, including Aruba Networks (HPE), Itopia, JAMF, Jenkins (Cloudbees), OpenVPN, Papercut, pfSense (Netgate), Puppet, Sophos and Splunk. Indeed, as Google notes, virtually any application that supports LDAP over SSL can work with this new service.
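
To make the secure LDAP piece concrete, here is a minimal sketch of an LDAP-over-SSL search against Cloud Identity using the third-party ldap3 library. The client certificate would be issued from the Cloud Identity admin console; the file paths, base DN and filter below are placeholders.

```python
import ssl
from ldap3 import Connection, Server, Tls

# Secure LDAP authenticates clients with certificates rather than passwords.
tls = Tls(local_private_key_file="ldap-client.key",
          local_certificate_file="ldap-client.crt",
          validate=ssl.CERT_REQUIRED)
server = Server("ldap.google.com", port=636, use_ssl=True, tls=tls)

# Bind and look up a user the same way a legacy LDAP app would.
with Connection(server, auto_bind=True) as conn:
    conn.search("dc=example,dc=com",
                "(mail=jane@example.com)",
                attributes=["cn", "memberOf"])
    for entry in conn.entries:
        print(entry)
```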

Finally, the third new feature Google is launching today is context-aware access for those enterprises that already use its Cloud Identity-Aware Proxy (yes, those names are all a mouthful). The idea here is to help enterprises provide access to cloud resources based on the identity of the user and the context of the request — all without using a VPN. That’s pretty much the promise of BeyondCorp in a nutshell, and this implementation, which is now in beta, allows businesses to manage access based on the user’s identity and a device’s location and its security status, for example. Using this new service, IT managers could restrict access to one of their apps to users in a specific country, for example.

 

Posted Under: Tech News
Google Cloud expands its networking features with Cloud NAT

Posted on 11 October, 2018

It’s a busy week for news from Google Cloud, which is hosting its Next event in London. Today, the company used the event to launch a number of new networking features. The marquee launch today is Cloud NAT, a new service that makes it easier for developers to build cloud-based services that don’t have public IP addresses and can only be accessed from applications within a company’s virtual private cloud.

As Google notes, building this kind of setup was already possible, but it wasn’t easy. Obviously, this is a pretty common use case, though, so with Cloud NAT, Google now offers a fully managed service that handles all the network address translation (hence the NAT) and provides access to these private instances behind the Cloud NAT gateway.

Cloud NAT supports Google Compute Engine virtual machines as well as Google Kubernetes Engine containers, and offers both a manual mode where developers can specify their IPs and an automatic mode where IPs are automatically allocated.
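
Google’s post doesn’t spell out the API surface, but Cloud NAT configuration hangs off the existing Cloud Router resource in the Compute Engine API. Here is a rough sketch of attaching an automatic-allocation NAT config to a router with the Google API Python client; the project, region and router names are placeholders, and the beta API version reflects the service’s launch status.

```python
from googleapiclient import discovery

compute = discovery.build("compute", "beta")

# AUTO_ONLY lets Google allocate the external IPs; MANUAL_ONLY would let
# you supply your own reserved addresses via a natIps list instead.
body = {
    "nats": [{
        "name": "my-nat",
        "natIpAllocateOption": "AUTO_ONLY",
        "sourceSubnetworkIpRangesToNat": "ALL_SUBNETWORKS_ALL_IP_RANGES",
    }]
}

compute.routers().patch(
    project="my-project",
    region="us-central1",
    router="my-router",
    body=body,
).execute()
```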

Also new in today’s release is Firewall Rules Logging, which is now in beta. Using this feature, admins can audit, verify and analyze the effects of their firewall rules. That means when there are repeated connection attempts that the firewall blocked, you can now analyze those and see whether somebody was up to no good or whether somebody misconfigured the firewall. Because the logs are delayed by only about five seconds, the service provides near real-time access — and you can obviously tie this in with other services like Stackdriver Logging, Cloud Pub/Sub and BigQuery to create alerts and further analyze the data.

Also new today are managed TLS certificates for HTTPS load balancers. The idea here is to take the hassle out of managing TLS certificates (the kind of certificates that ensure your users’ browsers create a secure connection to your app) when there is a load balancer in play. This feature, too, is now in beta.

Posted Under: Tech News