All posts by Richy George

Gmail gets a useful right-click menu

Posted on 11 February, 2019

Google is giving Gmail a new right-click menu. And it’s about time. While you’ve long been able to right-click on any email in your inbox, your options were always limited. You could archive an email, mark it as read/unread and delete it, but that was about it. Now, as the company announced today, that’s changing and you’re about to get a fully-featured right-click menu that lets you do most of the things that Gmail’s top bar menu lets you do, plus a few extra features.

Soon, when you right-click on a message in your inbox view, you’ll see a long list of features with options to reply to messages and forward them, search for all emails from a sender or with the same subject, and open multiple emails in multiple windows at the same time. You’ll also be able to add labels to emails, mute conversations and use Gmail’s snooze feature, all from the same menu.

All of this is pretty straightforward stuff and none of it is especially groundbreaking, which makes you wonder why it took Google so long to implement this.

As usual, Google only tells us that it is rolling this feature out to G Suite users now (starting today for those on the rapid release schedule and on February 22 for those that follow the slower scheduled release cycle). But free users typically see these new features pop up somewhere around that same timeframe, too.

Posted Under: Tech News
Google Docs gets an API for task automation

Posted on 11 February, 2019

Google today announced the general availability of a new API for Google Docs that will allow developers to automate many of the tasks that users typically do manually in the company’s online office suite. The API has been in developer preview since Google Cloud Next in April 2018 and is now available to all developers.

As Google notes, the REST API was designed to help developers build workflow automation services for their users, build content management services and create documents in bulk. Using the API, developers can also set up processes that manipulate existing documents to keep them up to date, and it supports inserting, deleting, moving, merging and formatting text, inserting inline images and working with lists, among other things.

The canonical use case here is invoicing, where you need to regularly create similar documents with ever-changing order numbers and line items based on information from third-party systems (or maybe even just a Google Sheet). Google also notes that the API’s import/export abilities allow you to use Docs for internal content management systems.
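To make that invoicing scenario concrete, here is a minimal sketch in Python of the template-filling pattern using the Docs API’s documents.batchUpdate method with replaceAllText requests. The template document ID, placeholder names, invoice values and service-account file are all hypothetical; in practice you would typically copy the template first (via the Drive API) so the original stays untouched.

```python
from googleapiclient.discovery import build
from google.oauth2 import service_account

# Hypothetical service-account credentials and template document ID.
SCOPES = ["https://www.googleapis.com/auth/documents"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
docs = build("docs", "v1", credentials=creds)

TEMPLATE_DOC_ID = "YOUR_TEMPLATE_DOC_ID"  # placeholder

# Values pulled from a third-party system or a Google Sheet; the template
# contains matching placeholders such as {{invoice_number}}.
invoice = {"invoice_number": "INV-0042", "customer": "Acme Corp", "total": "$1,250.00"}

requests = [
    {
        "replaceAllText": {
            "containsText": {"text": "{{%s}}" % field, "matchCase": True},
            "replaceText": value,
        }
    }
    for field, value in invoice.items()
]

# documents.batchUpdate applies all of the text replacements in a single call.
docs.documents().batchUpdate(
    documentId=TEMPLATE_DOC_ID, body={"requests": requests}).execute()
```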

Some of the companies that built solutions based on the new API during the preview period include Zapier, Netflix, Mailchimp and Final Draft. Zapier integrated the Docs API into its own workflow automation tool to help its users create offer letters based on a template, for example, while Netflix used it to build an internal tool that helps its engineers gather data and automate its documentation workflow.

Posted Under: Tech News
Carbonite to acquire endpoint security company Webroot for $618.5M

Posted on 8 February, 2019

Carbonite, the online backup and recovery company based in Boston, announced late yesterday that it will be acquiring Webroot, an endpoint security vendor, for $618.5 million in cash.

The company believes that by combining its cloud backup service with Webroot’s endpoint security tools, it will give customers a more complete solution. Webroot actually predates the cloud, having launched in 1997. The private company reported $250 million in revenue for fiscal 2018, according to data provided by Carbonite. That will combine with Carbonite’s $296.4 million in revenue for the same time period.

Carbonite CEO and president Mohamad Ali saw the deal as a way to expand the Carbonite offering. “With threats like ransomware evolving daily, our customers and partners are increasingly seeking a more comprehensive solution that is both powerful and easy to use. Backup and recovery, combined with endpoint security and threat intelligence, is a differentiated solution that provides one, comprehensive data protection platform,” Ali explained in a statement.

The deal not only enhances Carbonite’s backup offering, it also gives the company access to a new set of customers. While Carbonite sells mainly through Value Added Resellers (VARs), Webroot’s customers are mainly Managed Service Providers (MSPs), some 14,000 of them. That lack of overlap could extend Carbonite’s market reach into the MSP channel. Webroot has 300,000 customers, according to Carbonite.

This is not Carbonite’s first acquisition. The company has bought several others in recent years, including Mozy, which it acquired from Dell a year ago for $145 million. The acquisition strategy is about using its checkbook to expand the capabilities of the platform to offer a more comprehensive set of tools beyond core backup and recovery.

Graphic: Carbonite

The company announced it is using cash on hand and a $550 million loan from Barclays, Citizens Bank and RBC Capital Markets to finance the deal. Per usual, the acquisition will be subject to regulatory approval, but is expected to close this quarter.

Posted Under: Tech News
Someone could scoop up Slack before it IPOs

Posted on 7 February, 2019

Earlier this week, Slack announced that it has filed the paperwork to go public at some point later this year. The big question is, will the company exit into the public markets as expected, or will one of the technology giants swoop in at the last minute with buckets of cash and take it off the market?

Slack, which raised more than $1 billion on an other-worldly $7 billion valuation, is an interesting property. It has managed to grow and be successful while competing with some of the world’s largest tech companies — Microsoft, Cisco, Facebook, Google and Salesforce. Not coincidentally, these deep-pocketed companies could be the ones that come knock, knock, knocking at Slack’s door.

Slack has managed to hold its own against these giants by doing something in this space that hadn’t been done effectively before. It made it easy to plug in other services, effectively making Slack a work hub where you could spend your day because your work could get pushed to you there from other enterprise apps.

As I’ve discussed before, this centralized hub has been a dream of communications tools for most of the 21st century. It began with enterprise IM tools in the early 2000s, and progressed to Enterprise 2.0 tools in the 2007 time frame. That period culminated in 2012 when Microsoft bought Yammer for $1.2 billion, the only billion-dollar exit for that generation of tools.

I remember hearing complaints about Enterprise 2.0 tools. While they had utility, in many ways they were just one more thing employees had to check for information beyond email. The talk was these tools would replace email, but a decade later email’s still standing and that generation of tools has been absorbed.

In 2013, Slack came along, perhaps sensing that Enterprise 2.0 never really embraced mobile and the cloud, and recreated the notion in a more modern guise. By taking all of that a step further and making the tool a kind of workplace hub, it has been tremendously successful, growing to 8 million daily users in roughly four years, around 3 million of them paying, at last count.

Slack’s growth numbers as of May 2018

All of this leads us back to the exit question. While the company has obviously filed its IPO paperwork, an IPO might not be the way it ultimately exits. Just the other day, CNBC’s Jay Yarrow posited this question on Twitter:

Not sure where he pulled that number from, but if you figure 3x valuation, that could be the value for a company of this ilk. There would be symmetry in Microsoft buying Slack six years after it plucked Yammer off the market, and it would remove a major competitive piece from the board, while allowing Microsoft access to Slack’s growing customer base.

Nobody can see into the future, and maybe Slack does IPO and takes its turn as a public company, but it surely wouldn’t be a surprise if someone came along with an offer it couldn’t refuse, whatever that figure might be.

Posted Under: Tech News
Google open sources ClusterFuzz

Posted on 7 February, 2019

Google today announced that it is open sourcing ClusterFuzz, a scalable fuzzing tool that can run on clusters with over 25,000 machines.

The company has long used the tool internally and if you’ve paid particular attention to Google’s fuzzing efforts (and you have, right?), then this may all seem a bit familiar. That’s because Google launched the OSS-Fuzz service a couple of years ago and that service actually used ClusterFuzz. OSS-Fuzz was only available to open source projects, though, while ClusterFuzz is now available for anyone to use.

The overall concept behind fuzzing is pretty straightforward: you basically throw lots of data (including random inputs) at your application and see how it reacts. Often it’ll crash, but sometimes you’ll also be able to uncover memory leaks and security flaws. Once you start doing this at scale, though, it becomes more complicated and you’ll need tools like ClusterFuzz to manage that complexity.
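If you have never seen a fuzzer, here is a minimal sketch of the idea in Python (this is an illustration of the concept, not ClusterFuzz itself). The toy parser, the seed input and the mutation strategy are hypothetical stand-ins; real fuzzers such as the ones ClusterFuzz orchestrates add coverage guidance, corpus management and crash deduplication on top of this basic loop.

```python
import random

def parse_record(data: bytes) -> int:
    """Toy parser with a deliberate bug: it trusts the length byte in the header."""
    if len(data) < 4 or data[:3] != b"REC":
        return 0                          # not a record, ignore it
    length = data[3]
    payload = data[4:4 + length]
    return payload[length - 1]            # IndexError when the payload is shorter than declared

def mutate(seed: bytes) -> bytes:
    """Flip a few random bytes of a known-good input."""
    data = bytearray(seed)
    for _ in range(random.randrange(1, 4)):
        data[random.randrange(len(data))] = random.randrange(256)
    return bytes(data)

SEED = b"REC\x04ABCD"                     # well-formed record used as the starting corpus
crashes = []
for _ in range(100_000):
    sample = mutate(SEED)
    try:
        parse_record(sample)
    except Exception as exc:              # a real fuzzer saves the crashing input for triage
        crashes.append((sample, exc))

print(f"{len(crashes)} of 100000 mutated inputs crashed the parser")
```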

ClusterFuzz automates the fuzzing process all the way from bug detection to reporting — and then retesting the fix. The tool itself also uses open source libraries like the libFuzzer fuzzing engine and the AFL fuzzer to power some of the core fuzzing features that generate the test cases for the tool.

Google says it has used the tool to find over 16,000 bugs in Chrome and 11,000 bugs in over 160 open source projects that used OSS-Fuzz. Since so much of the software testing and deployment toolchain is now generally automated, it’s no surprise that fuzzing is also becoming a hot topic these days (I’ve seen references to “continuous fuzzing” pop up quite a bit recently).

Posted Under: Tech News
Microsoft Azure sets its sights on more analytics workloads

Posted on 7 February, 2019

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward: two services are moving into general availability, the second generation of Azure Data Lake Storage for big data analytics workloads and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical, no-code service for building data transformations: Data Factory now supports mapping data flows.
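As a rough illustration of the ad-hoc analysis side, the sketch below queries an Azure Data Explorer cluster from Python with the azure-kusto-data client and a Kusto Query Language (KQL) query. The cluster URL, database, table and column names are placeholders, and the exact import path has shifted between SDK versions; treat this as a sketch rather than a definitive recipe.

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Placeholder cluster and database names.
CLUSTER = "https://mycluster.westus.kusto.windows.net"
DATABASE = "telemetry"

# Interactive device-code sign-in; service principals are the usual choice in automation.
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(CLUSTER)
client = KustoClient(kcsb)

# KQL: count events per hour over the last day (hypothetical Events table).
query = """
Events
| where Timestamp > ago(1d)
| summarize count() by bin(Timestamp, 1h)
| order by Timestamp asc
"""

response = client.execute(DATABASE, query)
for row in response.primary_results[0]:
    print(row["Timestamp"], row["count_"])
```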

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.

(Photo credit: Josh Edelson/AFP/Getty Images)

“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.
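On the storage side, a minimal sketch of landing a raw file in a Data Lake Storage Gen2 filesystem with the azure-storage-file-datalake Python package might look like the following; the account, filesystem and path names are placeholders, and the calls assume a recent version of that SDK.

```python
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder account URL and key; Azure AD credentials work here as well.
ACCOUNT_URL = "https://mydatalake.dfs.core.windows.net"
service = DataLakeServiceClient(account_url=ACCOUNT_URL, credential="ACCOUNT_KEY")

# A Gen2 "filesystem" is the container; paths below it form a hierarchical directory tree.
fs = service.get_file_system_client("raw")
file_client = fs.get_file_client("clickstream/2019/02/07/events.json")

# Upload semi-structured data as-is; Spark or Hadoop jobs can read it later.
with open("events.json", "rb") as f:
    file_client.upload_data(f, overwrite=True)
```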

Likewise, White argued that many enterprises used to keep these services on their own on-premises servers, and many of those deployments are still appliance-based. But she believes the cloud has now reached the point where the price/performance calculation is in its favor. It took a while to get to this point, though, and to convince enterprises. White noted that for the longest time, enterprises looked at analytics as $300 million projects that took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey — as well as the other cloud vendors — and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.

Posted Under: Tech News
Gong.io nabs $40M investment to enhance CRM with voice recognition

Posted on 7 February, 2019

With traditional CRM tools, salespeople add basic details about companies to the database, then a few notes about their interactions. AI has helped automate some of that, but Gong.io wants to take it even further, using voice recognition to capture every word of every interaction. Today, it got a $40 million Series B investment.

The round was led by Battery Ventures with existing investors Norwest Venture Partners, Shlomo Kramer, Wing Venture Capital, NextWorld Capital and Cisco Investments also participating. Battery general partner Dharmesh Thakker will join the startup’s Board under the terms of the deal. Today’s investment brings the total raised so far to $68 million, according to the company.

$40 million is a hefty Series B, but investors see a tool that has the potential to have a material impact on sales, or at least give management a deeper understanding of why a deal succeeded or failed using artificial intelligence, specifically natural language processing.

Company co-founder and CEO Amit Bendov says the solution starts by monitoring all customer-facing conversations and giving feedback in a fully automated fashion. “Our solution uses AI to extract important bits out of the conversation to provide insights to customer-facing people about how they can get better at what they do, while providing insights to management about how staff is performing,” he explained. The tool takes it one step further by offering strategic input, like how your competitors are trending or how customers are responding to your products.

Screenshot: Gong.io

Bendov says he started the company because of his experience at previous startups, where he wanted to know more about why he lost a sale but couldn’t get any insight from looking at the data in the CRM database. “CRM could tell you what customers you have, how many sales you’re making, who is achieving quota or not, but never give me the information to rationalize and improve operations,” he said.

The company currently has 350 customers, a number that has more than tripled since the end of 2017, when it had 100. It’s not only adding new customers; existing ones are expanding, and Bendov says there is almost zero churn.

Today, Gong has 120 employees with headquarters in San Francisco and a 55-person R&D team in Israel. Bendov expects the number of employees to double over the next year with the new influx of money to keep up with the customer growth.

Posted Under: Tech News
Google doubles down on its Asylo confidential computing framework

Posted on 6 February, 2019

Last May, Google introduced Asylo, an open source framework for confidential computing, a technique favored by many of the big cloud vendors because it allows you to set up trusted execution environments that are shielded from the rest of the (potentially untrusted) system. Workloads and their data basically sit in a trusted enclave that adds another layer of protection against network and operating system vulnerabilities.

That’s not a new concept, but as Google argues, it has been hard to adopt. “Despite this promise, the adoption of this emerging technology has been hampered by dependence on specific hardware, complexity and the lack of an application development tool to run in confidential computing environments,” Google Cloud Engineering Director Jason Garms and Senior Product Manager Nelly Porter write in a blog post today. The promise of the Asylo framework, as you can probably guess, is to make confidential computing easy.

Asylo makes it easier to build applications that can run in these enclaves and can use various software- and hardware-based security back ends like Intel’s SGX and others. Once an app has been ported to support Asylo, you should also be able to take that code with you and run it on any other Asylo-supported enclave.

Right now, though, many of the technologies and practices around confidential computing remain in flux. Google notes that there are no set design patterns for building applications that use the Asylo API and run in these enclaves, for example. The different hardware manufacturers also don’t necessarily work together to ensure their technologies are interoperable.

“Together with the industry, we can work toward more transparent and interoperable services to support confidential computing apps, for example, making it easy to understand and verify attestation claims, inter-enclave communication protocols, and federated identity systems across enclaves,” write Garms and Porter.

And to do that, Google is launching its Confidential Computing Challenge (C3) today. The idea here is to have developers create novel use cases for confidential computing — or to advance the current state of the technologies. If you do that and win, you’ll get $15,000 in cash, $5,000 in Google Cloud Platform credits and an undisclosed hardware gift (a Pixelbook or Pixel phone, if I had to guess).

In addition, Google now also offers developers three hands-on labs that teach how to build apps using Asylo’s tools. Those are free for the first month if you use the code in Google’s blog post.

Posted Under: Tech News
Big companies are not becoming data-driven fast enough

Posted on 6 February, 2019

I remember watching MIT professor Andrew McAfee years ago telling stories about the importance of data over gut feeling, whether it was predicting successful wines or making sound business decisions. We have been hearing about big data and data-driven decision making for so long, you would think those practices would be baked into our largest organizations by now. As it turns out, new research by NewVantage Partners finds that most large companies are having problems implementing an organization-wide, data-driven strategy.

McAfee was fond of saying that before the data deluge we have today, the way most large organizations made decisions was via the HiPPO — the highest paid person’s opinion. Then he would chide the audience that this was not the proper way to run your business. Data, not gut feelings, even those based on experience, should drive important organizational decisions.

While companies haven’t failed to recognize McAfee’s advice, the NVP report suggests they are having problems implementing data-driven decision making across their organizations. There are plenty of technological solutions out there today to help them, from startups all the way to the largest enterprise vendors, but the data (see, you always need to go back to the data) suggests that it’s not a technology problem; it’s a people problem.

Executives can have the farsighted vision that their organizations need to be data-driven. They can acquire all of the latest solutions to bring data to the forefront, but unless they combine that with a broad cultural shift and a deep understanding of how to use that data inside business processes, they will continue to struggle.

The study’s authors, Randy Bean and Thomas H. Davenport, wrote about the people problem in their study’s executive summary. “We hear little about initiatives devoted to changing human attitudes and behaviors around data. Unless the focus shifts to these types of activities, we are likely to see the same problem areas in the future that we’ve observed year after year in this survey.”

The survey found that 72 percent of respondents have failed in this regard, reporting they haven’t been able to create a data-driven culture, whatever that means to individual respondents. Meanwhile, 69 percent reported they had failed to create a data-driven organization, although it would seem that these two metrics would be closely aligned.

Perhaps most discouraging of all is that the data is trending the wrong way. Over the last several years, the report’s authors say, the share of organizations calling themselves data-driven has actually dropped each year, from 37.1% in 2017 to 32.4% in 2018 to 31.0% in the latest survey.

This matters on so many levels, but consider that as companies shift to artificial intelligence and machine learning, these technologies rely on abundant amounts of data to work effectively. What’s more, every organization, regardless of its size, is generating vast amounts of data simply as part of being a digital business in the 21st century. They need to find a way to control this data to make better decisions and understand their customers better. It’s essential.

There is so much talk about innovation and disruption, and understanding and affecting company culture, but so much of all this is linked. You need to be more agile. You need to be more digital. You need to be transformational. You need to be all of these things — and data is at the center of all of it.

Data has been called the new oil often enough to be cliche, but these results reveal that the lesson is failing to get through. Companies need to be data-driven now, this instant. This isn’t something to be working towards at this point. This is something you need to be doing, unless your ultimate goal is to become irrelevant.

Posted Under: Tech News
vArmour, a security startup focused on multi-cloud deployments, raises $44M

Posted on 6 February, 2019

As more organizations move to cloud-based IT architectures, a startup that’s helping them secure that data in an efficient way has raised some capital. vArmour, which provides a platform to help manage security policies across disparate public and private cloud environments in one place, is announcing today that it has raised a growth round of $44 million.

The funding is being led by two VCs that specialise in investing in security startups, AllegisCyber and NightDragon.

CEO Tim Eades said that “two large software companies” that vArmour works with on a regular basis are also participating as strategic investors, but he declined to name them. (Candidates might include some of the big security vendors in the market, as well as the big cloud services providers.) This Series E brings the total raised by vArmour to $127 million.

When asked, Eades said that the company would not be disclosing its valuation. That lack of transparency is not uncommon among startups, and it is perhaps especially to be expected at a business that operated in stealth for the first several years of its life. However, according to PitchBook, vArmour was valued at $420 million when it last raised money, a $41 million round in 2016.

That would put the startup’s valuation at $464 million with this round, if everything is growing at a steady pace, or possibly more if investors are keen to tap into what appears to be a growing need.

That need might be summarised like this: we’re seeing a huge migration of IT to cloud-based services, with public cloud services set to grow 17.3 percent in 2019. A large part of those deployments — for companies typically larger than 1,000 people — is spread across multiple private and public clouds.

This, in turn, has opened a new front in the battle to secure data. “We believe that hybrid cloud security is a market valued somewhere between $6 billion and $8 billion at the moment,” said Eades.

Many organizations are storing information and apps across multiple locations — between seven and eight data centers on average for, say, a typical bank, Eades said — and while that may help them hedge their bets, save money and reach some efficiencies, the lack of cohesion also opens the door to security loopholes.

“Organizations are deploying multiple clouds for business agility and reduced cost, but the rapid adoption is making it a nightmare for security and IT pros to provide consistent security controls across cloud platforms,” said Bob Ackerman, Founder and Managing Director at AllegisCyber, in a statement. “vArmour is already servicing this need with hundreds of customers, and we’re excited to help vArmour grow to the next stage of development.”

vArmour is among the companies — Cisco and others compete with it — providing a platform that takes something somewhat messy, disparate security policies covering disparate containers and apps, and handles it in a more cohesive way by offering a single place to manage and provision compliance and policies across all of them. This not only helps with managing the data but can potentially help halt a breach by letting an organization put a stop in place across multiple environments.

“From my experience, this is an important solution for the cloud security space,” said Dave DeWalt, founder of NightDragon, in a statement. “With security teams now having to manage a multitude of cloud estates and inundated with regulatory mandates, they need a simple solution that’s capable of continuous compliance. We haven’t seen anyone else do this as well as vArmour.”

Eades said that the big change in the last couple of years for vArmour is that, as cloud services have grown in popularity, it has been putting in place a self-service version of the main product, which it sells as the vArmour Application Controller, aimed at smaller organizations. It’s also been leaning heavily on channel partners (Telstra, which led its last round, is one strategic partner of this kind) to help with the heavy lifting of sales.

vArmour isn’t disclosing revenues or how many customers it has at the moment, but Eades said that it’s been growing at 100 percent each year for the last two. At this rate, he says, the plan will be to take the company public in the next couple of years.

Posted Under: Tech News